Who Else Listens To Your TV?

Samsung’s Smart TV listens to everything you say for as long as you have voice control enabled. No surprise there. But Samsung’s Terms warn that it is likely sending all that audio to a service provider for analysis, rather than analysing it on the TV itself.

That’s got plenty of people worried, but Samsung aren’t concerned. They sent me their canned press response, which starts:

Samsung takes consumer privacy very seriously. In all of our Smart TVs, any data gathering or their use is carried out with utmost transparency and we provide meaningful options for consumers to freely choose or to opt out of a service. We employ industry-standard security safeguards and practices, including data encryption, to secure consumers’ personal information and prevent unauthorized collection or use.

I’m sure that is all true. Samsung has a large investment in technical experts of all kinds. All the same, the key phrase there is “prevent unauthorized collection or use”. Why? Well, let’s carry on with their response.

Voice recognition, which allows the user to control the TV using voice commands, is a Samsung Smart TV feature, which can be activated or deactivated by the user. Should consumers enable the voice recognition capability, the voice data consists of TV commands, or search sentences, only. Users can easily recognize if the voice recognition feature is activated because a microphone icon appears on the screen.

That’s not exactly what the Terms say; they note that “if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted”. So we’re not just talking about the sort of data Google Now or Siri sends to their service providers (the phrase spoken after you activate voice recognition). Samsung sends not only the commands themselves but also any conversation around them. From that description, it seems likely the whole stream of conversation is sent.

Samsung does not sell voice data to third parties. If a consumer consents and uses the voice recognition feature, voice data is provided to a third party during a requested voice command search. At that time, the voice data is sent to a server, which searches for the requested content then returns the desired content to the TV.

The fact the data is not sold is good. I would expect no less from Samsung in this circumstance. But there is a use case that is conspicuously excluded from both their statement and the Terms.

What about requests for interception? The data may be encrypted to prevent “unauthorised collection or use” but what about authorised use, when a legal authority in one of the countries involved in the transaction requests access to the raw audio? In the USA, the Third Party Doctrine would allow security and law enforcement services to request access without a warrant. Given the service provider appears to be a US company, even if the customer is in a country where interception locally would be illegal, the NSA (or any of a myriad other US organisations) could still collect on their behalf.

Tim Cushing thinks this is at least gated by the need for the device ID, but I think that overlooks the strategy used by the US and UK security services. They separate bulk data collection from later data analysis, treating only the latter as surveillance in need of a warrant. I would not be at all surprised if Samsung’s service providers at some point received an order – one of which Samsung itself may not even be aware – to tee all their audio inputs through the NSA. This would not be for immediate analysis, just for pooling and later use once a device ID is obtained by other means.
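
To make the concern concrete, here is a speculative sketch – in Python, with every path and function name hypothetical, and based on no vendor’s actual systems – of what such a server-side tee might look like:

```python
# Speculative sketch of a server-side "tee": every audio payload is copied
# to a bulk store, keyed by device ID, before normal processing happens.
# Under the bulk-vs-analysis distinction, nothing has been "surveilled"
# yet -- but years of audio are waiting for the day a device ID is
# obtained by other means. All names and paths here are hypothetical.
import shutil
from pathlib import Path

BULK_STORE = Path("/var/bulk_intake")  # hypothetical collection mount

def transcribe_and_search(audio_path: Path) -> str:
    """Stand-in for the legitimate speech-to-text and search service."""
    return f"search results for {audio_path.name}"

def handle_voice_request(device_id: str, audio_path: Path) -> str:
    # The tee: a byte-for-byte copy goes quietly to the bulk store...
    dest = BULK_STORE / device_id
    dest.mkdir(parents=True, exist_ok=True)
    shutil.copy(audio_path, dest / audio_path.name)
    # ...and then the request is served exactly as the user expects.
    return transcribe_and_search(audio_path)
```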

I asked Samsung to clarify their position on law enforcement use of their streaming audio data, and whether they had ever received requests for it. So far I’ve had no reply. I suspect that’s because they have not considered the issue. More people need to ask them, their service providers, and the competitors who offer the same services.

You say you have nothing to hide? When a joke you made over dinner is flagged by an algorithm, and a clipping supplied out of context to a busy police analyst leads to a visit from a SWAT team “just in case”, will you still think that? We need this privacy exposure nipped in the bud, given we have police with a “SWAT first, don’t apologise later” attitude. Otherwise some innocent comment caught by a TV is going to lead to a tragedy.

Legislating For Unicorns

When Julian Huppert MP (Lib-Dem) asked the Home Secretary Theresa May MP (Con) whether banning encryption – as the Prime Minister had been interpreted as proposing – is “genuinely what the Home Secretary wants to do?”, her answer evaded the question.

I remain convinced that her position, and the Cabinet’s, on encryption is based on a non-technical misinterpretation of detailed advice from within the Home Office. Her response, and others by her colleagues and by the US government, imply that the security officialdom of the US and UK believes it can resurrect “golden key” encryption, in which government agencies hold a privileged back door into encryption schemes. That is what is encoded in her refrain that “there should be no safe spaces for terrorists to communicate.” Think “Clipper chip”.
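
To see why a “golden key” is structurally dangerous, consider a minimal sketch of key escrow – purely illustrative, assuming PyNaCl and hypothetical roles, and modelling no real proposal’s design. Every message key is wrapped not only for the recipient but also for a master escrow key, so one leak of that master key exposes every message ever sent:

```python
# Minimal sketch of "golden key" escrow using PyNaCl (pip install pynacl).
# The roles are hypothetical; this shows the structure of key escrow,
# not the design of any real system.
from nacl.public import PrivateKey, SealedBox
from nacl.secret import SecretBox
from nacl.utils import random

recipient_key = PrivateKey.generate()  # the intended recipient
golden_key = PrivateKey.generate()     # the escrowed "golden key"

def escrowed_encrypt(plaintext: bytes):
    """Encrypt a message, wrapping the session key for BOTH parties."""
    session_key = random(SecretBox.KEY_SIZE)
    ciphertext = SecretBox(session_key).encrypt(plaintext)
    # The same session key is sealed to the recipient and to the escrow key.
    for_recipient = SealedBox(recipient_key.public_key).encrypt(session_key)
    for_escrow = SealedBox(golden_key.public_key).encrypt(session_key)
    return ciphertext, for_recipient, for_escrow

ciphertext, for_recipient, for_escrow = escrowed_encrypt(b"private message")

# Normal decryption by the recipient:
key = SealedBox(recipient_key).decrypt(for_recipient)
print(SecretBox(key).decrypt(ciphertext))

# But whoever holds (or steals) golden_key can decrypt EVERY message,
# past and future -- a single point of failure for all users at once.
stolen = SealedBox(golden_key).decrypt(for_escrow)
print(SecretBox(stolen).decrypt(ciphertext))
```

The cryptography itself is unchanged; what escrow adds is a concentration of risk that no “industry-standard safeguard” can remove.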

More telling, though, is the insecurity the Conservative Party exhibits on the subject. Unwilling to discuss the matter in a balanced way, party mouthpiece Julian Smith MP descends to ad hominem attacks on Deputy Prime Minister Nick Clegg MP (LD), in the process also exhibiting the hypocrisy of the unconvinced apologist. Sadly, Mrs May rewards rather than rejects his question.

In a sequence of questions and answers in the same debate – which cannot conceivably have been unplanned – Conservatives ask party-political questions of the Home Secretary, to which she responds with unashamed electioneering. When this tactic is used – accusing an opponent of a fault you exhibit far more than they do – it is always an attempt to conceal the weakness of your own position.

Clegg’s crime was to assert that freedom and security are not inherently incompatible:

“I want to keep us safe. It’s ludicrous this idea that people who care about our freedom don’t care about our safety.

“What I will not do, because it is not proven, is say that every single man, woman and child should have data about what they get up to online kept for a year.”

For Conservative MPs to call that “disgraceful” is extremely revealing, both of their lack of comprehension of the issues and the cynicism with which they intend to manipulate the misapprehensions of Middle England for electoral gain. I’ve met no-one who seriously asserts the security services should be unable to secure warranted access to specific communications of those suspected of a crime. That capability is obviously justifiable in a democracy.

But the Communications Data Bill and proposals for “golden keys” go much further than is reasonable and balanced. What defenders of freedom seek is not insecurity; we instead seek transparency, accountability and proportionality, all in a form open to any citizen to scrutinise and challenge.

When Mrs May (and Labour’s Jack Straw MP, and others) refuse that democratic oversight and accuse its proponents of partisanship and irresponsible disregard for security, their own ad hominem attacks and party partisanship reinforce the case for oversight rather than diminish it. It’s time for an adult debate informed by technological realities, instead of this opportunism and electioneering.

How To Safeguard Surveillance Laws

This letter was published in the London Evening Standard on January 12th, 2015:

I watch with alarm as, in the wake of the barbaric murders in France, politicians seek increased surveillance powers for the security services.

Surveillance is not always wrong; far from it, our democracy has long allowed accountable public servants to temporarily intrude on individuals they believe to be a threat.

My alarm arises for two reasons:

  • The powers requested in recent attempts at new law are open-ended and ill-defined. They lack meaningful oversight, transparency or accountability. They appear designed to permit the security services free rein in making their own rules and retrospectively justifying their actions.
  • The breadth of data gathered – far beyond the pursuit of individuals – creates a risk of future abuse, by both (inevitable) bad actors and people responding to future moral panic. Today’s justifications – where offered – make no accommodation for these risks.

Voters should listen respectfully but critically to the security services’ requests. Our representatives must ensure that each abridgement of our liberties is ring-fenced:

  • justified objectively using public data,
  • governed with impartial oversight, and
  • guarded by a sunset clause for both the powers and all their data by-products.

If the defence of free speech fatally abrades other liberties, we are all diminished.

Yours faithfully

Simon Phipps

Any Revolution Can Be Repurposed

In fact this memorial to one — involving three days of killing in Paris over free speech for the press and a death sentence for blasphemy — has been:

Liberty and Vigilance
The July Column in the Place de la Bastille in Paris – itself dedicated to the celebration of liberty after the French Revolution – was erected in memory of the fallen of the later July Revolution of 1830. It’s not too far from the offices of Charlie Hebdo.

The July Revolution comprised three days of fighting in Paris, primarily on free speech grounds against state censorship. Charles X, France’s last hereditary monarch, had imposed the death penalty for blasphemy against Christianity. He also suspended the liberty of the press and dissolved the newly elected Chamber of Deputies.

Today the column is used as a platform for surveillance cameras. We must be on our guard against similar repurposing in our own time.

Facebook’s Illuminating Algorithmic Cruelty

The ever-presumptuous and unremittingly faux-positive peer pressure of Facebook is doing its part this Christmas to re-open 2014’s wounds for plenty of people. Their Year In Review combines algorithmically-selected photographs and text from Facebook postings throughout the year. It was probably conceived in good faith; they clearly anticipated it would promote thankfulness. I think it will be widely regretted rather than welcomed, for the reasons Eric Meyer explains in the moving post from which my title is adapted.

Facebook's assumption of celebration

Frankly my year was not one for balloons

They could certainly have phrased the accompanying text better, and omitted the randomly-selected cover photo – the equivalent Year In Photos at Google+ doesn’t trigger me in the same way, perhaps for lack of text. Better still, they could have thought the subject through a little more and realised that plenty of people, though thankful for many things, might prefer an algorithm not to force them back through the year. Humans are able to act with discretion, and to know when they are being presumptuous. Computers can act with no more discretion than their programmer, and usually exercise much less.
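
Facebook has not published its selection logic, so the following is pure assumption; but a naive engagement-driven heuristic is enough to show how algorithmic cruelty arises. Grief often attracts the most comments and reactions, so ranking by engagement surfaces exactly the memories people least want replayed:

```python
# Hypothetical sketch of naive "highlight" selection. Facebook's real
# algorithm is unpublished; this only illustrates why engagement is a
# poor proxy for happiness.
posts = [
    {"caption": "Holiday in Lisbon!", "likes": 40, "comments": 5},
    {"caption": "Goodbye, Mum. We miss you.", "likes": 210, "comments": 180},
    {"caption": "New bike day", "likes": 25, "comments": 3},
]

def year_in_review_cover(posts):
    """Pick the 'highlight': highest engagement, no notion of sentiment."""
    return max(posts, key=lambda p: p["likes"] + p["comments"])

print(year_in_review_cover(posts)["caption"])
# -> "Goodbye, Mum. We miss you." -- the algorithm has no discretion.
```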

My own year held much that I value, but little of it was shared with Facebook, so my edition is largely valueless. It also thankfully omits the things that make me cry, like the memory of my mother’s passing this spring and the six months of triage that followed. If you’ve chosen to share with Facebook, this is a wake-up call: you have also given them implicit permission to make you relive memories on command.

Frankly, it’s no worse than the other things you’ve given them permission to do with the intimacies you’ve shared. They are just as free with them when dealing with advertisers and social data miners; you just don’t have that rubbed in your face. If you dislike “Year In Review”, you will probably hate the things they do with your data without telling you (even if they secured your permission in advance through their Terms of Service).

In case you were wondering, it’s safe to ignore it; the card displayed on your profile is visible only to you, and as long as you don’t press the “Share” button that appears when you view it, no-one else will see it. You can stop the reminder appearing by clicking the arrow in the top right corner of the card and telling Facebook not to show the post again. It’s a pity there isn’t just a button with a “hide this” option for those of us who don’t list Facebook among our confidantes. Algorithms can’t exercise discretion; don’t use them for things that demand it.

Is Santa to blame for the surveillance society?

Perhaps the reason we are not horrified by the surveillance society is because our parents normalised that behaviour by teaching us about Santa.

  • Santa knows if you’ve been naughty or nice
  • Santa knows where you’ve been & who you’ve been with
  • Santa is able to come into your home without apparent consequences
  • There’s even an elf on your shelf keeping an eye on you
  • This is all good because toys

Santa – Ta, NSA.

It’s Not Free If It Cost My Liberty

Cause and Effect

Why do people take actions that actually make things worse as they attempt to solve complex problems? I was interviewed on Monday for an upcoming documentary called “Orwell Upgraded”. They decided to release one of my answers as a teaser for the documentary – it is a two-minute summary of my essay Direct & Indirect Causality.

A Database Ripe For Abuse

The draft Communications Data Bill is of great concern, not primarily because it lacks controls over who can access private data – these will be added – but because it creates a privacy-destroying surveillance resource which is certain to be abused in the future – both by government agencies and by illegal intruders. Read more in my article about it on ComputerWorldUK.

☆ Beware The “Super-Public”

As wave after wave of privacy news arrives, it’s easy to believe that public postings on social media sites are the problem. But I believe we are facing an issue caused not by public sharing but by an encounter with a new kind of “public”. First, a short story.

Alice, Bob and Evan

Close Scrutiny

Alice doesn’t mind her photo being visible to everyone on Facebook. She put it there originally because she was flirting with Bob, and the fact everyone else could see it wasn’t an issue. She had spent a lot of time understanding Facebook’s privacy settings in all their labyrinthine splendour and she was pretty sure that the only people who could see personal details about her on Facebook were friends, and the only people who could see the stuff on her Wall were the girlfriends she goes out with when Bob isn’t free, plus Bob (well, for all but one or two things!).

Alice is also a keen Twitter user. She has a different picture there – a flower at the moment, it was a kitten last month – and she’s happy to have a public profile. Her tweets are rarely very personal – just comments on the news, LOLs with the girls, food favourites and a wink to the gallery each time she went out with Bob. She’s been getting into Foursquare lately, checking in at cinemas, restaurants and bars in a casually competitive way with the girls and with Bob’s mates. She’s often quite high in the league tables and she’s the mayor of the cocktail bar round the corner from her flat.

When she split up with Bob, she actually used all of those social media services more than usual, hoping the girls – and maybe one or two of Bob’s mates – would rally round to make it hurt a bit less. That was OK for the first week, and she was distracted by fighting for top place in the Foursquare league table with Lavinia. Then one evening she was sitting in the sparsely-populated cocktail bar on her own, feeling depressed and Bob-less. Nursing a glass of the amazing chocolate cocktail that’s not on the menu but which she’s fallen in love with, she’s lost in a miserable dream world when some guy she has never seen there before walks in and sits beside her.

Chocolate Martinis

He made a bee-line for her, as if she had a spotlight beam shining on her, asked if the stool next to her was taken, and introduced himself as Evan. That wasn’t something that had ever happened to Alice before – the guys always hit on her friends, never on her. Evan isn’t really her type; he’s probably a few years younger than her and has the air of an extra from The Big Bang Theory. Alice is quite surprised when he asks the waiter for a glass of the same cocktail – by name. She’d assumed only the regulars knew about it, and there was no way this Evan was a regular at the bar. She’s even more surprised when Evan strikes up a conversation with her.

As they start to exchange trivia, Alice discovers Evan has seen almost every movie she’s been to in the last two months. What’s more, he liked all the same ones as her and hated all the ones she hated. His taste in books is also excellent. She’s starting to wonder if her antipathy for geeks has made her miss out. By the time the last olive is eaten and Evan suggests they go and grab a meal, her guard is down. So when he proposes dinner at the Greek place Bob used to take her to, she has no defences left.

Public and Super-Public

Spontaneous Gathering of Monarchs

I’m no novelist (all those names are borrowed from security theory) and I don’t know how this story ends. But I do know Evan’s secret. He was exploiting a new kind of “public” using an iPhone app called “Girls Around Me”, which aggregated information from all the social media tools Alice was using and gave him the ability to eavesdrop on her activities. Alice had a reasonable expectation that all her public activities would be seen by her friends, and no particular concern that any of them might be seen by strangers. She was engaging in what researcher Danica Radovanovic has called “phatic posts”, providing public context to her life with seemingly trivial information, just as a group of friends in the real world might do.

In the real world, “public” is accompanied by practical realities that introduce a little friction. To listen to Alice and Bob in the bar, Evan would need to sit close enough to hear them, and they’d probably notice and change their conversation. To see all the places Alice went and the things she likes, he would need to take the time to follow her covertly. His actions would quickly become apparent as obsessive and problematic – Rick Falkvinge explains this further.

But in the new “super-public” of data-mined social media, the friction is gone, and the sort of information Evan used to find and meet Alice was simply the product of triangulation between her posts. The software he downloaded to his iPhone did it all for him, although he is probably enough of a geek to stitch together scripts that would harvest the JSON from dozens of REST interfaces and use open source business intelligence tools to mine the resulting data. It’s unlikely any privacy rules would even be implicated, let alone broken.
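
To show how little friction remains, here is a hedged sketch of that kind of script. The endpoints and field names are hypothetical stand-ins for any public, unauthenticated JSON API – no real service is named – and no single request here touches anything private; only the join is invasive:

```python
# Sketch of "super-public" triangulation. The api.example.com endpoints
# and field names are hypothetical placeholders; each request fetches
# only data the subject has already made public.
import json
from collections import Counter
from urllib.request import urlopen

def fetch_json(url: str):
    with urlopen(url) as response:
        return json.load(response)

# 1. Public check-ins reveal where "alice" goes, and how often.
checkins = fetch_json("https://api.example.com/checkins?user=alice")
favourite_venues = Counter(c["venue"] for c in checkins)

# 2. Public tweets reveal tastes: films, drinks, moods.
tweets = fetch_json("https://api.example.com/tweets?user=alice")
likes = [t["text"] for t in tweets if "loved" in t["text"].lower()]

# 3. A public profile photo puts a face to the aggregate.
profile = fetch_json("https://api.example.com/profile?user=alice")

# The join is the invasion: where she will be, what she enjoys, what she
# looks like -- assembled without touching a single privacy setting.
print("Most visited:", favourite_venues.most_common(3))
print("Recent likes:", likes[:5])
print("Photo:", profile["photo_url"])
```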

That’s the issue here. Sharing information in phatic posts is normal and expected – it’s just the translation of life in atom-space into life in bit-space. What’s new is the super-public: the exposure of life to scrutiny by triangulation and data-mining. So far, no privacy legislation takes it into proper account. Companies, however, are now actively mining the super-public.

Discussion of privacy treats it as a bilateral matter between the subject of the data and the application provider, focusing on “do not track” and application privacy settings. While this is important (as the scandal of Facebook’s Social Reader shows), we need to move to a place – as Helen Nissenbaum has explained – where we see privacy as a matter of control over the flow of information across contexts. We need to discuss and legislate for the super-public.

(First published in ComputerWorldUK on April 10, 2012)
