Who Else Listens To Your TV?

Samsung’s Smart TV listens to everything you say whenever voice control is enabled. No surprise there. But Samsung’s Terms warn that it’s likely to be sending all that audio to a service provider for analysis, rather than analysing it on the TV itself.

That’s got plenty of people worried, but Samsung aren’t concerned. They sent me their canned press response, which starts:

Samsung takes consumer privacy very seriously. In all of our Smart TVs, any data gathering or their use is carried out with utmost transparency and we provide meaningful options for consumers to freely choose or to opt out of a service. We employ industry-standard security safeguards and practices, including data encryption, to secure consumers’ personal information and prevent unauthorized collection or use.

I’m sure that is all true. Samsung has a large investment in technical experts of all kinds. All the same, the key phrase there is “prevent unauthorized collection or use”. Why? Well, let’s carry on with their response.

Voice recognition, which allows the user to control the TV using voice commands, is a Samsung Smart TV feature, which can be activated or deactivated by the user. Should consumers enable the voice recognition capability, the voice data consists of TV commands, or search sentences, only. Users can easily recognize if the voice recognition feature is activated because a microphone icon appears on the screen.

That’s not exactly what the Terms say; they note that “if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted”. So we’re not just talking about the sort of data Google Now or Siri sends to their service provider (the phrase spoken after you have deliberately started voice recognition). Samsung also sends the commands themselves, plus any conversation around them. From that description, it seems the whole stream of conversation is likely to be sent.
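
To see why that distinction matters, here is a minimal sketch in Python. It is purely illustrative – the wake phrase, transcript and function names are all invented, and no vendor’s actual pipeline looks like this – but it contrasts trigger-gated capture with the continuous capture the Terms appear to describe.

```python
# Illustrative sketch only, not Samsung's implementation. It contrasts the
# two capture policies at issue: assistants like Siri or Google Now transmit
# only what follows a wake phrase, while the behaviour described in the
# Terms would transmit the surrounding conversation as well.
WAKE_PHRASE = "hi tv"  # invented for the example

def trigger_gated(transcript: list[str]) -> list[str]:
    """Send only the utterance that follows the wake phrase."""
    sent, armed = [], False
    for utterance in transcript:
        if armed:
            sent.append(utterance)
            armed = False
        elif utterance == WAKE_PHRASE:
            armed = True
    return sent

def continuous(transcript: list[str]) -> list[str]:
    """Send everything heard while the feature is enabled."""
    return list(transcript)

dinner = ["hi tv", "find a comedy", "my bank PIN is 4321", "pass the salt"]
print(trigger_gated(dinner))  # ['find a comedy']
print(continuous(dinner))     # every line, sensitive chatter included
```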

Samsung does not sell voice data to third parties. If a consumer consents and uses the voice recognition feature, voice data is provided to a third party during a requested voice command search. At that time, the voice data is sent to a server, which searches for the requested content then returns the desired content to the TV.

The fact that the data is not sold is good; I would expect no less from Samsung in this circumstance. But there is a use case that is conspicuously excluded from both their statement and the Terms.

What about requests for interception? The data may be encrypted to prevent “unauthorised collection or use”, but what about authorised use, when a legal authority in one of the countries involved in the transaction requests access to the raw audio? In the USA, the Third Party Doctrine would allow security and law enforcement services to request access without a warrant. Given the service provider appears to be a US company, even if the customer is in a country where local interception would be illegal, the NSA (or any of the myriad other US agencies) could still collect on their behalf.

Tim Cushing thinks this is at least gated by the need for the device ID, but I think that overlooks the strategy used by the US & UK security services. They separate bulk data collection from later data analysis, treating only the latter as surveillance in need of a warrant. I would not be at all surprised if Samsung’s service providers at some point receive an order to tee all their audio inputs through the NSA – an order of which Samsung itself may not even be aware. This would not be for immediate analysis, just for pooling and later use once a device ID is obtained by other means.
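
A hypothetical sketch of that split, with every name invented and no real agency system described. It shows why gating only the analysis step offers little protection once collection is unconditional:

```python
# Hypothetical sketch of the collect-first, analyse-later split described
# above. It illustrates why "we only look with a warrant" is cold comfort
# once everything is already pooled: the warrant gates the query, not the
# collection, so it reaches back through the entire stored history.
from collections import defaultdict

bulk_store = defaultdict(list)  # the pool: everything, from everyone

def tee(device_id: str, audio: bytes) -> None:
    """Bulk collection: runs on all traffic, no warrant, no analysis."""
    bulk_store[device_id].append(audio)

def analyse(device_id: str) -> list[bytes]:
    """'Surveillance': only this step is treated as needing a warrant,
    yet it retrieves everything collected before the device was of interest."""
    return bulk_store[device_id]

tee("tv-123", b"...dinner chatter, January...")
tee("tv-123", b"...dinner chatter, February...")
# Months later, tv-123 is linked to a suspect; the whole history is waiting.
print(analyse("tv-123"))
```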

I asked Samsung to clarify their position on law enforcement use of their streaming audio data, and to clarify whether they had ever received requests for it. So far I’ve had no reply to my questions. I suspect that’s because they have not considered the issue. I think more people need to ask them and their service providers, and their competitors who offer the same services.

You say you have nothing to hide? When a joke you made over dinner is flagged by an algorithm, and a clipping provided out of context to a busy police analyst leads to a visit from a SWAT team “just in case”, will you still think that? We need this privacy exposure nipped in the bud, given we have police with a “SWAT first, don’t apologise later” attitude. Otherwise, some innocent comment caught by a TV is going to lead to a tragedy.

Legislating For Unicorns

When Julian Huppert MP (Lib Dem) asked the Home Secretary, Theresa May MP (Con), whether banning encryption – as the Prime Minister had been interpreted as proposing – was “genuinely what the Home Secretary wants to do?”, she evaded the question.

I remain convinced that her position on encryption, and the Cabinet’s, is based on a non-technical misinterpretation of detailed advice from within the Home Office. Her response, and other responses by her colleagues and by the US government, imply that the security officialdom of the US & UK believes it can resurrect “golden key” encryption, in which government agencies hold a privileged back door into encryption schemes. That’s what’s encoded in her replies as “there should be no safe spaces for terrorists to communicate.” Think “Clipper chip”.
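
To see why a “golden key” alarms technologists, here is a deliberately toy model in Python – XOR standing in for real cryptography, every name invented, no actual scheme depicted. Each session key is wrapped for the recipient and for a single escrow key, so whoever holds that one key, lawfully or after a breach, can read all traffic:

```python
# Toy model only (XOR is NOT real encryption; this depicts no real scheme).
# It shows the structural flaw in key escrow: every session key is wrapped
# for the intended recipient AND for a government escrow key, so that one
# "golden key" becomes a single point of compromise for all traffic ever sent.
import secrets

def xor(data: bytes, key: bytes) -> bytes:
    return bytes(a ^ b for a, b in zip(data, key))

ESCROW_KEY = secrets.token_bytes(16)  # the single "golden key"

def send(message: bytes, recipient_key: bytes) -> dict:
    session = secrets.token_bytes(16)
    return {
        "ciphertext": xor(message.ljust(16), session),
        "wrapped_for_recipient": xor(session, recipient_key),
        "wrapped_for_escrow": xor(session, ESCROW_KEY),  # the back door
    }

alice_key = secrets.token_bytes(16)
packet = send(b"meet at noon", alice_key)

# Anyone holding ESCROW_KEY recovers the session key, hence the plaintext:
session = xor(packet["wrapped_for_escrow"], ESCROW_KEY)
print(xor(packet["ciphertext"], session))  # b'meet at noon    '
```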

More telling, though, is the insecurity the Conservative Party exhibits on the subject. Unwilling to discuss the matter in a balanced way, party mouthpiece Julian Smith MP descends to ad hominem attacks on deputy Prime Minister Nick Clegg MP (LD), in the process also exhibiting the hypocrisy of the unconvinced apologist. Sadly, Mrs May rewards rather than rejects his question.

In a sequence of questions and answers in the same debate – which cannot conceivably have been unplanned – Conservatives ask party-political questions of the Home Secretary, to which she responds with unashamed electioneering. When this tactic is used – accusing an opponent of a fault you exhibit far more than they do – it is always an attempt to conceal the weakness of your own position.

Clegg’s crime was to assert that freedom and security are not inherently incompatible:

“I want to keep us safe. It’s ludicrous this idea that people who care about our freedom don’t care about our safety.

“What I will not do, because it is not proven, is say that every single man, woman and child should have data about what they get up to online kept for a year.”

For Conservative MPs to call that “disgraceful” is extremely revealing, both of their lack of comprehension of the issues and of the cynicism with which they intend to manipulate the misapprehensions of Middle England for electoral gain. I’ve met no-one who seriously asserts that the security services should be unable to obtain warranted access to the specific communications of those suspected of a crime. That capability is obviously justifiable in a democracy.

But the Communications Data Bill and proposals for “golden keys” go much further than is reasonable and balanced. What defenders of freedom seek is not insecurity; we instead seek transparency, accountability and proportionality, all in a form open to any citizen to scrutinise and challenge.

When Mrs May (and Labour’s Jack Straw MP, and others) refuse that democratic oversight and accuse its proponents of partisanship and irresponsible disregard for security, their own ad hominems and party partisanship reinforce the case for it rather than diminish it. It’s time for an adult debate informed by technological realities, instead of this opportunism and electioneering.

How To Safeguard Surveillance Laws

This letter was published in the London Evening Standard on January 12th, 2015:

I watch with alarm as, in the wake of the barbaric murders in France, politicians seek increased surveillance powers for the security services.

Surveillance is not always wrong; far from it, our democracy has long allowed accountable public servants to temporarily intrude on individuals they believe to be a threat.

My alarm arises for two reasons:

  • The powers requested in recent attempts at new law are open-ended and ill-defined. They lack meaningful oversight, transparency or accountability. They appear designed to permit the security services free rein in making their own rules and retrospectively justifying their actions.
  • The breadth of data gathered – far beyond the pursuit of individuals – creates a risk of future abuse, by both (inevitable) bad actors and people responding to future moral panic. Today’s justifications – where offered – make no accommodation for these risks.

Voters should listen respectfully but critically to the security services’ requests. Our representatives must ensure that each abridgement of our liberties is ring-fenced:

  • justified objectively using public data,
  • governed with impartial oversight, and
  • guarded by a sunset clause for both the powers and all their data by-products.

If the defence of free speech fatally abrades other liberties, we are all diminished.

Yours faithfully

Simon Phipps

Any Revolution Can Be Repurposed

In fact this memorial to one – involving three days of killing in Paris over free speech for the press and a death sentence for blasphemy – has been:

Liberty and Vigilance
The July Column in the Place de la Bastille in Paris – itself dedicated to the celebration of liberty after the French Revolution – was erected in memory of the fallen of the later July Revolution of 1830. It’s not too far from the offices of Charlie Hebdo.

The July Revolution comprised three days of fighting in Paris, primarily on free speech grounds against state censorship. Charles X, France’s last hereditary monarch, had imposed the death penalty for blasphemy against Christianity. He also suspended the liberty of the press and dissolved the newly elected Chamber of Deputies.

Today the column is used as a platform for surveillance cameras. We must be on our guard against similar repurposing in our own time.

Facebook’s Illuminating Algorithmic Cruelty

The ever-presumptuous and unremittingly faux-positive peer pressure of Facebook is doing its part this Christmas to re-open the wounds of 2014 for a good many people. Their Year In Review combines algorithmically-selected photographs and text from Facebook postings throughout the year. It was probably conceived in good faith; they clearly anticipated it would promote thankfulness. I think it will be widely regretted rather than welcomed, for the reasons Eric Meyer explains in the moving post from which my title is adapted.

Facebook's assumption of celebration

Frankly my year was not one for balloons

They could certainly have phrased the accompanying text better, not to mention omitting the randomly-selected cover photo – the equivalent Year In Photos at Google+ doesn’t trigger me in the same way, perhaps because it lacks text. Better still, they could have thought the subject through a little more and realised that plenty of people, though thankful for many things, may prefer an algorithm not to force them back through the year. Humans are able to act with discretion, and to know when they are being presumptuous. Computers can act with no more discretion than their programmer gave them, and usually much less.
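
To make that concrete, here is a hypothetical sketch – invented data and field names, and certainly not Facebook’s code – of why an engagement-ranked selector surfaces exactly the wrong posts: grief attracts likes and comments, so a discretion-free algorithm ranks it as a highlight.

```python
# Hypothetical sketch only, not Facebook's code. An engagement-ranked
# "highlight" picker has no notion of sentiment, so a condolence thread,
# which draws heavy engagement, wins unless discretion is programmed in.
from dataclasses import dataclass

@dataclass
class Post:
    text: str
    likes: int
    comments: int

def naive_highlight(posts: list[Post]) -> Post:
    # "Discretion-free" selection: the most-engaged post wins.
    return max(posts, key=lambda p: p.likes + p.comments)

posts = [
    Post("Holiday photos from Crete", 40, 5),
    Post("My mother passed away this morning", 300, 120),  # grief draws engagement
]

print(naive_highlight(posts).text)
# -> "My mother passed away this morning"
```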

My own year has had much that I value, but little of it has been shared with Facebook, so my edition of Year In Review is largely valueless. It also thankfully omits the things that make me cry, like the memory of my mother’s passing this spring or the six months of triage that followed. If you’ve chosen to share with Facebook, this is a wake-up call that you have also given them implicit permission to make you relive memories on command.

Frankly it’s no worse than the other things you’ve given them permission to do with the intimacies you’ve shared. They share just as freely with advertisers and social data miners; you just don’t have that rubbed in your face. If you dislike “Year In Review” you will probably hate the things they do with your data without telling you (even if they have secured your permission in advance through their Terms of Service).

In case you were wondering, it’s safe to ignore it; the card displayed on your profile is visible only to you, and as long as you don’t press the “Share” button that appears when you view it, no-one else will see it. You can stop the reminder showing up by clicking the arrow in its top right corner and telling Facebook not to show the post again. Pity it wasn’t just an opt-in button, with a “hide this” option for those of us who don’t list Facebook among our confidantes. Algorithms can’t exercise discretion; don’t use them for things that demand it.

Is Santa to blame for the surveillance society?

Perhaps the reason we are not horrified by the surveillance society is because our parents normalised that behaviour by teaching us about Santa.

  • Santa knows if you’ve been naughty or nice
  • Santa knows where you’ve been & who you’ve been with
  • Santa is able to come into your home without apparent consequences
  • There’s even an elf on your shelf keeping an eye on you
  • This is all good because toys

Santa – Ta, NSA.

