Samsung’s Smart TV listens to everything you say whenever you have voice control enabled. No surprise there. But Samsung’s Terms warn that it’s likely to be sending all that audio to a service provider for analysis, rather than analysing it on the TV itself.
That’s got plenty of people worried, but Samsung aren’t concerned. They sent me their canned press response, which starts:
Samsung takes consumer privacy very seriously. In all of our Smart TVs, any data gathering or their use is carried out with utmost transparency and we provide meaningful options for consumers to freely choose or to opt out of a service. We employ industry-standard security safeguards and practices, including data encryption, to secure consumers’ personal information and prevent unauthorized collection or use.
I’m sure that is all true. Samsung has a large investment in technical experts of all kinds. All the same, the key phrase there is “prevent unauthorized collection or use”. Why? Well, let’s carry on with their response.
Voice recognition, which allows the user to control the TV using voice commands, is a Samsung Smart TV feature, which can be activated or deactivated by the user. Should consumers enable the voice recognition capability, the voice data consists of TV commands, or search sentences, only. Users can easily recognize if the voice recognition feature is activated because a microphone icon appears on the screen.
That’s not exactly what the Terms say; they note that “if your spoken words include personal or other sensitive information, that information will be among the data captured and transmitted”. So we’re not just talking about the sort of data Google Now or Siri sends to their service provider (the phrase you speak after starting voice recognition). Samsung also sends the commands themselves, plus any conversation around them. From that description, it seems the whole stream of conversation is likely to be sent.
Samsung does not sell voice data to third parties. If a consumer consents and uses the voice recognition feature, voice data is provided to a third party during a requested voice command search. At that time, the voice data is sent to a server, which searches for the requested content then returns the desired content to the TV.
The fact the data is not sold is good. I would expect no less from Samsung in this circumstance. But there is a use case that is conspicuously excluded from both their statement and the Terms.
What about requests for interception? The data may be encrypted to prevent “unauthorised collection or use” but what about authorised use, when a legal authority in one of the countries involved in the transaction requests access to the raw audio? In the USA, the Third Party Doctrine would allow security and law enforcement services to request access without a warrant. Given the service provider appears to be a US company, even if the customer is in a country where interception locally would be illegal, the NSA (or any of a myriad other US organisations) could still collect on their behalf.
Tim Cushing thinks this is at least gated by the need for the device ID but I think that overlooks the strategy used by the US & UK security services. They separate bulk data collection and later data analysis, treating only the latter as surveillance in need of a warrant. I would not be at all surprised if Samsung’s service providers at some point get an order to tee all their audio inputs through the NSA, using an order of which Samsung may not even be aware. This would not be for immediate analysis, just for pooling and later use once a device ID is obtained by other means.
I asked Samsung to clarify their position on law enforcement use of their streaming audio data, and to clarify whether they had ever received requests for it. So far I’ve had no reply to my questions. I suspect that’s because they have not considered the issue. I think more people need to ask them and their service providers, and their competitors who offer the same services.
You say you have nothing to hide? When a joke you made over dinner is flagged by an algorithm, and an out-of-context clipping handed to a busy police analyst leads to a visit by a SWAT team “just in case”, will you still think that? We need this privacy exposure nipped in the bud, given we have police with a “SWAT first, don’t apologise later” attitude. Otherwise, some innocent comment caught by a TV is going to lead to a tragedy.