Apple users' most intimate and confidential moments are relayed by Siri to contractors
According to a report published by The Guardian, Apple's digital assistant Siri sends personal recordings to contractors, who grade Siri's performance much as Amazon's contractors do for its virtual assistant Alexa in order to improve its understanding of human language. But some of the recordings captured by Siri and sent out for grading included couples having sex, and others contained personal and confidential conversations about medical information. Besides Apple and Amazon, Google does something similar with Google Assistant. But while Amazon and Google allow their users to opt out of some uses of the recordings they generate, Apple doesn't.
Apple says less than 1% of daily Siri activations are sent to contractors for grading
Apple says less than 1% of the daily activations of Siri are passed along to the contractors, who try to determine whether the assistant was activated on purpose or by accident. Siri is also graded on whether it was able to respond to the user's request or query, and whether that response was appropriate. Apple says that each snippet of audio graded by these third-party firms runs for only a few seconds, and notes that "A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements."
A whistleblower says that most accidental Siri activations take place on the Apple Watch and the HomePod
The Guardian's report cites a whistleblower working for one of the contractors who says that users frequently activate Siri by accident, allowing the contractors to hear sensitive personal information or activities. Besides mistaking certain words for the "Hey Siri" phrase that wakes the virtual helper, the whistleblower notes that sometimes the sound of a zipper will activate Siri. He also says that most accidental activations occur on the Apple Watch and the HomePod smart speaker. When the smartwatch is raised and hears speech, it automatically activates Siri. With 35% of the global smartwatch market, the Apple Watch is on a lot of wrists and is generating plenty of Siri recordings for the contractors to go through.
"The regularity of accidental triggers on the watch is incredibly high. The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on...you can definitely hear a doctor and patient, talking about the medical history of the patient. Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch."-Anonymous whistlenblower
The whistleblower reveals that those grading Siri have quotas to meet, so the goal is to go through the recordings as fast as possible. He also says that it would be possible to identify the people whose voices can be heard in the Siri samples. "There’s not much vetting of who works there, and the amount of data that we’re free to look through seems quite broad. It wouldn’t be difficult to identify the person that you’re listening to, especially with accidental triggers – addresses, names and so on," he warns.
The company values its reputation when it comes to customer privacy, and while it can hide behind the claim that the grading program improves the experience of using Siri, for some Apple customers that improvement might come at too high a cost.