Apple suspends program that used recordings of customers' sexcapades and medical secrets

Last week, we told you that Apple, like Google and Amazon, has a third-party firm listening to clips recorded by its virtual digital assistant. Apple and the other companies say that this is necessary to improve the user experience of their AI-driven helpers. One whistleblower who works for the contractor employed by Apple related how private medical information is sometimes heard on these Siri snippets, and how contractors occasionally hear the sounds of two (or more) people engaging in sexual activity. In such situations, Siri has been activated by mistake; the whistleblower pointed out that the sound of a zipper can sometimes act like the wake word, which might explain the recordings of users' intimate moments.

This all seemed to contradict Apple's claims about how the company protects the privacy of iPhone users. In fact, you might recall that back when this year was just five days old, Apple paid for a giant billboard overlooking the streets near the Las Vegas Convention Center. At the time, the venue was hosting the Consumer Electronics Show, and Apple was hoping to catch the attention of those passing by on their way to the event. The sign, borrowing from the city's own iconic promotional slogan, read "What happens on your iPhone, stays on your iPhone." Except that turned out not to be entirely true.

A future iOS update will allow users to opt out of Siri's grading process

Well, it seems that the blowback from the report about contractors listening to snippets of Siri recordings touched a raw nerve at Apple. Reuters reports that this morning, Apple decided to suspend the global program that grades Siri's performance. Apple has said that one of the reasons these recordings were being reviewed was to learn what sounds accidentally awaken the virtual digital assistant. The company previously stated that "A small portion of Siri requests are analyzed to improve Siri and dictation. User requests are not associated with the user’s Apple ID. Siri responses are analyzed in secure facilities and all reviewers are under the obligation to adhere to Apple’s strict confidentiality requirements."

An Apple spokeswoman said this morning, "While we conduct a thorough review, we are suspending Siri grading globally." She also noted that a future iOS update will let users opt out of the program at their discretion.

Last April, Amazon was criticized when word leaked out about the teams it employs to transcribe users' conversations with Alexa. Amazon claimed at the time that the program is designed to help Alexa improve its understanding of human language. Employees stationed in Boston, Costa Rica, India, and Romania put in nine-hour workdays during which each might listen to as many as 1,000 audio clips captured by Alexa. Amazon has also added an internal system that allows members of these teams to share recordings with one another. While that is supposedly done to help team members decipher hard-to-understand words or phrases, we can imagine it being used to share the most embarrassing comments and lewd remarks made by Echo owners, recorded by Alexa without the user's knowledge.

Even though the programs run by Apple, Google, and Amazon are ostensibly meant to improve the performance of Siri, Google Assistant, and Alexa, respectively, it is easy to see why consumers could be concerned. If a virtual digital assistant is summoned by accident, users might never know when something they say in private is being heard and shared by members of a team sitting in an office somewhere far away.
