Bug allowed some iPhone users to have Siri interactions shared with Apple despite opting out
Back in the summer of 2019, Amazon, Google, and Apple were all hit with charges claiming that their Alexa, Google Assistant, and Siri digital assistants were recording personal conversations and having them transcribed by third-party firms. All three companies admitted that these recordings were being made to improve the performance of the digital assistants. In Siri's case, some of the recordings collected by Apple included couples having sex or speaking frankly with their doctors about private medical problems.
At the time, both Amazon and Google allowed users to opt out while Apple didn't. But these days, customers of all three firms can opt out of having their convos used to help train Alexa, Google Assistant, or Siri.
Some iPhone users who opted out of sharing their Siri interactions were somehow opted back in
However, the release of iOS 15.4 beta 2 fixes a bug that ZDNet says might have been used to record some iPhone users' interactions with Siri. Once you update your iPhone to iOS 15.4 (currently still in beta), you will be asked whether you want to help improve Siri and dictation by allowing Apple to review recordings of your interactions with the digital assistant. Opting out prevents your voice interactions with Siri and the iPhone's voice-dictation feature from being recorded and sent to Apple.
A whistleblower revealed that one of the devices that captured users' conversations was the HomePod
But as it turns out, a bug found in iOS 15 enabled the feature even for those who had opted out. So while you might have thought that your flirting with Siri was between you and the digital assistant, Apple was still able to listen to some users' private or even X-rated conversations.
In iOS 15.2, after Apple discovered the bug, it disabled the setting that allowed Apple to make these recordings and also got rid of the bug that automatically allowed the recordings to be made even when the user opted out. Commenting on the situation, Apple said, "With iOS 15.2, we turned off the Improve Siri & Dictation setting for many Siri users while we fixed a bug introduced with iOS 15."
Apple added, "This bug inadvertently enabled the setting for a small portion of devices. Since identifying the bug, we stopped reviewing and are deleting audio received from all affected devices."
When stories about Apple keeping recordings of customers' interactions with Siri first made the rounds over two years ago, Apple said less than 1% of Siri activations were passed along to third-party contractors whose job it was to determine whether the assistant was activated by the user, or was accidentally summoned. Siri was also graded on whether it responded appropriately to the user's query.
A whistleblower said Siri would activate with the sound of a zipper
The initial reports cited information from a whistleblower who worked for one of the contractors hired by Apple. The source said that Siri would sometimes mistakenly activate because the assistant would think that the "Hey Siri" wake word was said when in reality it was not. In a bizarre admission, the whistleblower said that the sound of a zipper would sometimes activate Siri.
The sound of a zipper would supposedly activate Siri
The anonymous whistleblower pointed out that most false Siri activations came from the Apple Watch and the HomePod smart speaker. He also left a quote that gives us a good idea about some of the content that the contractors were listening to while trying to grade Siri.
"The regularity of accidental triggers on the watch is incredibly high. The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on...you can definitely hear a doctor and patient, talking about the medical history of the patient," said the whistleblower.
"Or you’d hear someone, maybe with car engine background noise – you can’t say definitely, but it’s a drug deal … you can definitely hear it happening. And you’d hear, like, people engaging in sexual acts that are accidentally recorded on the pod or the watch."
Hopefully, with the bug squashed in iOS 15.2, when iOS 15.4 arrives and you're asked whether to opt in or opt out of the program to improve Siri (using your personal conversations), your toggle will stay disabled if that is indeed the option you choose.