"We know that customers have been concerned by recent reports of people listening to audio Siri recordings as part of our Siri quality evaluation process — which we call grading. We heard their concerns, immediately suspended human grading of Siri requests and began a thorough review of our practices and policies," said Apple in today's statement.
Accompanying the apology is a list of changes Apple is implementing immediately. Most importantly, Apple will no longer keep recordings of Siri interactions by default. Instead, users will have the option of letting Apple use their voice data to help improve Siri — and they can opt out at any time. And lastly, for those who do opt in, only Apple employees (as opposed to contractors) will be allowed to listen to recorded audio from now on.
It's worth noting that, according to Apple, the audio recordings that were subjected to this grading process make up less than 0.2% of all Siri interactions, and that, though Siri requires some user data to execute commands, the data Siri handles is pretty well-protected as far as voice assistants go. For example, if you ask Siri to read aloud your text messages, Siri will access them without retaining their contents on Apple's servers. And personal user data is tied to a random identifier rather than the user's Apple ID or phone number; after six months, that data is detached from the random identifier and associated with a new one.