In the wake of backlash over a Guardian report that exposed employees who were tasked with analysing Siri recordings for accuracy and quality, Apple has announced it is temporarily suspending the programme as it decides how to proceed.
In a statement to TechCrunch, an Apple spokesperson said the company is “committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally.”
Apple added that users will have the ability to choose whether they want to participate in the programme as part of an upcoming software update.
The Siri grading process was exposed in July when one of the contractors contacted The Guardian claiming that they “regularly hear confidential medical information, drug deals, and recordings of couples having sex” as part of their job. Apple explained to The Guardian that the data collected “is used to help Siri and dictation… understand you better and recognize what you say”.
Apple also said the recordings are anonymised and represent less than 1 per cent of daily Siri activations. It added that recordings were “not associated with the user’s Apple ID”, though the employee said they “are accompanied by user data showing location, contact details, and app data”.
According to the ‘whistleblower’, recordings routinely contain snippets of conversations captured by accidental triggers of the “Hey Siri” wake word. It’s unclear whether these recordings are supposed to be deleted before they reach employees’ ears. It’s also unknown how long Apple has been running the grading programme.
That’s the right response. Customers should be aware that their Siri recordings may be listened to, and part of Apple’s privacy push should be the ability to keep your data to yourself. We’d also like to see an easier way to see and delete your Siri history, as well as a better way to filter out accidental recordings, but for now, a toggle is a good start.