Recently the Guardian reported that, thanks to “Siri,” Apple contractors have access to private conversations and personal information. This includes medical information, criminal behavior and even the sound of people having sex.
According to the report:
“Although Apple does not explicitly disclose it in its consumer-facing privacy documentation, a small proportion of Siri recordings are passed on to contractors working for the company around the world. They are tasked with grading the responses on a variety of factors, including whether the activation of the voice assistant was deliberate or accidental, whether the query was something Siri could be expected to help with and whether Siri’s response was appropriate.”
In other words, whether Apple intended it or not, Siri serves as a covert surveillance tool. Apple stated for the record that the private data “is used to help Siri and dictation … understand you better and recognize what you say.” That, of course, doesn’t change the fact that Apple users’ most private information is being recorded and listened to by strangers.
Unsurprisingly, the Guardian story lit up the internet, and Apple found itself defending a frankly indefensible policy. Facing significant backlash, and apparently recognizing that the policy is indeed indefensible, Apple has reportedly suspended the program.
“Apple says it will review the process that it uses, called grading, to determine whether Siri is hearing queries correctly, or being invoked by mistake,” TechCrunch reports. “In addition, it will be issuing a software update in the future that will let Siri users choose whether they participate in the grading process or not.”
I’m sure this update will transparently spell out what precisely the “grading process” entails so that users can make a truly informed decision. It definitely won’t be misleading or opaque at all, and there definitely won’t be any fine print …
Here’s the standard-issue corporate mumbo jumbo from Apple:
“We are committed to delivering a great Siri experience while protecting user privacy. While we conduct a thorough review, we are suspending Siri grading globally. Additionally, as part of a future software update, users will have the ability to choose to participate in grading.”
Talk to Siri at your own risk. Or, you know, use the keyboard.