Apple lets contractors listen to “a small portion of Siri requests” in order to “improve Siri and dictation,” the company acknowledged after a whistleblower working for the tech giant reported that Apple contractors routinely hear highly sensitive information that was accidentally recorded by the digital assistant.
While Siri should only be activated when a user utters the phrase “Hey Siri,” the assistant often mistakenly responds to other phrases and sounds, including the sound of a zip. The whistleblower states that, as a result, “there have been countless instances of recordings featuring private discussions between doctors and patients, business deals, seemingly criminal dealings, sexual encounters and so on.” While Apple maintains that those recordings “are not associated with the user’s Apple ID,” the whistleblower counters that they constitute a serious privacy violation because they “are accompanied by user data showing location, contact details, and app data.”
Earlier this month, an investigation by Belgian public broadcaster VRT NWS showed that Google also lets human workers listen to audio captured by Google Assistant software. That report similarly found that a significant number of analyzed audio snippets were recorded by accident.