
Apple contractors “regularly hear confidential info, drug deals and couples having sex” via Siri, claims whistleblower

An Apple contractor claims to have blown the whistle on privacy flaws with the reviewing of Siri recordings

A whistleblower has claimed that contractors working with Apple can listen in on private conversations via Siri.

The unnamed source, who spoke to The Guardian, works as an Apple contractor and wishes to remain anonymous to protect their job.

They revealed that not only do third-party contractors “regularly hear confidential medical information, drug deals, and recordings of couples having sex”, these recordings are linked to information that could be used to identify the user, including location and contact details.

If true, these as-yet unsubstantiated claims appear to jar with what's written in Apple's privacy policy. Apple says that to help Siri recognise a person's pronunciation and give better responses, certain information such as their name, contacts, the music they listen to and previous searches is sent to Apple servers, protected "using encrypted protocols". According to Apple, Siri does not associate this information with a person's Apple ID.

Every day, a sample of recordings is analysed by Apple contractors as part of a process called "grading". This involves contractors listening to the recordings, and Siri's response, before grading the voice assistant for accuracy and relevance. This is designed to help Apple's Siri team improve future responses.

In a statement, Apple insists that only a small portion of Siri requests are reviewed, and that the responses are "analysed in secure facilities and all reviewers are under the obligation to adhere to Apple's strict confidentiality requirements." The company told The Guardian that a "very small random subset, less than 1% of daily Siri activations, are used for grading, and those used are typically only a few seconds long."

The whistleblower voiced concerns that some of these activations are accidental, meaning Siri contractors inadvertently hear private and intimate conversations that should never have been recorded. They described "countless instances" of conversations between doctors and patients, as well as sexual encounters, and said each recording "shows user data on their location and contact details."