Apple contractors hear sensitive Siri recordings, new report reveals

Company accused of being reckless with how it manages sensitive data

The way tech companies analyze the voice data collected by their virtual assistants is becoming a hot-button issue. The latest company in the spotlight is Apple, with a new report detailing how contractors often hear sensitive recordings captured by Siri.

A whistleblower speaking to The Guardian revealed that contractors regularly hear sensitive information while analyzing Siri requests. The source also expressed concerns over the frequency of accidental activations that pick up this information.

The fact that Apple uses humans to analyze voice recordings isn't news, although there's a valid argument that the company doesn't disclose the practice as clearly as it should. What is news, however, is that it may be handling that data recklessly.

According to the source, contractors have heard recordings of private business deals, medical discussions, drug deals, and people having sex – all of which were clearly made unintentionally.

Contractors will flag a recording when they believe it was triggered by mistake, but the fact that these recordings reach human ears at all will be enough to concern many users. According to The Guardian's source, the primary culprits for accidentally triggering Siri are the HomePod, Apple Watch wearers raising their wrists (which wakes Siri), and, oddly, the sound of a zip, which the assistant often mistakes for its wake phrase.

“The regularity of accidental triggers on the watch is incredibly high,” said the source. “The watch can record some snippets that will be 30 seconds – not that long but you can gather a good idea of what’s going on.”

Apple told The Guardian that fewer than 1% of recordings made by Siri end up being listened to by contractors, and that all recordings are stripped of the user's Apple ID, something the company has told us in the past. However, the report goes on to note that recordings are accompanied by "location, contact details, and app data." We've asked Apple to elaborate on this.

But even with the Apple ID stripped away, accidental recordings can still include details that could be used to identify the people talking. “Apple is subcontracting out, there’s a high turnover,” the source told The Guardian. “It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”

The source added that there are no specific procedures for dealing with sensitive recordings, either.

The report follows similar revelations about how Amazon and Google handle voice data. Most recently, Google was forced to apologize after a contractor leaked more than 1,000 Google Assistant audio recordings, many of which were clearly recorded by mistake. Amazon has also come under fire for not sufficiently anonymizing voice data collected by Alexa.
