The way tech companies analyze the voice data collected by virtual assistants is becoming a hot button issue. The latest company in the spotlight is Apple, with a new report detailing how contractors often hear sensitive recordings captured by Siri.
A whistleblower speaking to The Guardian revealed that contractors regularly hear sensitive information while analyzing Siri requests. The source also expressed concerns over the frequency of accidental activations that pick up this information.
The fact that Apple uses humans to analyze voice recordings isn't news, although there's a valid argument that the company doesn't disclose this process as clearly as it should. What is news, however, is that it may be handling that data recklessly.
According to the source, contractors have heard recordings of private business deals, medical discussions, drug deals, and people having sex — all of which were clearly made unintentionally.
Contractors will note when they believe a recording was made by mistake, but the fact that these recordings are reaching human ears at all will be enough to concern many users. According to The Guardian's source, the primary culprits for accidentally triggering Siri are the HomePod, Apple Watch wearers raising their wrist (which wakes up Siri), and, oddly, the sound of a zip, which the assistant often mistakes for its wake phrase.
“The regularity of accidental triggers on the watch is incredibly high,” said the source. “The watch can record some snippets that will be 30 seconds — not that long but you can gather a good idea of what’s going on.”
Apple told The Guardian that less than 1 percent of all recordings captured by Siri end up being listened to by contractors. All recordings are also stripped of their Apple ID, something the company has told us in the past. However, the report goes on to mention that recordings are accompanied by “location, contact details, and app data.” We've followed up with Apple to elaborate on this.
But even with personal information stripped away, accidental recordings can often include details that could be used to identify the people talking. “Apple is subcontracting out, there’s a high turnover,” the source told The Guardian. “It’s not like people are being encouraged to have consideration for people’s privacy, or even consider it. If there were someone with nefarious intentions, it wouldn’t be hard to identify [people on the recordings].”
The source added that there are no specific procedures to deal with sensitive recordings either.
The latest report follows similar ones about how Amazon and Google handle voice data. Most recently, Google was forced to apologize after a contractor leaked more than 1,000 Google Assistant audio recordings, many of which were clearly captured by mistake. Amazon has also come under fire for not sufficiently anonymizing voice data collected by Alexa.