Amazon employees who are tasked with reviewing Alexa data can access users' locations, including home addresses, according to a new report.
Amazon isn't the only company using humans to improve its AI, but red flags start being raised, as this latest report underlines, when those conversations can be too easily linked to users' personal information.
According to Bloomberg's sources, some members of the review team are given latitude and longitude data along with the audio recordings. Bloomberg witnessed a demonstration where this data was put into Google Maps and used to locate the user. The employee reportedly went from listening to the audio file to surfacing the user's address "in less than a minute".
Responding to the story, an Amazon spokesperson told us in a statement: "Access to internal tools is highly controlled, and is only granted to a limited number of employees who require these tools to train and improve the service by processing an extremely small sample of interactions. Our policies strictly prohibit employee access to or use of customer data for any other reason, and we have a zero tolerance policy for abuse of our systems. We regularly audit employee access to internal tools and limit access whenever and wherever possible."
Companies like Amazon and Apple use humans to annotate recordings in order to improve the AI, but in Apple's case this data is fully anonymized. This is where Amazon is reportedly falling short: it is too easy to identify users from their recordings. And however highly controlled Amazon says access is, tools like these always carry the potential for abuse.
Bottom line: Amazon needs to show it's doing better. According to Bloomberg, the company did in fact limit some employees' access to certain software tools following the previous report detailing the workings of the review team.
If you want to stop sharing your voice recordings with Amazon, here's how to do it.