Alexa recorded a couple's private conversation and sent it to a contact

Alexa's most human feature yet: Betrayal


As voice assistants make their way into our homes, constantly monitoring chatter for a wake word so they can act on your simple demands, you may have a little voice in the back of your head wondering whether they're ever listening in on your private conversations.

For a couple in Portland, that became reality. A woman named Danielle told KIRO 7 that Amazon's Alexa recorded a private conversation between her and her husband and sent it to a contact as an audio message.

Essential reading: How to delete Alexa's voice history

Danielle's home is outfitted with smart technology, and features an Alexa-enabled product in each room to control the temperature, lights and security system. One day her husband's employee contacted him and told him to "unplug your Alexa devices right now."

"We unplugged all of them and he proceeded to tell us that he had received audio files of recordings from inside our house," she said. "At first, my husband was, like, 'no you didn't!' And [the recipient of the message] said 'You sat there talking about hardwood floors.' And we said, 'oh gosh, you really did hear us.'"

Danielle and her husband got the audio messages and repeatedly contacted Amazon, which sent an Alexa engineer to investigate. Danielle said the engineer went through the logs and confirmed the incident, and then apologized about 15 times in 30 minutes.

Amazon told Danielle that the issue was rare, and that it would fix it. The explanation given to Danielle was that the device "just guessed what we were saying." In a statement to The Ambient, Amazon explained what happened.

"Echo woke up due to a word in background conversation sounding like 'Alexa.' Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right.' As unlikely as this string of events is, we are evaluating options to make this case even less likely."

Amazon offered to "de-provision" Alexa's communications abilities for Danielle so that they could continue to use its smart home features, but Danielle and her husband are instead looking for a full refund on all their Alexa-enabled devices. Amazon thus far has declined.

Danielle and her husband have unplugged all the devices and don't plan to plug them back in; they've lost trust in Alexa. That's the fine line companies will have to walk as they continue to put smart assistants in every product. It was probably an honest mistake, but just as we feel betrayed when another person reveals private details without permission, the same goes for AI assistants. Humans can repent and take actions to show they've learned their lesson; it's a little harder for companies and AI to do the same.

Update: This post has been updated with Amazon's statement on the incident.

