Amazon's fall devices event was both poorly and excellently timed. Poorly timed because Amazon was trying to sell us more microphones following a string of privacy scares around how voice companies were handling our data.
Excellently timed because it gave Amazon an opportunity to prove it was listening by rolling out new privacy features for Alexa.
The tech giant introduced an auto-delete option, which customers can opt in to so that old recordings are automatically wiped from Amazon's servers, along with new utterances such as "Alexa, tell me what you heard" and "Alexa, why did you do that?" to offer more transparency into why Alexa behaves the way it does.
David Limp, Amazon's senior vice president of devices, kicked off the event by emphasizing the corporation's commitment to privacy: "Privacy cannot be an afterthought when it comes to the devices and services we offer our customers. It has to be foundational and built in from the beginning for every piece of hardware, software, and service that we create."
He also reiterated that Alexa was the first voice assistant to allow customers to opt out of having their voice recordings annotated by humans, although both Google and Apple have since moved to the preferable opt-in model.
But do such add-ons go far enough to protect users' privacy and data, or are they mostly an easy way to placate the pitchfork-wielding crowd? "Companies should be treating privacy more as a feature rather than a compliance issue," says Florian Schaub, an assistant professor in the School of Information at the University of Michigan.
"Building these features into how your system works creates an opportunity to make them less obtrusive and part of how you use a product."
Today, over a quarter of American adults own a smart speaker, and an estimated 2.5 billion voice assistants were in use at the end of 2018. While Google and Apple have made some headway into smart speaker market share, Amazon is still well ahead of the pack.
Alexa, delete what I just said
As the popularity of smart speakers and voice assistants in U.S. households continues to grow, so do major concerns about technology invading our privacy while we're at home. From tech workers listening to confidential recordings to children's devices being accused of violating privacy laws, the controversies related to smart home privacy have made headline after headline.
But these companies are now responding by giving customers more control over their privacy.
"While Amazon is moving in the right direction regarding Echo security, there is still a valid concern that the response is very reactionary," says Mark Williams, a healthcare expert at PA Consulting.
Limp described the new features, most notably an auto-delete option that gives users the ability to have voice recordings older than 3 or 18 months automatically deleted from Amazon's servers on an ongoing basis. This builds on Amazon's previous changes from May, which allowed customers to say, "Alexa, delete everything I said today" or "Alexa, delete what I just said."
"It's useful that Amazon now introduces this auto-deletion feature, but there's a question of trustworthiness. Will it actually work?" says Schaub. "And you have to activate this setting, which is surprising to me. If people want to keep their recordings longer, then they can do it. I don't quite understand why it's opt-in rather than just default for everyone."
Users first have to be aware that such an auto-delete option exists, and second, they must take the steps to open the Alexa Privacy Hub website or app to change the device's default settings.
I think it's a band-aid that's covering up the bigger issue here
In a research study from last year, Schaub and his colleagues found that people who owned smart speakers rarely used privacy controls and had an incomplete understanding of privacy risks. So he wonders about the likelihood of customers actually activating the auto-delete function.
The second new feature aims to increase transparency around what Alexa is doing. Customers can say, "Alexa, tell me what you just heard" to hear their last voice request, or "Alexa, why did you do that?" to get a brief explanation of the device's last response. Jessica Vitak, an associate professor in the College of Information Studies at the University of Maryland, isn't too convinced about the utility of these voice commands.
"I think it's a band-aid that's covering up the bigger issue here," says Vitak. "While it could be useful in situations where the device is accidentally triggered, to see what it actually recorded in that space, I feel like the main purpose is to build trust with the end user.
"But the reality is that the average consumer is not the expert on understanding technology, which is presented in a way that focuses exclusively on the benefits to you and downplays what is happening with your data."
Vitak and her colleagues reported in a recent study that users of voice assistants had lower levels of privacy concerns than non-users, and users also had high confidence that the companies behind the technology would ensure the use of their devices was private, safe, and secure.
"The home is considered one of the last private spaces. With devices like Alexa and Google Home, we're inviting technology into that private space, and sometimes it's a very private space, like a bedroom," says Vitak. "Now we have that data being captured, stored, perhaps even analyzed, and it's not clear how protected that data is."
In April, it came to light that Amazon employs human reviewers who listen to voice recordings captured in homes and offices by Echo smart speakers. A few months later, a contractor leaked more than 1,000 recordings of Google Assistant users, and Apple contractors reported regularly hearing confidential information like doctor-patient discussions, drug deals, and people having sex.
"These voice assistants and smart speakers are based on having a microphone that is always on and always listening," says Schaub. "While these systems all have activation keywords like 'Hey Alexa,' there is an issue of false activation and recording when they are not supposed to be recording."
Experts also noted that the onus is on Amazon and other corporations to help their users understand what data is being collected, how it is being used, and what kind of control users have over the whole process. People should understand the risks of having smart speakers and voice assistants in the home just as well as they understand all the benefits and cool features.
"If companies would be more proactive in making privacy the default and show how data is being protected, rather than having these privacy add-ons that are only relevant if you're tech-savvy enough, they could get more people to adopt the technology."