Seemingly harmless applications for Google Home and Amazon Echo smart speakers can be used to eavesdrop on unsuspecting users, security researchers with SRLabs have discovered. Both speaker systems allow third-party developers to submit software that creates additional commands for customers, referred to as Google Actions and Alexa Skills. Google and Amazon review the software before it is released to the public, but the SRLabs team was able to get around that process by submitting updates to previously approved apps.
Through its video series, SRLabs shows how hackers could take advantage of flaws in voice assistants to continue listening to a user for an extended period of time or even prompt them to hand over their password. The researchers gave Alexa and Google Home a series of characters it could not pronounce, which keeps the speaker silent but listening for further commands from the user.
“It was always clear that those voice assistants have privacy implications—with Google and Amazon receiving your speech, and this possibly being triggered on accident sometimes,” Fabian Bräunlein, senior security consultant at SRLabs, told ArsTechnica. “We now show that, not only the manufacturers, but… also hackers can abuse those voice assistants to intrude on someone’s privacy.”
In addition, the researchers found vulnerabilities that made it simple to generate a fake error message that then prompts the user to enter their password. The phishing hack is hidden within software that allows a speaker to ask for “today’s lucky horoscope.”
There have been no reports that the security vulnerabilities have been used outside of the research. Prior to publishing its series on the issue, SRLabs turned over their research to Google and Amazon, both of which say they have taken steps to address the problems with the smart speakers.