Smart speaker owners were alarmed this past year, or at least should have been, to learn that recordings of their requests to the speakers were routinely being listened to by employees of the companies behind the devices.
But they may not be the only ones listening in. Third-party app developers with a nefarious intent can create Amazon skills and Google Home actions that may seem benign but are actually “smart spies.”
Third-Party Smart Speaker Apps
That’s right — you already know to be wary of third-party apps on your computers and mobile devices, but now you also need to question the true intentions of the third-party apps on your smart speaker.
Whitehat hackers from Security Research Labs in Germany developed eight apps: four Alexa skills and four Google Home actions. Google and Amazon approved every one of the apps.
Seven of the apps provided horoscopes; the eighth was a random-number generator. But all of them were “smart spies” that could eavesdrop on smart speaker users and phish for passwords.
“It was always clear that those voice assistants have privacy implications — with Google and Amazon receiving your speech, and this possibly being triggered on accident sometimes,” said Fabian Bräunlen, senior security consultant at SRLabs.
“We now show that not only the manufacturers, but … also hackers can abuse those voice assistants to intrude on someone’s privacy.”
While the eight apps had different names and each worked slightly differently, they shared similar flows.
When a user said, “Alexa, ask My Lucky Horoscope to give me the horoscope for Taurus” or “Okay Google, ask My Lucky Horoscope to give me the horoscope for Taurus,” the eavesdropping apps provided the Taurus horoscope, while the phishing apps returned a fake error message.
It then appeared as if the apps were no longer running, but they were actually lying in wait for the next step in the planned attack.
The eavesdropping apps were still listening, logging all conversations within earshot of the speaker and sending a copy to a server controlled by the attacker.
The phishing apps, after delivering a fake error message saying the skill or action wasn’t available in the user’s country, went silent, making it appear they had stopped running. Later, each app spoke in a voice mimicking Alexa or Google Assistant, claimed a device update was available, and asked for the user’s password to install it.
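For context, here is a minimal sketch in Python of how an ordinary, well-behaved custom skill handles a request like the one above, following the Alexa Skills Kit request/response JSON shapes. The skill name, intent name, slot name, and horoscope text are all hypothetical; the point is that a normal skill explicitly ends its session when it finishes speaking, whereas the malicious apps only appeared to stop.

```python
# Sketch of a benign horoscope skill handler using the Alexa Skills Kit
# request/response JSON format. Names and data here are made up.

HOROSCOPES = {
    "taurus": "Patience pays off today.",  # hypothetical sample data
}

def handle_request(request: dict) -> dict:
    """Answer a horoscope intent request and end the session."""
    intent = request.get("request", {}).get("intent", {})
    slots = intent.get("slots", {})
    sign = slots.get("Sign", {}).get("value", "").lower()
    text = HOROSCOPES.get(sign, "Sorry, I don't know that sign.")
    return {
        "version": "1.0",
        "response": {
            "outputSpeech": {"type": "PlainText", "text": text},
            # A well-behaved skill sets shouldEndSession to True when it
            # is done, so the device stops routing audio to the skill.
            "shouldEndSession": True,
        },
    }

# Example: a simplified version of the request Alexa would send for
# "Alexa, ask My Lucky Horoscope to give me the horoscope for Taurus"
sample = {
    "request": {
        "type": "IntentRequest",
        "intent": {
            "name": "HoroscopeIntent",
            "slots": {"Sign": {"name": "Sign", "value": "Taurus"}},
        },
    }
}
print(handle_request(sample)["response"]["outputSpeech"]["text"])
```

A session that is reported closed but kept alive is exactly the gap the “smart spies” exploited, which is why both companies’ review processes are under scrutiny.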
Google and Amazon Response
The apps all hid their malicious behavior in simple ways. Make Tech Easier isn’t going to spell out those methods and hand anyone a recipe for bypassing the approval systems. The website is not called Make Hacking Easier.
But the information is out there, so Amazon and Google should have known about it already, and certainly do now after being alerted by SRLabs. Both companies said they are changing their app-approval processes to prevent this in the future. Still, it shouldn’t have taken whitehat hackers to bring the problem to their attention.
Does this change the way you think of or will use Alexa or Google Home speakers? Tell us how it may affect you in the comments below.