Third-party Amazon Echo and Google Home apps are a minefield of scary security flaws

7comments
Third-party Amazon Echo and Google Home apps are a minefield of scary security flaws
Smart speakers are all the rage right now, with global sales growing at a rate reminiscent of the smartphone industry's early days and two companies rising above the vendor pack to vie for the crown. In their battle for world domination, Amazon and Google are trying everything in their power to stand out and one-up each other, pretty much improving the capabilities of their already impressive virtual assistants every day and constantly expanding their hardware portfolios as well.

Unfortunately, while this intense competition in a relatively new market with huge potential for long-term growth has created the ideal environment for rapid development of innovative experiences and applications, the security and privacy of users may have been seriously neglected. To their credit, both Google and Amazon seem to be limiting the data their voice-controlled devices can collect and the possibilities of mishandling said information, but a number of issues and concerns are still standing.

One very delicate matter that hasn't received a lot of media attention involves the vetting process of so-called Alexa "skills" and Google Home "actions." These features that can be added to the two companies' smart home devices via official stores are developed by third parties, which makes them susceptible to outside attacks.

Be careful what your voice assistant asks you to do


Generally, users ask Alexa or Google Assistant questions, as well as give commands, set alarms, and so on. But as evidenced in a pair of videos uploaded to YouTube by a company called SRLabs, your AI-powered assistant could occasionally ask you to do some stuff instead of the other way around. Or at least that's what you might be led to believe by a malicious Echo or Google Home app.

Video Thumbnail

Security Research Labs, which specializes in, well, security research developed such an app with the intent to reveal the shocking vulnerabilities of both Alexa and Google Assistant-enabled devices. As it turns out, it was indeed extremely easy to plant basic Alexa skills and Google Home actions that could "vish" (voice phish) users' passwords by pretending to be legitimate "lucky horoscope" services with regional restrictions of some sort.

Recommended Stories
Video Thumbnail

Said apps could trick users into believing their devices and not the actual apps in question were requesting vocal password verification to install software updates. Of course, that's not how updates work on smart speakers, but this is just one example of a security breach that could be exploited in a very serious way. Other examples include asking for an email address corresponding to said password or even financial information.

Eavesdropping is also incredibly easy


Another test that SRLabs ran to verify the strength of Google and Amazon's app approval mechanisms consisted of manipulating the same innocent-looking horoscope "skill" and "action" into listening in on conversations when theoretically deactivated. Predictably enough, simply asking a third-party app to "stop" giving you previously requested information may not stop the listening process as well.

Video Thumbnail
 

In this case, hacking an Echo and a Google Home work a little differently, but the end result is equally scary. Anything you say around these devices can be used against you in a number of ways that we honestly don't want to talk about in a lot of detail.

Video Thumbnail

Google and Amazon are putting "mechanisms in place" to improve security


Technically, that's only Google's reaction to these troubling revelations made by SRLabs, but Amazon offered Ars Technica a similar albeit less vague statement in which the company highlighted "mitigations" are already in place to "prevent and detect this skill behavior" from now on.

By the way, what SRLabs did was not purely theoretical, publishing malicious apps that passed Google and Amazon's approval process while obviously not exploiting the potential of the skills and actions before contacting the two tech giants and removing the phishing and eavesdropping apps. 

Based on what Google and Amazon are saying, it should no longer be possible for such blatant security vulnerabilities to go unnoticed in the future, but it's probably better to be careful what you download anyway. And remember, don't do anything sketchy that your voice assistant might ask.

Recommended Stories

Loading Comments...
FCC OKs Cingular\'s purchase of AT&T Wireless