Conversations with Google Home or Amazon Alexa have never been strictly confidential — both companies have admitted that they send some audio snippets to workers who listen to voice recordings to help improve the software.
But a group of whitehat hackers have now demonstrated that third-party apps hosted by Google Home or Alexa can also log users' conversations, even after tricking users into thinking the apps aren't active.
Developers at Germany's Security Research Labs created four Alexa "skills" and four Google Home "actions" that pose as astrology apps or random number generators but are designed to secretly listen to people's voice and send a transcript back to third-party servers. Certain versions of the app mimic Alexa or Google Assistant, pretending to offer a software update and asking users to input their password.
All eight of the apps passed Amazon or Google security checks, meaning they could have been made available for public download on either platform, according to the researchers.
"Customer trust is important to us, and we conduct security reviews as part of the skill certification process," an Amazon spokesperson told Business Insider. "We quickly blocked the skill in question and put mitigations in place to prevent and detect this type of skill behavior and reject or take them down when identified. It's also important that customers know we provide automatic security updates for our devices, and will never ask them to share their password."
A Google spokesperson told Business Insider that the company is taking steps to prevent similar issues going forward.
"All Actions on Google are required to follow our developer policies, and we prohibit and remove any Action that violates these policies. We have review processes to detect the type of behavior described in this report, and we removed the Actions that we found from these researchers. We are putting additional mechanisms in place to prevent these issues from occurring in the future," the Google spokesperson said.
Here's how the apps work: First, they gave users the expected message — either a randomly generated number or a brief horoscope. Next, the apps go silent, giving users the impression that the software has closed, while still listening to conversations and sending a copy of transcripts to a third-party server.
The malicious apps can also impersonate Alexa or Google Home to ask users for sensitive information. As demonstrated in the videos below, the apps give the impression that the software has closed, then impersonate Alexa to prompt users to input their password to download a software update.
The researchers have already taken the apps offline and said they have privately reported their findings to Google and Amazon.
This article originally appeared on Business Insider. Follow @BusinessInsider on Twitter.
- Amazon's new patent shows just how much creepier Alexa can get ›
- 13 Alexa-enabled smart home gifts that tech people will love ... ›
- Amazon Alexa records private conversation and sends it to a friend ... ›
- Google's culture has major problems, according to new report ... ›
- Google honors Apollo 11 software developer Margaret Hamilton ... ›