How to Stop Alexa Gossiping About You

Should we be worried about this invasion of privacy?


One of Amazon’s voice-activated devices has allegedly emailed a private conversation to a random contact in the user’s address book.

It seems improbable, but Alexa -- the software inside the Echo -- misheard its commands four times in a row.

According to the user, named Danielle, the device misheard something in a conversation between her and her husband as its wake word, woke up, and began recording. It then asked who to send the recording to, picked out a name from the speech it heard, and matched that sound to a person in her contact list.

Then it misinterpreted what it heard next as confirmation to send the recording of their conversation, which was -- fortunately -- fairly mundane. This sequence of events appears to have been confirmed by Amazon in a statement.

The family had Echo devices all over the house as part of their smart home system, but now say they’ve unplugged them. The chances of this happening randomly are undoubtedly low, but should we be worried about this invasion of privacy?

Smart Home or Surveillance?

Assistants like Amazon’s Alexa (built into the Echo), Apple’s Siri, and Google Assistant (which runs on Google Home devices) are all voice-activated. They constantly listen out for their ‘wake word’, then respond to whatever follows it. But they do make mistakes.

Even if you never deliberately ask the device to record you, it will record whenever it thinks it has heard its wake word. The recording is then sent to the cloud for processing, where it may be used to train the company’s algorithms or to improve recognition of your voice in the future.
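
To make that listening behaviour concrete, here’s a minimal, purely illustrative sketch in Python of how a wake-word loop works in principle. Every name and detail in it is an assumption made for illustration -- real assistants analyse audio with on-device models rather than matching text -- but the shape of the flow is the same: nothing is uploaded until the device thinks, rightly or wrongly, that it has heard its wake word.

```python
# A purely illustrative sketch of a wake-word loop -- not Amazon's or Google's
# actual code. Real assistants run audio models on-device; this toy version
# uses plain text so the flow is easy to follow.

WAKE_WORD = "alexa"

def send_to_cloud(snippet: str) -> None:
    """Stand-in for the upload step: whatever is captured after the wake word
    leaves the device for server-side processing."""
    print(f"Uploading to cloud: {snippet!r}")

def listen(stream):
    awake = False
    for snippet in stream:
        if not awake:
            # The device discards audio until it thinks it hears the wake word.
            # A false positive (e.g. a similar-sounding word on TV) also wakes it.
            awake = WAKE_WORD in snippet.lower()
        else:
            # Everything heard after waking is recorded and uploaded.
            send_to_cloud(snippet)
            awake = False

listen([
    "the weather is nice today",
    "Alexa, what's on my calendar?",     # deliberate command: wakes the device
    "meeting at ten, then the dentist",  # this is what gets uploaded
])
```

The point of the toy example is simply that the decision to upload rests on the wake-word detection step, so a misdetection is all it takes for a private snippet to leave the device.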

If you have any voice-activated devices in your home, you’ve probably activated them by accident, or heard them spring to life in response to something on the TV. There’s also a chance that any device could be hacked, which is rarer but theoretically riskier.

These companies are not out to get you, but the chance that mistakes can happen should make you wary.

How to Delete Voice Recordings

For those of us who already use voice-activated devices, there is a way to clean up old recordings so that they can’t be leaked or stolen in a future breach.

If you use a Google device, you’ll find your recordings in the Voice & Audio section of your Activity History. Amazon provides similar controls in its Alexa Privacy section. There are also instructions available for Cortana and Siri users.

But that only deletes your history. If you’re very privacy-conscious, you probably shouldn’t have any voice-activated devices in your home at all. At the very least, it’s a good idea not to place listening devices in rooms where you might have sensitive conversations -- such as your home office, and definitely not your bedroom.


If you have a voice-activated assistant on your phone, it may be switched on by default. You can disable it in the device settings so that it doesn’t pick up your voice accidentally when it’s in your pocket. If you aren’t using the feature, it’s better to be safe than sorry.
