Voice is becoming a primary interface. It’s in many of the technologies we use on a daily basis, like our home appliances, cars, and mobile apps. We can turn off the lights, order takeout, buy our weekly groceries, or listen to our favorite albums, all by using one of the most natural interfaces of all: voice. This is made possible by smart speakers such as Amazon’s Alexa-powered Echo and Google Home.

The convenience these devices bring is boundless, but just how safe is it to set these unassuming devices on our bedside tables or in our living rooms so they can listen to our every word?

A closer look at smart speakers

Voice recognition technology like Apple’s Siri has been around for a while. However, smart speakers are game changers. Based on voice-activated artificial intelligence, smart speakers connect to third-party Internet of Things devices, such as your thermostat or car doors, enabling you to control your environment using your voice. These speakers want to be your virtual assistant; they transform the way you interact with your home, smart devices, and even your favorite brands.

Your voice is only processed in the cloud if you say a specific trigger word. Smart speakers are designed to wake up and record as soon as they hear one of their activation words — or something that merely sounds like one — which means there could be instances where the device stores conversations without the user’s knowledge. One prosecutor even issued a search warrant to see if a suspect’s Echo contained evidence in a murder case.

Furthermore, another device can activate a smart speaker simply by playing the trigger word. Burger King took advantage of this in its recent TV ad, which just won the prestigious Grand Prix at Cannes Lions. At the end of the ad, the actor says “OK Google, what is the Whopper burger?”, triggering Google Home to wake up and recite the Whopper’s Wikipedia description. All this leads us to ask: Just how private can a home with voice-activated microphones really be?

Is your privacy at risk?

Smart speakers are equipped with a web-connected microphone that is constantly listening for a trigger word. When a user triggers a smart speaker to make a request, the device sends the command to a server that processes the request and formulates a response. The device stores audio clips remotely, and with both Amazon’s and Google’s devices, you can review and delete them online. However, it is not clear whether the data stays on servers after it is deleted from the account.
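The flow described above — listen locally, forward a request to the cloud only after a wake word, store the clip remotely until the user deletes it — can be sketched in a few lines. This is a purely illustrative simulation: the class, method names, and wake-word list below are hypothetical and do not reflect Amazon’s or Google’s actual software.

```python
# Illustrative sketch of the wake-word pipeline (hypothetical names, not a real vendor API).
WAKE_WORDS = ("alexa", "ok google")

class SmartSpeaker:
    def __init__(self):
        self.cloud_log = []  # stands in for the vendor's remote audio store

    def hear(self, utterance):
        """Handle one utterance; respond only if it begins with a wake word."""
        lowered = utterance.lower()
        trigger = next((w for w in WAKE_WORDS if lowered.startswith(w)), None)
        if trigger is None:
            return None  # no wake word: the audio never leaves the device
        command = lowered[len(trigger):].strip(" ,")
        self.cloud_log.append(command)  # the request is stored remotely
        return f"Processing: {command}"

    def delete_history(self):
        """Mimics deleting stored recordings from the account dashboard."""
        self.cloud_log.clear()

speaker = SmartSpeaker()
speaker.hear("What's the weather?")         # ignored: no wake word heard
speaker.hear("Alexa, turn off the lights")  # forwarded and logged remotely
```

Note that in this sketch, as with the real devices, everything said without the wake word is simply discarded; only triggered requests end up in the remote log that users can later review and delete.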

At the moment, devices only record requests, but as their capabilities become more advanced, there could be more sensitive data collected, like transcripts from phone calls and dictation for emails. So where will this data be stored?

Also, can hackers exploit vulnerabilities in these devices and listen to what you’re saying? Well, nothing is impossible, but both Google and Amazon have taken precautions against wiretapping. In addition, the audio sent to their data centers is encrypted in transit, meaning that even if your home network were compromised, it is unlikely a hacker could use your smart speaker as a listening device. The bigger risk is someone getting hold of your Amazon or Google password and browsing your interaction history, so make sure you use a strong password — you could even enable two-factor authentication.

Protecting yourself

If the potential for your smart speaker to listen in on your conversations still makes you uneasy, you can manually mute the device or change your account settings to make it more secure. Settings you can turn on for increased privacy include password protection for purchases and audible notifications that let you know when the speaker is active and recording. You can also log in to your Amazon or Google account and delete your voice history (either individually or in bulk).

  • To do this for your Google device, head over to myactivity.google.com, click the three vertical dots in the “My Activity” bar, and hit “Delete activity by” in the drop-down menu. Click the “All Products” drop-down menu, choose “Voice & Audio,” and click to delete.
  • For Amazon’s speaker, go to amazon.com/myx, click the “Your Devices” tab, select your Alexa device, and click “Manage voice recordings.” A pop-up message will appear, and all you need to do is click “Delete.”

However, please note that deleting your history on your smart speaker may affect the personalization of your experience.

Developers could also build on privacy-by-design voice assistants, such as Snips, which process voice data locally rather than in the cloud. However, functionality may be more limited precisely because these devices do not rely on an internet connection.

The privacy/convenience tradeoff

Given the pace at which the smart speaker and IoT industries are evolving, it’s safe to assume these devices will become more and more present in our daily lives. That makes it essential to understand how they work and what you can do to keep them from breaching your privacy.

Yes, smart speakers could theoretically pose a threat to privacy. However, they are not terribly intrusive, as they only record when triggered by a specific word or phrase, and the likelihood of someone intercepting a private conversation that was accidentally recorded is low. Google, Amazon, and other companies have logged our web activity for years; now they are starting to collect voice snippets as well. In the pursuit of convenience, privacy is sometimes sacrificed. In this situation, convenience comes out on top for most users.

Hicham Tahiri is CEO at SmartlyAI, a platform for creating, deploying, and monitoring conversational applications.