Google Assistant no longer stores recordings of users’ voices by default, the company announced today. VentureBeat has reached out to Google for details about when its policy on storing voice recordings was actually changed.
Now people who set up Google Assistant will have to opt in to the new Voice & Audio Activity (VAA) program if they want their voice recorded or reviewed by humans. This data can be used to improve Google Assistant’s conversational AI or Google’s ability to recognize a person’s voice. Existing Voice Match users will be prompted to reconfirm that they want Google to store their recordings.
Google Assistant users will also soon be given the option to reduce the sensitivity of the AI assistant’s response to the “Hey Google” command.
The series of changes was introduced after Google paused human review of recordings this summer.
“It’s clear that we fell short of our high standards in making it easy for you to understand how your data is used, and we apologize,” the company said in a blog post.
After hearing a wake word, AI assistants like Cortana, Siri, Alexa, and Google Assistant use voice recordings to improve conversational AI systems trained with labeled and unlabeled data. When the year began, neither the media nor the public knew that humans were listening to interactions recorded by AI assistants, but the practice has come to light in recent months, inflaming fear that smart speakers can be used to eavesdrop on individuals in the privacy of their homes or offices.
The truth became clear this spring, when Bloomberg reported and Amazon confirmed that human contractors in various parts of the world listen to voice recordings after hearing the “Alexa” wake word or similar sounds.
Then in July a third-party contractor leaked Google Assistant voice recordings for users in the Netherlands. A day later, Google confirmed that humans review some recordings; less than 1% of them, the company said. We learned that Siri’s recordings are also reviewed by humans, and Cortana’s as well.
In response, both Apple and Google pledged to halt human reviews, and Amazon allowed users to disable human voice recording reviews.
Lawmakers in states like California and Illinois have this year considered legislation requiring the makers of AI assistants to gain permission from users before recording their voice data.
Read this VentureBeat article for instructions on how to limit or get rid of voice recordings from the makers of popular AI assistants like Alexa, Bixby, and Google Assistant.