Amazon Alexa users can now choose whether human reviewers listen to recordings of their exchanges with the AI assistant, a company spokesperson told VentureBeat. Like Siri, Cortana, and Google Assistant, Alexa records every exchange with users of Echo speakers and similar devices as a way to gauge how successful the assistant is in fulfilling user commands and improve performance of its natural language understanding AI.
Bloomberg reported in April that Amazon uses human annotators and reviewers to verify Alexa's interpretation of user commands, news that drew the ire of privacy advocates and the wider public. Since then, news reports have highlighted similar practices by other tech giants.
Earlier this week, Google and Apple both pledged to suspend some of their voice data review by people. Following the leak of Google Assistant user voice recordings last month, Google said human reviewers listen to voice recordings to improve the assistant’s ability to recognize accents, languages, and dialects.
In a statement provided to VentureBeat about the change introduced Friday, an Amazon spokesperson said:
We take customer privacy seriously and continuously review our practices and procedures. For Alexa, we already offer customers the ability to opt-out of having their voice recordings used to help develop new Alexa features. The voice recordings from customers who use this opt-out are also excluded from our supervised learning workflows that involve manual review of an extremely small sample of Alexa requests. We’ll also be updating information we provide to customers to make our practices more clear.
To opt out of having humans review voice recordings collected from exchanges with Alexa, open the Alexa app, go to Settings, tap the Alexa Privacy link, and choose Manage How Your Data Improves Alexa.
Users can also delete their voice recordings via the Alexa app or Amazon website.
The news follows Amazon’s introduction of an “Alexa, delete what I said today” voice command in May. In recent months, lawmakers in state legislatures in Illinois and California proposed legislation that would require makers of AI assistants to receive consent before recording user interactions.