It’s every smart home speaker owner’s worst nightmare: a private conversation recorded and sent to a recipient without the owner’s knowledge. But that’s what happened to a Portland woman, who told CBS News affiliate KIRO-TV that her Amazon Echo recorded an audio clip of her conversation and sent it to a person on her contact list.
The woman, who declined to provide her last name to KIRO-TV, said she was alerted to the bug when the recipient of the message — one of her husband’s employees — called her home to warn her that she’d been “hacked.”
An initial investigation by an Alexa engineer turned up no leads, according to the woman. But the engineer speculated that the Echo speaker “guessed” the command to send a message via Alexa Voice Messaging without asking for verbal confirmation. Normally, Alexa and Google Assistant — which has similar messaging capabilities — alert users when they’re about to send an audio message.
“[Amazon] said, ‘Our engineers went through your logs, and they saw exactly what you told us; they saw exactly what you said happened, and we’re sorry,’” said the woman. “He apologized like 15 times in a matter of 30 minutes, and he said, ‘We really appreciate you bringing this to our attention; this is something we need to fix!’”
The internet retailer said it offered to “de-provision” the communication features of the woman’s Echo speaker so that she could continue using its “smart home” features without concern that her voice would be captured and transmitted. The woman, however, is seeking a refund.
“I felt invaded,” she told KIRO-TV. “A total privacy invasion. Immediately, I said, ‘I’m never plugging that device in again because I can’t trust it.’”
We have contacted Amazon for more information and will update you if we hear back.
Update at 2:20 p.m. Pacific: “Echo woke up due to a word in background conversation sounding like ‘Alexa,’” an Amazon spokesperson told VentureBeat. “Then, the subsequent conversation was heard as a ‘send message’ request. At which point, Alexa said out loud, ‘To whom?’ At which point, the background conversation was interpreted as a name in the customer’s contact list. Alexa then asked out loud, ‘[contact name], right?’ Alexa then interpreted background conversation as ‘right.’ As unlikely as this string of events is, we are evaluating options to make this case even less likely.”