Karl Bode | Techdirt
I’ll admit that I traditionally haven’t been as paranoid as many people in regards to the surveillance powers of digital assistants like Amazon’s Alexa or Google Home. Yes, putting an always-on microphone in your home likely provides a wonderful new target for intelligence agencies and intruders to spy on you. That said, it’s not as if a universe of internet-of-broken-things devices and smart TVs isn’t doing the same thing, before you even get to the lax-to-nonexistent privacy standards governing the smartphone currently listening quietly in your pocket and tracking your every location.
Still, nobody should labor under the false impression that good opsec involves leaving always-on, internet-connected microphones sitting everywhere around your house.
One Portland family learned this the hard way when their Amazon Alexa unit recorded part of a private conversation and randomly sent it to somebody in the family’s contact list. According to Seattle station KIRO 7, the family was contacted by a coworker who stated that he had been receiving audio files of private conversations that had occurred in the family’s house:
“We unplugged all of them and he proceeded to tell us that he had received audio files of recordings from inside our house,” she said. “At first, my husband was, like, ‘no you didn’t!’ And the (recipient of the message) said ‘You sat there talking about hardwood floors.’ And we said, ‘oh gosh, you really did hear us.'”
Danielle listened to the conversation when it was sent back to her, and she couldn’t believe someone 176 miles away heard it too.
“I felt invaded,” she said. “A total privacy invasion. Immediately I said, ‘I’m never plugging that device in again, because I can’t trust it.'”
To its credit, Amazon quickly came clean and confirmed that this happened without the kind of idiotic denials and subsequent tap dancing you might normally see from a company in 2018. In a statement, the company indicated that the leak was an “extremely rare occurrence” where Alexa repeatedly seemed to misunderstand random words as commands:
“Echo woke up due to a word in background conversation sounding like “Alexa.” Then, the subsequent conversation was heard as a “send message” request. At which point, Alexa said out loud “To whom?” At which point, the background conversation was interpreted as a name in the customers contact list. Alexa then asked out loud, “[contact name], right?” Alexa then interpreted background conversation as “right.” As unlikely as this string of events is, we are evaluating options to make this case even less likely.”
This really does seem to be a rare occurrence where the unit simply misinterpreted what was said, and the owners either ignored (or couldn’t hear) the unit repeatedly asking for confirmation. That said, nothing about this story is going to reassure those justly paranoid about the potential here for abuse, especially in a country where meaningful punishment for massive privacy violations is often nonexistent (looking at you, Equifax), and existing privacy protections are either being eliminated or have all the teeth of modestly-damp cardboard.