Amazon last week confirmed that it keeps transcripts of interactions with Alexa, even after users have deleted the voice recordings.
Prompted by reports that Amazon retains text records of what users ask Alexa, Sen. Chris Coons sent a letter to CEO Jeff Bezos in May demanding answers.
The Democrat from Delaware asked Amazon to provide answers on how long it stores transcripts, whether users can delete them, why it collects them and how they are used, and whether the company anonymizes customer identity.
“While I am encouraged that Amazon allows users to delete audio recordings linked to their accounts,” Coons, a member of the judiciary committee, wrote in his letter, “I am very concerned by reports that suggest that text transcriptions of these audio records are preserved indefinitely on Amazon’s servers, and users are not given the option to delete these text transcripts.”
In response, Amazon Vice President of Public Policy Brian Huseman revealed that the tech titan keeps transcripts and voice recordings indefinitely, removing them only when “the customer chooses to delete them.”
“When a customer deletes a voice recording, we delete the transcripts associated with the customer’s account of both … the customer’s request and Alexa’s response,” Huseman wrote in a June 28 memo. “We already delete those transcripts from all of Alexa’s primary storage systems, and we have an ongoing effort to ensure those transcripts do not remain in any of Alexa’s other storage systems.
“We do not store the audio of Alexa’s response,” he continued. “However, we may still retain other records of customers’ Alexa interactions, including records of actions Alexa took in response to the customer’s request.”
Transcripts are stored, according to Amazon, to help improve the digital assistant and “provide transparency” about what Alexa thought she heard and how she responded.
If she makes a mistake—like setting a timer for one minute instead of the four I asked for—the Voice History feature allows users to see and hear how Alexa may have misunderstood a word or phrase.
“Providing customers with the transcript also allows [them] to understand and inspect exactly what Alexa is, and is not, recording,” Huseman added.
Interacting with an Alexa skill, meanwhile, is a whole new ballgame, in which the skill developer may retain records of interactions.
As for other types of requests—recurring alarms, annual reminders, calendar events, texts—customers “would not want or expect deletion of the voice recording to delete the underlying data or prevent Alexa from performing the requested task,” the letter said.
For Coons, however, these answers did little to inspire confidence in the company.
“Amazon’s response leaves open the possibility that transcripts of user voice interactions with Alexa are not deleted from all of Amazon’s servers, even after a user has deleted a recording of his or her voice,” the lawmaker told CNET. “What’s more, the extent to which this data is shared with third parties, and how those third parties use and control that information, is still unclear.”
California lawmakers earlier this year introduced a bill that would limit how manufacturers of smart speakers and digital assistants collect recordings.
The Anti-Eavesdropping Act prohibits saving, storing, or sharing of audio recordings without explicit consent from the user.
Should the legislation pass, it would severely handicap services like Amazon's Alexa, Apple's Siri, and Google Assistant (among a handful of others).
The Golden State is not the first to try to limit digital eavesdropping: Illinois previously attempted to pass a similar law, which Google and Amazon lobbied against.
Bloomberg recently shocked no one with the news that thousands of Amazon employees around the world listen in on Echo owners’ conversations (allegedly to improve the technology).
More troubling is the fact that workers can also access location information of Alexa users and look up their home or office address.