From the Department of Adding Insult to Injury: Amazon workers listen in on recordings made by the consumers who forked over big bucks for the honor and privilege of owning “smart” convenience devices that, by the way, are designed to let the manufacturer spy on you.
Bloomberg broke the story on April 10, 2019, reporting that thousands of employees around the world help improve “the Alexa digital assistant powering its line of Echo speakers” by listening to voice recordings “captured in Echo owners’ homes and offices.”
The online retailer’s objective is to improve its signature product’s recognition algorithm that uses educated guesswork and conversational context to produce accurate and meaningful search results. For example, if you commanded Alexa to find “Greek nearby,” the device has to make a decision: are you looking for food, a place to worship, or a fraternity?
Humans help Echo/Alexa produce best-possible results by giving the machine’s software feedback on its answers.
According to Amazon, the workers transcribe and annotate the recordings before entering them back into the software “as part of an effort to eliminate gaps in Alexa’s understanding of human speech and help it better respond to commands.”
Little is known about this international assortment of permanent and contract personnel, all of whom have signed nondisclosure agreements that prevent them from talking about what they are really doing.
We do know, thanks to two Amazon workers in Bucharest, Romania, that they labor nine hours a day, with each reviewer processing up to 1,000 audio clips per shift. The modern Amazon facility occupies the top three floors of the Globalworth Building in the trendy Pipera district; no exterior signage blazons the fact that this is an Amazon office.
One Amazon employee in Boston said the work is, for the most part, routine, saying “he mined accumulated voice data for specific utterances such as ‘Taylor Swift’ and annotated them to indicate the searcher meant the musical artist. Occasionally the listeners pick up things Echo owners likely would rather stay private: a woman singing badly off key in the shower, say, or a child screaming for help. The teams use internal chat rooms to share files when they need help parsing a muddled word—or come across an amusing recording.”
If workers overhear a conversation with possible criminal content, corporate policy is one of non-interference. Two workers said they heard what they believed was a sexual assault but could do little more than “share the experience in the internal chat room as a way of relieving stress.”
Evidently, the allure of having a “smart home” rigged with Amazon (or a competitor’s) utilities that regulate the garage door, residential heating/cooling system, and entertainment systems outweighs the blatant erosion of citizens’ rights to privacy and protection against warrantless search and seizure.
A growing number of people want to enjoy Alexa’s smartness without it tattling every shred of personal conversation recorded during “wake” mode, which is invoked after a keyword such as “Alexa” is detected. (When Alexa detects the wake word, the light ring at the top of the Echo turns blue, indicating the device is recording and transmitting a computer command to Amazon servers.)
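The wake-word gating described above can be sketched in a few lines. This is a hypothetical illustration only, assuming a stream of already-transcribed audio snippets; the names (`WAKE_WORD`, `process_frames`) are made up for this sketch and are not Amazon’s actual code, which operates on raw audio with on-device keyword-spotting models.

```python
# Hypothetical sketch of wake-word gating: the device listens continuously,
# but nothing is transmitted until the keyword is detected locally.
WAKE_WORD = "alexa"

def process_frames(frames):
    """Return only the snippets that follow a detected wake word.

    `frames` stands in for short transcribed chunks of audio. Everything
    heard before the wake word is discarded on the device; everything
    after it is collected for transmission to cloud servers.
    """
    transmitted = []
    awake = False
    for frame in frames:
        if not awake:
            # Local keyword spotting: no audio leaves the device yet.
            if WAKE_WORD in frame.lower():
                awake = True  # light ring turns blue; start transmitting
        else:
            transmitted.append(frame)  # sent upstream for parsing
    return transmitted

print(process_frames(["idle chatter", "Alexa", "find Greek nearby"]))
# → ['find Greek nearby']
```

The privacy controversy stems from false positives in the first step: if the keyword spotter mistakenly “hears” the wake word, everything after it is recorded and shipped upstream anyway.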
Online sites such as Tom’s Guide have published step-by-step instructions for disabling this data sharing so that those linguistic analysts can’t eavesdrop on your private conversations.
In five easy steps, you, too, can opt out of the factory default setting that allows Amazon employees access to all your audio recordings:
- Open the Alexa app and tap “Settings.”
- Tap “Alexa Account” (top of the list).
- Select “Alexa Privacy” (bottom of the list).
- Select “Manage How Your Data Improves Alexa” (see where we’re going with this?).
- Toggle both “Help Develop New Features” and “Use Messages to Improve Transcriptions” to Off.
According to Tom’s Guide, “Alexa will no longer learn and improve from your responses, but your recordings will be safe and sound.”
Most Amazon Echo/Alexa users have no idea these settings even exist. As a note of caution, the Alexa app claims that turning off the data-sharing setting may prevent new features from working properly.
In addition, even though disabling the data-sharing setting will stop your recordings from being transmitted to Amazon for analysis, the online retail leader has indicated that customers who “opt out of that program might still have their recordings analyzed by hand over the regular course of the review process.”
Amazon failed to specify what hand analysis consists of, but we do know that devices equipped with Alexa have been known to produce false-positive wake states. More concerning, Bloomberg reported that snippets of things said to Alexa are accompanied by a user’s first name and account number, as well as the smart speaker’s serial number.
Frankly, Amazon would be wise to tighten up its security protocols or risk suffering the same fate as Facebook, which continues to reel financially from multiple privacy-violation lawsuits in the wake of Cambridge Analytica and other third-party data theft scandals.