Why Thousands of Amazon Workers Are Listening to Your Alexa Exchanges
"We take the security and privacy of our customers' personal information seriously," Amazon tells PEOPLE of its efforts to improve Alexa
Alexa might not be the only one listening when you ask her to play your favorite song.
According to a report Bloomberg published this week, Amazon uses an army of workers to listen to audio clips of customers who use Alexa, its artificial intelligence-powered digital assistant found in the popular line of Echo smart speakers and other devices.
The report details how Amazon workers in the U.S., Costa Rica and Romania listen to as many as a thousand voice recordings during nine-hour shifts, transcribing and annotating the audio clips before feeding the information back into Alexa's software.
This, the company tells PEOPLE, is part of an effort to improve Alexa’s intelligence and ability to recognize human speech.
“We take the security and privacy of our customers’ personal information seriously. We only annotate an extremely small number of interactions from a random set of customers in order to improve the customer experience,” Amazon says in a statement. “For example, this information helps us train our speech recognition and natural language understanding systems, so Alexa can better understand your requests, and ensure the service works well for everyone.”
Many of the audio clips Amazon's employees come across are "mundane," Bloomberg reported, but some recordings appear to capture something "upsetting" or even "criminal."
Two workers reportedly said they had even listened to what they believed was a sexual assault. According to the Bloomberg report, "Amazon says it has procedures in place for workers to follow when they hear something distressing, but two Romania-based employees said that, after requesting guidance for such cases, they were told it wasn't Amazon's job to interfere."
The company says it has "strict technical and operational safeguards" to protect privacy and a "zero tolerance policy" for abuse of the system by employees.
"Employees do not have direct access to information that can identify the person or account as part of this workflow," the company continued in its statement. "While all information is treated with high confidentiality and we use multi-factor authentication to restrict access, service encryption, and audits of our control environment to protect it, customers can delete their voice recordings associated with their account at any time."
Amazon also explained that, by default, its Echo speakers only begin sending audio to the cloud when they are activated with the press of a button or when the device hears the chosen wake word, such as "Alexa" or "Amazon." The devices don't record all the time, the company said; a device listens only for acoustic patterns that match the wake word, and begins recording once it detects them.
Bloomberg reports that Apple also uses human workers to review Siri audio clips, which are stripped of personally identifiable information. Google does the same for its assistant, but distorts the audio before an employee listens to it.
This isn't the first time Amazon's devices have made headlines over how they handle privacy. In 2018, an Oregon couple said an Echo device recorded a private conversation and sent the audio to a contact 170 miles away without their knowledge.