Amazon Echo recorded household audio, sent it to random contact
An Oregon family's Amazon Echo recorded household audio and sent it to one of the husband's employees, an incident Amazon blamed on a rare bug that it intends to fix. [Updated with additional Amazon explanation]

The employee called the family two weeks ago to inform them of what had happened and to warn them to unplug their Alexa devices, according to KIRO 7. They did so, and while the husband initially disbelieved the story, the employee was able to share details from the audio files, such as a conversation about hardwood floors.
The wife in the family, Danielle, called Amazon multiple times. An Alexa engineer investigated, confirming the situation through logs without detailing how the incident might have happened. The company did offer to de-provision the family's Alexa communications so they could continue to control smart-home devices, but Danielle said she has been fighting with representatives to secure a refund.
"Amazon takes privacy very seriously," the company said in a statement to KIRO. "We investigated what happened and determined this was an extremely rare occurrence. We are taking steps to avoid this from happening in the future."
Critics of smart speakers from Apple, Amazon, Google, and others have worried that the devices could be used as spy tools, whether by corporations, governments, or independent hackers. By design, the speakers must communicate with remote servers to interpret voice commands, performing only a basic amount of processing locally.
No incidents have been reported with the HomePod so far, but the product only launched on Feb. 9, and as few as 600,000 units may have been sold in the March quarter.
Update: An Amazon spokesperson contacted AppleInsider to provide the following statement:
"Echo woke up due to a word in background conversation sounding like 'Alexa.' Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right.' As unlikely as this string of events is, we are evaluating options to make this case even less likely."
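The chain of coincidences Amazon describes can be pictured as a tiny state machine in which each stage advances on a single misheard phrase. A minimal, hypothetical sketch (the function, matching rules, and phrases are illustrative only, not Alexa's actual pipeline):

```python
# Illustrative model of the wake -> intent -> contact -> confirm chain.
# Each step needs only one loose phrase match for the flow to advance.

def message_flow(heard_phrases):
    """Return True if the chain of phrases would have sent a message."""
    steps = [
        lambda p: "alexa" in p,          # 1. wake word (or something like it)
        lambda p: "send message" in p,   # 2. heard as a messaging intent
        lambda p: len(p) > 0,            # 3. any speech matched to a contact
        lambda p: "right" in p,          # 4. confirmation heard as "right"
    ]
    state = 0
    for phrase in heard_phrases:
        if state < len(steps) and steps[state](phrase.lower()):
            state += 1
    return state == len(steps)

# Four coincidental matches in a row are enough to "send" the message:
background = ["...Alexa...", "...send message...", "...the flooring guy...", "...right..."]
print(message_flow(background))  # True
```

The point of the sketch is that no single step is implausible; it is the conjunction of four loose matches, each confirmed only by ambient speech, that produced the failure.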

Comments
Hypocrites everywhere.
Unlike Amazon, which is the ... <checks notes> ... umm, second largest company in the world.
It would be embarrassing if the conversations were coming from the bedroom... ouch, those sounds?
I have noticed occasions where the Echo, and to a much lesser degree the HomePod, will randomly "respond" despite the lack of a trigger phrase. The responses are typically along the lines of "I don't understand what you're asking," with no subject. I attribute some of these false detections to high-amplitude impulsive noise sources (very short-duration noise spikes) like something being bumped or dropped, a door closing, or a helicopter transiting overhead. Impulsive time-domain sources generate a broadband frequency-domain response, so the detection processing is probably flooded with enough data in some of these impulse cases to trigger the low-confidence branch of the algorithm that makes the device ask for clarification.
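The commenter's claim that an impulse is broadband can be sketched numerically: a single-sample spike spreads its energy evenly across every frequency bin, while a tone concentrates it in one. A minimal NumPy illustration (signal length and tone frequency are arbitrary choices, not anything specific to these devices):

```python
import numpy as np

n = 1024

# A single-sample impulse, a crude stand-in for a door slam or bump.
impulse = np.zeros(n)
impulse[0] = 1.0

# A narrowband tone of the same length, for contrast (50 cycles over n samples).
t = np.arange(n)
tone = np.sin(2 * np.pi * 50 * t / n)

# Magnitude spectra via the real-input FFT.
imp_spec = np.abs(np.fft.rfft(impulse))
tone_spec = np.abs(np.fft.rfft(tone))

# The impulse's spectrum is perfectly flat: every bin gets equal energy.
print(imp_spec.max() / imp_spec.min())  # 1.0

# The tone's energy sits almost entirely in one bin.
print(int(np.argmax(tone_spec)))  # 50
```

A wake-word detector looking for energy in speech-band frequencies therefore sees *some* energy everywhere during an impulse, which is consistent with the low-confidence "I don't understand" responses described above.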
On at least two different occasions, I came home to them quietly playing songs from the 1950s - a genre I never listen to.
The other night, I was in bed when I heard Alexa mumbling something in another room for at least 10-15 seconds, even though the house was completely quiet. By the time I realized what was happening, it had stopped. I have no idea what she was talking about. She might have been trying to have a conversation with Siri!