Amazon Echo sends long recording of couple's private conversation to random person

'Unplug your Alexa devices right now. You're being hacked.'


Andrew Griffin
Thursday 24 May 2018 22:23 BST
The Amazon Echo, a voice-controlled virtual assistant, is seen at its product launch for Britain and Germany in London, Britain

A family had a private conversation recorded by their Amazon Echo and sent over the internet to a friend.

The horrifying discovery began with an equally chilling phone call, telling them to unplug their devices immediately. "You're being hacked," the voice said.

As it turned out, that person had accidentally been sent a full recording of the family's conversation. The Echo had picked up the audio and sent it to a friend, who could then hear everything that had been said.

The family initially didn't believe that the message could have been sent. Then the person who received it said, "You sat there talking about hardwood floors", and the family realised that they had accidentally been broadcast across the internet.

A woman identified only as Danielle said that she would not be using the devices again. "I can't trust it," she told local news station KIRO 7, which first reported the story.

Amazon said that the chilling recording was the result of an entirely accidental and unlucky series of events. The Alexa voice assistant inside the Echo has a series of checks to ensure that messages are not sent without the explicit consent of the people standing around the talking cylinder – but somehow Alexa mistakenly concluded that those checks had been satisfied.

"Echo woke up due to a word in background conversation sounding like 'Alexa'," an Amazon spokesperson said. "Then, the subsequent conversation was heard as a 'send message' request. At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list. Alexa then asked out loud, '[contact name], right?' Alexa then interpreted background conversation as 'right'. As unlikely as this string of events is, we are evaluating options to make this case even less likely."

Amazon told the family that it had investigated what happened by listening to logs of the conversations with Alexa. It wasn't clear which Echo device was being used.

Alexa has a series of checks built into it to ensure that it can't listen in on conversations without permission, and certainly can't send them anywhere. Chief among them is the "wake word" built into the Echo – the voice assistant cannot start listening or sending recordings over the internet until it hears the word "Alexa".

But occasionally those checks can go awry, usually by accident. The fact that the cylinders are inside the house, listening and speaking on command, can make those issues especially creepy.

Earlier this year, for instance, Echo speakers started refusing to fulfil requests and then laughing at their users. The strange behaviour was the result of a misheard command – but that didn't make the bizarre chuckle any less creepy.
