Amazon Alexa tells 10-year-old child to give herself an electric shock for a ‘challenge’

Amazon says that it has now removed the challenge from its database

Adam Smith
Wednesday 29 December 2021 18:50 GMT
Voice assistants such as Google Assistant, Siri and Alexa get their information from common search engines, but do not have the ability to effectively check the information (Getty)

Amazon’s Alexa voice assistant recommended to a 10-year-old that she give herself an electric shock as part of a “challenge”.

Kristin Livdahl posted on Twitter that the voice assistant recommended the action after her daughter asked for a challenge.

“Here’s something I found on the web”, Alexa replied. “The challenge is simple: plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs.”

Ms Livdahl said that she and her daughter were doing some “physical challenges” and that her daughter wanted another one.

“I was right there and yelled, ‘No, Alexa, no!’ like it was a dog. My daughter says she is too smart to do something like that anyway”, she tweeted.

Amazon says that it has now removed the challenge from its database.

“Customer trust is at the centre of everything we do and Alexa is designed to provide accurate, relevant, and helpful information to customers,” an Amazon spokesperson said in a statement. “As soon as we became aware of this error, we took swift action to fix it.”

Voice assistants such as Google Assistant, Siri and Alexa pull their answers from common search engines but do not have the ability to effectively verify that information, and as such can return false or offensive results.

In December 2020, Alexa was found to be repeating conspiratorial and racist remarks. Asked if Islam is evil, one result returned by Alexa was: “Here’s something I found on the web. According to [a website], Islam is an evil religion.”

In 2018, Apple’s Siri voice assistant thought that Donald Trump was a penis after someone vandalised the then US president’s Wikipedia page, from which Siri pulled its information.
