
Amazon Alexa tells 10-year-old child to give herself an electric shock for a ‘challenge’

Voice assistants such as Google Assistant, Siri and Alexa get their information from common search engines, but do not have the ability to effectively check the information (Getty)

Amazon’s Alexa voice assistant recommended to a 10-year-old that she give herself an electric shock as part of a “challenge”.

Kristin Livdahl posted on Twitter that the voice assistant recommended the action after her daughter asked for a challenge.

“Here’s something I found on the web”, Alexa replied. “The challenge is simple: plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs.”

Ms Livdahl said that she and her daughter were doing some “physical challenges” and that her daughter wanted another one.

“I was right there and yelled, ‘No, Alexa, no!’ like it was a dog. My daughter says she is too smart to do something like that anyway”, she tweeted.

Amazon says that it has now removed the challenge from its database.

“Customer trust is at the centre of everything we do and Alexa is designed to provide accurate, relevant, and helpful information to customers,” an Amazon spokesperson said in a statement. “As soon as we became aware of this error, we took swift action to fix it.”

Voice assistants such as Google Assistant, Siri and Alexa get their information from common search engines, but do not have the ability to effectively check the information – and as such can provide false or offensive results.

In December 2020, Alexa was found to be repeating conspiratorial and racist remarks. Asked if Islam is evil, one result returned by Alexa was: “Here’s something I found on the web. According to [a website], Islam is an evil religion.”

In 2018, Apple’s Siri voice assistant described Donald Trump as a penis after someone vandalised the then US president’s Wikipedia page, from which Siri pulled its answer.
