Amazon Alexa tells 10-year-old child to give herself an electric shock for a ‘challenge’

Voice assistants such as Google Assistant, Siri and Alexa get their information from common search engines, but do not have the ability to effectively check the information (Getty)

Amazon’s Alexa voice assistant recommended to a 10-year-old that she give herself an electric shock as part of a “challenge”.

Kristin Livdahl posted on Twitter that the voice assistant recommended the action after her daughter asked for a challenge.

“Here’s something I found on the web”, Alexa replied. “The challenge is simple: plug in a phone charger about halfway into a wall outlet, then touch a penny to the exposed prongs.”

Ms Livdahl said that she and her daughter were doing some “physical challenges” and that her daughter wanted another one.


“I was right there and yelled, ‘No, Alexa, no!’ like it was a dog. My daughter says she is too smart to do something like that anyway”, she tweeted.

Amazon says that it has since removed the challenge from its database.

“Customer trust is at the centre of everything we do and Alexa is designed to provide accurate, relevant, and helpful information to customers,” an Amazon spokesperson said in a statement. “As soon as we became aware of this error, we took swift action to fix it.”

Voice assistants such as Google Assistant, Siri and Alexa pull their answers from common search engines but cannot effectively verify that information, and as such can return false or offensive results.

In December 2020, Alexa was found to be repeating conspiratorial and racist remarks. Asked if Islam is evil, one result returned by Alexa was: “Here’s something I found on the web. According to [a website], Islam is an evil religion.”

In 2018, Apple’s Siri voice assistant returned an image of a penis in response to queries about Donald Trump, after someone vandalised the then US president’s Wikipedia page and Siri pulled its information from there.