'Unlikely' string of events sees Amazon Alexa go rogue

An American woman says she feels "invaded" after an Amazon Alexa device recorded a private conversation and sent it to a random contact without being asked to do so.

US news outlet KIRO 7 reported that a woman, identified only as Danielle from Portland, Oregon, had been unaware of what happened until she received a phone call from her husband's employee.

The employee said that Alexa, Amazon's popular voice assistant, had recorded the family's conversation and sent it to him.

Luckily, the conversation was not too personal - it was about hardwood floors.

Nonetheless, Danielle said she felt "invaded".


She added: "Immediately I said: 'I'm never plugging that device in again, because I can't trust it'."

Amazon confirmed the woman's conversation had been inadvertently recorded and sent, blaming an "unlikely" string of events for the error.

Alexa starts recording after hearing its name or another "wake word" chosen by users, meaning that even having a TV switched on can result in the device being activated.

Amazon said this was what happened to Danielle, adding: "The subsequent conversation was heard as a 'send message' request.

"At which point, Alexa said out loud 'To whom?' At which point, the background conversation was interpreted as a name in the customer's contact list.

"We are evaluating options to make this case even less likely."

Amazon wants Alexa to become a popular home accessory, used for everything from dimming the lights to ordering a pizza, but to achieve this, it must be able to assure users of the device's security.

Fears were raised after US researchers found in 2016 that sounds unintelligible to humans could trigger voice assistants.

According to The New York Times, the group showed that they could hide commands in white noise played over loudspeakers and through YouTube videos to get smart devices to turn on flight mode or open a website.

In May, some of those researchers went further, saying they could embed commands directly into recordings of music or spoken text.

This would mean that, while a human listener hears an orchestra, the voice assistant might hear an instruction to add something to your shopping list.
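The core idea behind such attacks is that a perturbation can be kept quiet enough to be imperceptible to a listener while still shifting a recognition model's output. The sketch below, using only NumPy, shows that amplitude constraint in isolation; it is not the researchers' method, and a real attack would optimise the perturbation against a specific speech-recognition model rather than use random noise.

```python
# Conceptual sketch of the adversarial-audio constraint described above.
# The "music" signal and the amplitude bound are invented for illustration;
# a genuine attack optimises the perturbation against a target model.
import numpy as np

rng = np.random.default_rng(0)
sample_rate = 16_000
t = np.arange(sample_rate) / sample_rate
music = np.sin(2 * np.pi * 440 * t)      # one second of a 440 Hz tone

# A tiny perturbation, bounded so it stays far below the music's volume.
epsilon = 0.005                           # max amplitude of the hidden signal
perturbation = np.clip(rng.normal(0, epsilon, music.shape), -epsilon, epsilon)

adversarial = music + perturbation

# To a listener the two signals are effectively identical...
print("peak difference:", np.max(np.abs(adversarial - music)))  # <= 0.005

# ...but a model's decision can sit close to a boundary, so a small,
# carefully optimised perturbation (unlike this random noise) can flip
# its transcription to an attacker-chosen command.
```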