Scientists create online game to show risks of AI emotion recognition

Sam Russell, PA

Scientists are inviting people to pull faces at their webcam and smartphone to see in action a controversial technology called artificial intelligence emotion recognition.

Researchers from Cambridge University and UCL have built a website called Emojify to help people understand how computers can scan facial expressions to detect emotion.

Dr Alexa Hagerty, project lead and researcher at Cambridge’s Leverhulme Centre for the Future of Intelligence, said the technology, which is already used in parts of the world, is “powerful” but “flawed”.

Visitors to the website can play a game, pulling faces at their device’s camera to try to get the emotion recognition system to recognise six emotions – happiness, sadness, fear, surprise, disgust and anger.

Dr Alexa Hagerty plays a game on the Emojify website, which demonstrates AI emotion recognition technology. (Cambridge University/PA)

They can also answer a series of optional questions to assist research, including whether they have experienced the technology before and if they think it is useful or concerning.

AI emotion recognition technology is in use across a variety of sectors in China, including in police interrogation and to monitor behaviour in schools.

Other potential uses include in border control, assessing candidates during job interviews and for businesses to collect customer insights.

The researchers say they hope to start conversations about the technology and its social impacts.

Dr Hagerty said: “Many people are surprised to learn that emotion recognition technology exists and is already in use.

“Our project gives people a chance to experience these systems for themselves and get a better idea of how powerful they are, but also how flawed.”

The researchers say that they hope to start conversations about the technology and its social impacts. (Cambridge University/PA)

Dr Igor Rubinov of Dovetail Labs, a consultancy specialising in technology ethics, directed the design of the interactive research website. He said: “We want people to interact with an emotion recognition system and see how AI scans their faces and what it might get wrong.”

Juweek Adolphe, head designer, said: “It is meant to be fun but also to make you think about the stakes of this technology.”

Dr Hagerty said the technology has “worrying potential for discrimination and surveillance”.

She went on: “The science behind emotion recognition is shaky.

“It assumes that our facial expressions perfectly mirror our inner feelings.

“If you’ve ever faked a smile, you know that it isn’t always the case.”

Dr Alexandra Albert, of the Extreme Citizen Science (ExCiteS) research group at UCL, said a “more democratic approach” is needed to determine how the technology is used.

“There hasn’t been real public input or deliberation about these technologies,” she said.

“They scan your face, but it is tech companies who make the decisions about how they are used.”

The researchers said their website does not collect or save images or data from the emotion recognition system.

The optional responses to the questions will be used as part of an academic paper on citizen science approaches to better understanding the societal implications of emotion recognition.

To try the artificial intelligence emotion recognition technology, visit https://emojify.info/