Human brain might subconsciously be able to detect deepfakes, study suggests

The human brain may be able to subconsciously detect deepfakes, according to a new study that could lead to the creation of tools for curbing the spread of disinformation.

Deepfakes are videos, images, audio, or text that appear to be authentic, but are computer-generated clones designed to mislead and sway public opinion.

Participants attempted to detect deepfakes while being assessed with electroencephalography (EEG) brain scans, according to the study, published recently in the journal Vision Research.

The brains of these individuals could successfully detect deepfakes about 54 per cent of the time, said the scientists, including those from the University of Sydney in Australia.


However, when an earlier group was asked to verbally identify the same deepfakes, their success rate was only 37 per cent.

“Although the brain accuracy rate in this study is low – 54 per cent – it is statistically reliable. That tells us the brain can spot the difference between deepfakes and authentic images,” said study co-author Thomas Carlson from the University of Sydney.
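The quoted 54 per cent is only meaningful relative to the 50 per cent expected from guessing, which is what "statistically reliable" refers to. A minimal sketch of how such a check is typically done is below, using a one-sided binomial test; the trial count is a hypothetical placeholder, not a figure from the paper.

```python
# Illustrative only: test whether an observed accuracy is reliably above
# the 50% chance level with a one-sided binomial test.
# The trial count below is a hypothetical placeholder, not taken from the study.
from scipy.stats import binomtest

n_trials = 2000          # hypothetical total number of real-vs-fake decisions
accuracy = 0.54          # accuracy reported in the article
n_correct = round(n_trials * accuracy)

result = binomtest(n_correct, n_trials, p=0.5, alternative="greater")
print(f"Observed accuracy: {n_correct / n_trials:.2%}")
print(f"One-sided p-value vs. chance (50%): {result.pvalue:.4f}")
```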

There are now a growing number of deepfake videos online – from non-consensual explicit content to doctored media used in disinformation campaigns by foreign adversaries.

For instance, at the beginning of the Russian invasion of Ukraine, a deepfake video of president Volodymyr Zelensky urging his troops to surrender to Russian forces surfaced on social media.

As scientists across the world attempt to find new ways to identify deepfakes, the researchers behind the new study said their findings could be a springboard in the fight against such doctored content online.

“If we can learn how the brain spots deepfakes, we could use this information to create algorithms to flag potential deepfakes on digital platforms like Facebook and Twitter,” Dr Carlson said.

In the new study, one group of participants was shown 50 images of real and computer-generated fake faces and asked to identify which was which.

A different group of participants was shown the same images while their brain activity was recorded using EEG, without being told that half the images were fakes.

Comparing the two sets of results, the scientists found that people’s brains were better at detecting deepfakes than their eyes.
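The article does not describe how the "brain accuracy" was measured. One common approach in EEG research is to decode the image category from brain activity with a cross-validated classifier; the sketch below illustrates that generic approach on random placeholder data and is not the study's actual analysis pipeline.

```python
# Illustrative sketch of a generic EEG decoding analysis: train a linear
# classifier to separate "real face" vs "deepfake" trials and estimate its
# cross-validated accuracy. The data are random placeholders; this is not
# the pipeline used in the study.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)
n_trials, n_channels, n_timepoints = 100, 64, 128            # hypothetical EEG epochs
X = rng.normal(size=(n_trials, n_channels * n_timepoints))   # flattened epochs
y = rng.integers(0, 2, size=n_trials)                        # 0 = real face, 1 = deepfake

clf = make_pipeline(StandardScaler(), LogisticRegression(max_iter=1000))
scores = cross_val_score(clf, X, y, cv=5)
print(f"Cross-validated decoding accuracy: {scores.mean():.2%}")
```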

But scientists have cautioned that the findings are “just a starting point” and said further validation of the results is needed.

“More research must be done. What gives us hope is that deepfakes are created by computer programs, and these programs leave ‘fingerprints’ that can be detected,” Dr Carlson said.
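One widely reported example of such a computational "fingerprint" is the unusual high-frequency structure that image generators can leave in a picture's Fourier spectrum. The sketch below is a crude illustration of that general idea on a synthetic array; it is not a detector proposed by the study's authors, and the function name and cutoff are assumptions for demonstration only.

```python
# Illustrative sketch of one widely reported "fingerprint" cue: generated
# images often show atypical high-frequency structure in their 2D Fourier
# spectrum. Demonstrated here on a synthetic array, not a real face photo,
# and not a detector proposed by the study's authors.
import numpy as np

def high_frequency_energy(image: np.ndarray, cutoff: float = 0.25) -> float:
    """Fraction of spectral energy outside a central low-frequency region."""
    spectrum = np.abs(np.fft.fftshift(np.fft.fft2(image))) ** 2
    h, w = spectrum.shape
    cy, cx = h // 2, w // 2
    ry, rx = int(h * cutoff), int(w * cutoff)
    low = spectrum[cy - ry:cy + ry, cx - rx:cx + rx].sum()
    return 1.0 - low / spectrum.sum()

# Placeholder standing in for a grayscale 256x256 face image.
image = np.random.default_rng(1).normal(size=(256, 256))
print(f"High-frequency energy fraction: {high_frequency_energy(image):.3f}")
```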