AI being used to create child abuse imagery, watchdog warns

Thousands of AI-generated images depicting real victims of child sexual abuse threaten to “overwhelm” the internet, a watchdog has warned.

The Internet Watch Foundation (IWF), the UK organisation responsible for detecting and removing child sexual abuse imagery from the internet, said its “worst nightmares” have come true.

The IWF said criminals were now using the faces and bodies of real children who have appeared in confirmed abuse imagery to create new images of sexual abuse through artificial intelligence technology.

According to data published by the organisation, the most convincing imagery would be difficult even for trained analysts to distinguish from actual photographs, and some content is now realistic enough to be treated as real imagery under UK law.


The IWF warned that the technology was only improving and would pose more obstacles for watchdogs and law enforcement agencies to tackle the problem.

The research comes ahead of the UK hosting the AI safety summit next week, where world leaders and tech giants will discuss the developing issues around artificial intelligence.

In its latest research, the IWF said it had also found evidence of the commercialisation of AI-generated imagery, and warned that the technology was being used to “nudify” images of children whose clothed images had been uploaded online for legitimate reasons.

In addition, it said AI imaging technology was being used to create images of celebrities who had been “de-aged” and depicted as children in sexual abuse scenarios.

In a single month, the IWF said it investigated 11,108 AI images which had been shared on a dark web child abuse forum.

Of these, 2,978 were confirmed as images which breached UK law and 2,562 were so realistic it said they would need to be treated the same as if they were real abuse images.

Susie Hargreaves, chief executive of the IWF, said: “Our worst nightmares have come true. Earlier this year, we warned AI imagery could soon become indistinguishable from real pictures of children suffering sexual abuse, and that we could start to see this imagery proliferating in much greater numbers. We have now passed that point.

“Chillingly, we are seeing criminals deliberately training their AI on real victims’ images who have already suffered abuse.

“Children who have been raped in the past are now being incorporated into new scenarios because someone, somewhere, wants to see it.

“As if it is not enough for victims to know their abuse may be being shared in some dark corner of the internet, now they risk being confronted with new images, of themselves being abused in new and horrendous ways not previously imagined.

“This is not a hypothetical situation. We’re seeing this happening now. We’re seeing the numbers rise, and we have seen the sophistication and realism of this imagery reach new levels.

“International collaboration is vital. It is an urgent problem which needs action now. If we don’t get a grip on this threat, this material threatens to overwhelm the internet.”

The IWF said it feared that a deluge of AI-generated content could divert resources from detecting and removing real abuse, and in some instances could lead to missed opportunities to identify and safeguard real children.