Meta taskforce to combat trade of child sex abuse materials after damning report

Photograph: Peter Byrne/PA

Mark Zuckerberg’s Meta has set up a taskforce to investigate claims that Instagram is hosting the distribution and sale of self-generated child sexual abuse material, with the platform’s algorithms helping advertise illicit content.

The move by the Facebook parent comes after a report from the Stanford Internet Observatory (SIO) that found a web of social media accounts, which appear to be operated by minors, advertising self-generated child sexual abuse material (SG-CSAM).

The SIO said that Instagram is “currently the most important platform for these networks”, with features such as recommendation algorithms and direct messaging connecting buyers and sellers of SG-CSAM.

The SIO said it acted on a tip from the Wall Street Journal, which detailed Instagram’s SG-CSAM problems, along with the SIO’s findings, in an investigation published on Wednesday.

The SIO reported that Instagram has allowed users to search for terms that its own algorithms know could be linked to SG-CSAM, with a pop-up screen warning users that “these results may contain images of child sexual abuse”. The screen gave users the option to “see results anyway”. After being contacted by the Journal, Instagram removed that option.
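
The flow described above amounts to a policy check sitting in front of search results: a flagged query triggers a warning screen, and the click-through option can be switched off entirely. The sketch below is a loose illustration of that logic; the names FLAGGED_TERMS, SearchDecision and evaluate_search are invented for the example, and this is not Instagram's actual implementation.

```python
from dataclasses import dataclass

# Placeholder term list; real systems rely on large, curated lists.
FLAGGED_TERMS = {"example-flagged-term"}


@dataclass
class SearchDecision:
    show_results: bool
    show_warning: bool
    allow_override: bool  # the "see results anyway" option


def evaluate_search(term: str, override_disabled: bool = True) -> SearchDecision:
    """Decide what the search UI should do for a given query term."""
    if term.lower() in FLAGGED_TERMS:
        # Flagged term: never show results directly, show the warning screen,
        # and only offer the click-through if the override has not been disabled.
        return SearchDecision(show_results=False,
                              show_warning=True,
                              allow_override=not override_disabled)
    return SearchDecision(show_results=True, show_warning=False, allow_override=False)


print(evaluate_search("example-flagged-term"))
```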

In a statement, a Meta spokesperson said the company had set up an internal taskforce to deal with the claims in the reports.

“We’re continuously exploring ways to actively defend against this behaviour, and we set up an internal task force to investigate these claims and immediately address them,” said the spokesperson.

The SIO report follows a Guardian investigation in April that revealed how Meta is failing to report or detect the use of Facebook and Instagram for child trafficking. In response to the Guardian’s allegations at the time, a Meta spokesperson said: “The exploitation of children is a horrific crime – we don’t allow it and we work aggressively to fight it on and off our platforms.”

The SIO said its investigation found that large networks of social media accounts are openly advertising self-generated child sexual abuse material. It said Instagram’s popularity and “user-friendly interface” made it a preferred option among platforms.

“The platform’s recommendation algorithms effectively advertise SG-CSAM: these algorithms analyze user behaviours and content consumption to suggest related content and accounts to follow,” said the SIO.
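
The dynamic the SIO describes is, in essence, collaborative filtering: accounts are suggested because users with overlapping engagement also interact with them. The snippet below is a minimal, illustrative sketch of that general technique using invented toy data; it is not Meta's recommendation system.

```python
from collections import Counter

# Toy data: each user maps to the set of accounts they engage with.
engagement = {
    "user_a": {"acct_1", "acct_2"},
    "user_b": {"acct_1", "acct_2", "acct_3"},
    "user_c": {"acct_2", "acct_3", "acct_4"},
}


def suggest_accounts(target_user: str, top_n: int = 3) -> list[str]:
    """Rank unseen accounts by how strongly they co-occur with the target's follows."""
    followed = engagement[target_user]
    scores = Counter()
    for other, accounts in engagement.items():
        if other == target_user:
            continue
        overlap = len(followed & accounts)  # shared interests with the other user
        if overlap == 0:
            continue
        for acct in accounts - followed:
            scores[acct] += overlap  # weight suggestions by the size of the overlap
    return [acct for acct, _ in scores.most_common(top_n)]


print(suggest_accounts("user_a"))  # -> ['acct_3', 'acct_4']
```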

The report said SG-CSAM may initially be shared voluntarily but can then end up being widely distributed in public. It can also overlap with non-consensual intimate imagery, also referred to as “revenge porn”, and minors can be coerced into producing sexual content. The SIO added that in recent years SG-CSAM has increasingly become a commercial venture, including the posting of content “menus” online.

Researchers said they looked at one network in particular, which contained 405 accounts advertising the sale of SG-CSAM on Instagram, as well as 128 seller accounts on Twitter. They said 58 accounts within the Instagram follower network appeared to be content buyers. The accounts were referred to the National Center for Missing and Exploited Children (NCMEC), which processes reports of online child sexual exploitation from US tech platforms. The SIO report said that one month after they were reported to the NCMEC, 31 of the Instagram seller accounts were still active, along with 28 of the likely buyer accounts. On Twitter, 22 of the 128 accounts identified in the report were still active. Twitter has been contacted for comment.

Meta said it had already addressed some of the investigation’s findings, saying in a statement that it had fixed a technical issue that prevented reports of SG-CSAM from reaching content reviewers and had updated its guidance to reviewers on identifying and removing predatory accounts. The Journal reported that Instagram told an anti-paedophile activist that one image of a scantily clad girl with a graphically sexual caption “does not go against our Community Guidelines”, and advised the activist to hide the account in order to avoid seeing its content.

Meta said in its statement it had also removed “thousands” of SG-CSAM-related search terms and hashtags on Instagram after researchers at the SIO found that paedophiles were searching under terms such as #pedobait and variations on #mnsfw (“minor not safe for work”).
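
Removing “variations” of a term is harder than blocking an exact hashtag, because sellers swap characters or pad tags to slip past filters. The snippet below sketches one common approach, normalising a hashtag before checking it against a blocklist; the substitution map, blocklist entries and function name are illustrative assumptions, not Meta's system.

```python
import re

BLOCKED_STEMS = {"exampleblockedterm"}           # placeholder stems, not real terms
SUBSTITUTIONS = str.maketrans("0134$", "oleas")  # common character swaps (0->o, 1->l, ...)


def is_blocked_hashtag(tag: str) -> bool:
    """Normalise a hashtag and check it against the blocklist."""
    cleaned = tag.lstrip("#").lower().translate(SUBSTITUTIONS)
    cleaned = re.sub(r"[^a-z]", "", cleaned)  # drop remaining digits/punctuation padding
    return any(stem in cleaned for stem in BLOCKED_STEMS)


print(is_blocked_hashtag("#ExampleBl0ckedTerm2024"))  # True
```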

Meta added that it had dismantled 27 abusive networks between 2020 and 2022, and that in January this year it disabled more than 490,000 accounts for violating its child safety policies.

The SIO report said industry-wide action was needed to tackle the problem.