
Facebook asks if you know someone ‘becoming an extremist’ in new prompt test

Facebook Antitrust Lawsuits (Copyright 2020 The Associated Press. All rights reserved)

Facebook is testing a prompt that asks users whether they are “concerned that someone you know is becoming an extremist”.

The new message says: “We care about preventing extremism on Facebook. Others in your situation have received confidential support.

“Hear stories and get advice from people who escaped violent extremist groups”. Underneath that message is a blue “Get Support” button.

Another version of the message reads: “Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others.”

Speaking to CNN, Facebook said that this is part of a test the social media company is running as part of its Redirect Initiative, aimed at fighting extremism.

“This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,” Facebook spokesperson Andy Stone said.

“We are partnering with NGOs and academic experts in this space and hope to have more to share in the future.” Facebook shared the same statement with The Independent but attributed it to an unnamed “Facebook company spokesperson”.

Facebook has often been criticised over claims of facilitating extremism on its platforms. A report by Avaaz, a nonprofit advocacy group that says it seeks to protect democracies from misinformation, claimed that Facebook allowed groups to glorify violence during the 2020 election and in the weeks leading up to the Capitol Hill insurrection attempt on 6 January.

Facebook’s algorithm also exacerbated divisiveness, according to leaked research from inside the social media company, as reported by the Wall Street Journal. Facebook reportedly ended research into stopping the platform being so polarising for fears that it would unfairly target right-wing users. “Our recommendation systems grow the problem,” one presentation said.

In response to that report, Facebook published a blog post saying that the newspaper “wilfully ignored critical facts that undermined its narrative”, which, the company says, includes changes to the News Feed, limiting the reach of Pages and Groups that breach Facebook’s standards or share fake news, combating hate speech and misinformation, and “building a robust Integrity Team”.

