Facebook asks if you know someone ‘becoming an extremist’ in new prompt test

Facebook Antitrust Lawsuits (Copyright 2020 The Associated Press. All rights reserved)

Facebook is testing a prompt that asks users whether they are “concerned that someone you know is becoming an extremist”.

The new message says: “We care about preventing extremism on Facebook. Others in your situation have received confidential support.

“Hear stories and get advice from people who escaped violent extremist groups”. Underneath that message is a blue “Get Support” button.

Another version of the message reads: "Violent groups try to manipulate your anger and disappointment. You can take action now to protect yourself and others."

Speaking to CNN, Facebook said the prompt is part of a test the social media company is running under its Redirect Initiative, which is aimed at fighting extremism.


"This test is part of our larger work to assess ways to provide resources and support to people on Facebook who may have engaged with or were exposed to extremist content, or may know someone who is at risk,"Facebook spokesperson Andy Stone said.

"We are partnering with NGOs and academic experts in this space and hope to have more to share in the future." Facebook shared the same statement with The Independent but attributed it to an unnamed “Facebook company spokesperson”.

Facebook has often been criticised over claims of facilitating extremism on its platforms. A report by Avaaz, a nonprofit advocacy group that says it seeks to protect democracies from misinformation, claimed that Facebook allowed groups to glorify violence during the 2020 election and in the weeks leading up to the Capitol Hill insurrection attempt on 6 January.

Facebook’s algorithm also exacerbated divisiveness, according to leaked research from inside the social media company, as reported by the Wall Street Journal. Facebook reportedly ended research into making the platform less polarising over fears that any changes would unfairly target right-wing users. “Our recommendation systems grow the problem,” one presentation said.

In response to that report, Facebook published a blog post saying that the newspaper "wilfully ignored critical facts that undermined its narrative". Those facts, the company says, include changes to the News Feed, limits on the reach of Pages and Groups that breach Facebook’s standards or share fake news, efforts to combat hate speech and misinformation, and "building a robust Integrity Team."
