Facebook rolls out fact-checking operation in UK

People protesting in Westminster last year during the appearance of a Facebook executive at an inquiry into fake news. Photograph: Victoria Jones/PA

Facebook’s fact-checking operation is launching in the UK, with the independent charity Full Fact selected to be the first British publisher to review and rate the accuracy of content on the social network.

Posts, links and videos that have been flagged as false will be marked as such to users, and people will be warned if a post they are about to share has been found to be false, but no one will be stopped from sharing or reading any content, false or not.

However, Facebook’s newsfeed algorithm does intervene to demote false content, ensuring that it reaches fewer people than it would otherwise.


“People don’t want to see false news on Facebook, and nor do we,” said Sarah Brown, a training and news literacy manager for the company. “We’re delighted to be working with an organisation as reputable and respected as Full Fact to tackle this issue. By combining technology with the expertise of our fact-checking partners, we’re working continuously to reduce the spread of misinformation on our platform.”

Since its launch in the US, Facebook’s fact-checking programme has received mixed reviews. It has been praised for trying to tackle the spread of misinformation on the platform, and particularly for its decision to give fact-checkers’ findings real weight in its algorithmic promotion. However, it has been criticised for its unwillingness to pay for fact-checking; the programme relies on users to flag content to third parties, who then check the veracity of the factual claims.

More general concerns have also been raised about the effectiveness of the programme: the worst falsehoods often propagate faster than fact-checkers can keep up, and increasing evidence suggests the labelling aspect of the fact-checking – as opposed to the algorithmic tweaks – can serve to promote, rather than suppress, false claims, as readers take the label as proof of a partisan effort at censorship.

Facebook has also faced criticism over its choice of fact-checking partners in the US, supplementing neutral authorities, such as Snopes and the Associated Press, with partisan outlets, such as the conservative Weekly Standard, leading to conflicts about the truthfulness of fact checks.

In the UK, Full Fact will initially be the sole fact-checking partner. Will Moy, the charity’s director, welcomed Facebook’s decision, saying: “Fact-checking can take hours, days or weeks, so nobody has time to properly check everything they see online. But it’s important somebody’s doing it because online misinformation, at its worst, can seriously damage people’s safety or health.

“There’s no magic pill to instantly cure the problem, but this is a step in the right direction. It will let us give Facebook users the information they need to scrutinise false or misleading stories themselves and hopefully limit their spread – without stopping them sharing anything they want to.”

A study published in the journal Science recently revealed that Facebook users aged over 65 shared more than seven times as much fake news content as users aged between 18 and 29.