Meta asks if it can let people post coronavirus misinformation on Facebook and Instagram

Meta has asked its oversight board whether its measures against coronavirus misinformation should stay in place.

The company, which owns Facebook, Instagram, and WhatsApp, initially removed misinformation only when local partners with relevant expertise told it that a particular piece of content (such as a specific post on Facebook) could contribute to a risk of imminent physical harm.

Eventually, its policies were expanded to remove entire categories of false claims on a worldwide scale.

Now, however, the company has asked the board - which has 20 members including politicians, lawyers, and academics and is funded by a $130m trust from the social media giant - whether it should “address this misinformation through other means, like labeling or demoting it either directly or through our third-party fact-checking program.”

In practice, Meta’s policy of removing content has had mixed results, and its effectiveness has been questioned.

Researchers running experiments on the platform found that two brand-new accounts they had set up were recommended 109 pages containing anti-vaccine information in just two days.

Now, however, Meta’s president of global affairs and former UK deputy prime minister Nick Clegg says that “life is increasingly returning to normal” in some countries.

“This isn’t the case everywhere and the course of the pandemic will continue to vary significantly around the globe — especially in countries with low vaccination rates and less developed healthcare systems. It is important that any policy Meta implements be appropriate for the full range of circumstances countries find themselves in.”

Meta is asking for guidance because “resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic”, he wrote.

During the pandemic, Meta’s head of virtual reality Andrew Bosworth said that “individual humans are the ones who choose to believe or not believe a thing. They are the ones who choose to share or not share a thing,” adding that he did not “feel comfortable at all saying they don’t have a voice because I don’t like what they said.”

He went on: “If your democracy can’t tolerate the speech of people, I’m not sure what kind of democracy it is. [Facebook is] a fundamentally democratic technology”.

A study conducted by the non-profit Center for Countering Digital Hate and Anti-Vax Watch suggested that close to 65 per cent of the vaccine-related misinformation on Facebook was coming from just 12 people. Researchers also said that recommendation algorithms were at the heart of the problem: they are still generally designed to boost content that engages the most people, regardless of what it is, even conspiracy theories.

“For a long time the companies tolerated that because they were like, ‘Who cares if the Earth is flat, who cares if you believe in chemtrails?’ It seemed harmless,” said Hany Farid, a misinformation researcher and professor at the University of California at Berkeley.

“The problem with these conspiracy theories that maybe seemed goofy and harmless is they have led to a general mistrust of governments, institutions, scientists and media, and that has set the stage for what we are seeing now.”

In a statement, the Center for Countering Digital Hate said that Meta’s request to its oversight board was “designed to distract from Meta’s failure to act on a flood of anti-vaccine conspiracy theories spread by opportunistic liars” during the coronavirus pandemic.

“CCDH’s research, as well as Meta’s own internal analysis, shows that the majority of anti-vaccine misinformation originates from a tiny number of highly prolific bad actors. But Meta has failed to act on key figures who are still reaching millions of followers on Facebook and Instagram”, Callum Hood, head of research at the CCDH, said.

“Platforms like Meta should not have absolute power over life-and-death issues like this that affect billions of people. It’s time people in the UK and elsewhere are given democratic oversight of life-changing decisions made thousands of miles away in Silicon Valley.”