Facebook parent company Meta has asked its Oversight Board to advise on whether its current Covid-19 misinformation policy is still appropriate now that it says the pandemic’s status has “evolved”.
Sir Nick Clegg, Meta’s president of global affairs, said the company wanted guidance on whether its broad measures to remove misinformation about the virus, introduced in the early days of the pandemic, were still relevant and proportionate as many places “seek to return to more normal life”.
The Oversight Board was set up in 2020 and is able to make binding decisions about Facebook’s content removal actions and policies, even overruling decisions made by the platform and its executives.
The former Liberal Democrat leader and deputy prime minister said the tighter measures to stop the spread of misinformation were vital earlier in the Covid-19 outbreak, but the social networking firm felt the time was now right to ask whether it “remains the right approach for the months and years ahead”.
“The world has changed considerably since 2020,” Sir Nick wrote in a blog post, adding that a number of countries had high vaccination rates, while online tools and resources to identify and remove misinformation, as well as educate people on its dangers, were now widespread.
But he acknowledged that this was not the case everywhere, and the company now wanted guidance on how to best approach keeping people safe from harmful content while still protecting freedom of expression.
“It is important that any policy Meta implements be appropriate for the full range of circumstances countries find themselves in,” he said.
“Meta is fundamentally committed to free expression and we believe our apps are an important way for people to make their voices heard.
“But some misinformation can lead to an imminent risk of physical harm, and we have a responsibility not to let this content proliferate. The policies in our Community Standards seek to protect free expression while preventing this dangerous content.
“But resolving the inherent tensions between free expression and safety isn’t easy, especially when confronted with unprecedented and fast-moving challenges, as we have been in the pandemic.
“That’s why we are seeking the advice of the Oversight Board in this case. Its guidance will also help us respond to future public health emergencies.”
However, online safety campaigners have accused Meta of trying to deflect from what they said was its failure to prevent large amounts of misinformation from spreading during the pandemic.
Callum Hood, head of research at the Centre for Countering Digital Hate (CCDH), said: “This move is designed to distract from Meta’s failure to act on a flood of anti-vaccine conspiracy theories spread by opportunistic liars during the pandemic – many of whom made millions of dollars by exploiting social media’s massive audience and algorithmic amplification.
“CCDH’s research, as well as Meta’s own internal analysis, shows that the majority of anti-vaccine misinformation originates from a tiny number of highly prolific bad actors.
“But Meta has failed to act on key figures who are still reaching millions of followers on Facebook and Instagram.
“Platforms like Meta should not have absolute power over life-and-death issues like this that affect billions of people. It’s time people in the UK and elsewhere are given democratic oversight of life-changing decisions made thousands of miles away in Silicon Valley.”
Sir Nick said Meta’s policies had helped remove Covid-19 misinformation on an “unprecedented scale”, saying more than 25 million pieces of content had been removed globally since the start of the pandemic.