Facebook has reversed its policy banning posts suggesting Covid-19 emerged from a laboratory amid renewed debate over the origins of the virus, raising fresh questions about social media's role in policing misinformation.
The latest move by Facebook, announced late Wednesday on its website, highlights the challenge for the world's largest social network of rooting out false and potentially harmful content while remaining open for discourse.
"In light of ongoing investigations into the origin of Covid-19 and in consultation with public health experts, we will no longer remove the claim that Covid-19 is man-made or manufactured from our apps," the statement said.
"We're continuing to work with health experts to keep pace with the evolving nature of the pandemic and regularly update our policies as new facts and trends emerge."
The new statement updates guidance Facebook issued in February, when it said it would remove false or debunked claims about the novel coronavirus, which has caused a global pandemic that has killed more than three million people.
The move followed President Joe Biden's directive to US intelligence agencies to investigate competing theories on how the virus first emerged -- through animal contact at a market in Wuhan, China, or through accidental release from a research laboratory in the same city.
Biden's order signals an escalation in mounting controversy over the origins of the virus.
The natural origin hypothesis holds that it emerged in bats then passed to humans, likely via an intermediary species.
This theory was widely accepted at the start of the pandemic, but as time has worn on, scientists have not found a virus in either bats or another animal that matches the genetic signature of SARS-CoV-2.
The lab-leak theory, meanwhile, is gaining increasing traction in the United States, where it was initially fueled by former president Donald Trump and his aides and dismissed by many as a political talking point.
A recent Wall Street Journal report, citing US intelligence findings, said three researchers from China's Wuhan Institute of Virology became sick in November 2019, a month before Beijing disclosed the existence of a mysterious pneumonia outbreak.
- Pushback from the right -
Facebook's move, which could impact what some three billion users of its family of apps see, highlights the controversy over social media's aggressive efforts to root out misinformation on topics where facts may be evolving.
The reversal may be "another exhibit for the possibility that there will be a swing back against the more heavy-handed moderation," tweeted Evelyn Douek, a Harvard University lecturer and researcher of online speech regulation.
"When the pandemic started, there were many arguments that 'what platforms are doing for health misinfo, they should do for all misinfo all the time.' It was over-simplified then, and strikes me as untenable now."
Facebook uses independent third-party fact checkers, including AFP, to debunk misinformation. Although the origins of the virus remain unproven, the lab leak theory has been subject to fact-checking.
One fact-checking organization, PolitiFact, reported last September that public health authorities had "repeatedly said the coronavirus was not derived from a lab." Earlier this month it revised that guidance, noting that "that assertion is now more widely disputed" and saying it would continue to review the matter.
The abrupt Facebook reversal prompted angry responses from conservatives and Trump supporters.
"Wow! But they did suppress the story for a year, defaming Trump and Republicans for a 'conspiracy theory' blacklisting conservative press and banning us," tweeted Kelly Sadler, a blogger and former Trump aide.
But Rebekah Tromble, director of the Institute for Data, Democracy & Politics at George Washington University, said Facebook "is doing the right thing" by updating its guidance.
"Information changes over time, and responsible organizations -- social media outlets and fact-checkers alike -- make decisions based on the best information available but remain open and willing to change their evaluations as new information arises," Tromble told AFP.
"Facebook will undoubtedly receive blowback for this decision, as will fact-checkers. But that blowback will come from the same people and groups that have always been critical."
Facebook in a separate statement said it was stepping up its efforts to curb misinformation by limiting the reach of users who "repeatedly" share false content.
Previously, Facebook had taken such action only against individual posts; it will now clamp down on the users who are the biggest spreaders of false content.