Facebook has defended a system that shields millions of celebrities, politicians and other high profile users from some of the company’s content moderation practices, under a programme called “Cross Check”.
The programme was initially launched several years ago as a “quality control measure for actions taken against high profile accounts”, but now effectively grants immunity to VIP users until complaints can be reviewed by an employee, according to a new investigation by the Wall Street Journal.
While a post by an everyday Facebook user containing harassment, incitement to violence or misinformation would be immediately taken down or invite sanctions if flagged, the Cross Check programme - also known as “X Check” - protects public figures from this stage of moderation.
Among the prominent people “whitelisted” under the programme were the now-suspended account of former US president Donald Trump, former US secretary of state Hillary Clinton, senator Elizabeth Warren and Brazilian footballer Neymar da Silva Santos Jr, the WSJ reported.
Facebook has since responded to the report, with spokesperson Andy Stone reiterating the position taken by the company in a 2018 blog post about Cross Check.
Mr Stone wrote on Twitter that: “‘Cross-check’ simply means that some content from certain Pages or Profiles is given a second layer of review to make sure we’ve applied our policies correctly.
“There aren’t two systems of justice; it’s an attempted safeguard against mistakes,” he said in the series of tweets. “We know our enforcement is not perfect and there are tradeoffs between speed and accuracy.”
Citing instances of Cross Check exemptions, the WSJ report said that in 2019, Neymar had shared WhatsApp messages including the name and nude pictures of a woman who had accused him of rape. In the post, he accused the woman of extorting him.
While Facebook’s policy is to remove “nonconsensual intimate imagery” using artificial intelligence, Neymar’s post remained live for more than a day before it was ultimately taken down, the WSJ report said. An internal review found that it reached more than 56 million Facebook and Instagram users, and that the video was reposted more than 6,000 times.
The report added that while Facebook’s guidelines stipulate deletion of posts containing unauthorised nudes as well as the account of the person who posted them, the review decided “to leave Neymar’s accounts active, a departure from our usual ‘one strike’ profile disable policy”.
While Neymar denied the allegations, the woman was charged with slander, extortion and fraud. The first two charges were dropped and she was acquitted in the fraud case. Neymar’s spokesperson told the WSJ that he adheres to Facebook’s policies, and declined to comment further.
An internal review of the programme slammed the practice of whitelisting. Calling it a “breach of trust”, a 2019 confidential review seen by the WSJ said: “Unlike the rest of our community, these people can violate our standards without any consequences.”
Citing the internal documents, the report said the number of users protected under the programme had grown to 5.8 million by 2020.
Facebook’s post from 2018 said the protection of this programme “typically applies to high profile, regularly visited Pages or pieces of content on Facebook so that they are not mistakenly removed or left up”. The whitelist also covers media organisations, including Channel 4, the BBC and The Verge.
The 2018 statement said that the programme also double-checks content by celebrities, governments, or pages to avoid mistakenly deleting posts aimed at raising awareness of hate speech.
“To be clear, Cross Checking something on Facebook does not protect the profile, Page, or content from being removed. It is simply done to make sure our decision is correct,” Facebook said at that time.
Facebook’s Oversight Board spokesperson John Taylor told AFP that the board shared its unease over the lack of transparency in the programme.
"The Oversight Board has expressed on multiple occasions its concern about the lack of transparency in Facebook’s content moderation processes, especially relating to the company’s inconsistent management of high-profile accounts."