Facebook says it will ban QAnon across its platforms

Alex Wilhelm and Taylor Hatmaker
NEW YORK, NY - OCTOBER 03: A person wears a QAnon sweatshirt during a pro-Trump rally on October 3, 2020 in the borough of Staten Island in New York City. The event, which was organized weeks ago, encouraged people to vote Republican and to pray for the health of President Trump who fell ill with Covid-19. (Photo by Stephanie Keith/Getty Images)

Facebook expanded a ban on QAnon-related content on its various social platforms Tuesday, deepening a previous prohibition on QAnon-related groups that had "discussed potential violence," according to the company.

Today's move by Facebook to not only ban violent QAnon content but "any Facebook Pages, Groups and Instagram accounts representing QAnon" is an escalation by the social giant to clean its platform ahead of an increasingly contentious election.

QAnon is a sprawling set of interwoven pro-Trump conspiracy theories that has taken root inside swaths of the American electorate. Its more extreme adherents have been charged with terrorism after acting out in violent and dangerous ways, spurred on by their adherence to the unusual and often incoherent belief system. BuzzFeed News recently decided to call QAnon a "collective delusion," another apt label for the theory's inane, fatuous and dangerous beliefs.

Facebook's effort to rein in QAnon is helpful, but likely too late. Over the course of the last year, QAnon swelled from a fringe conspiracy theory into a shockingly mainstream political belief system — one that even has its own Congressional candidates. That growth was powered by social networks inherently designed to connect like-minded people to one another, a feature that has been found time and time again to spread misinformation and usher users toward increasingly radical beliefs.

In July, Twitter took action of its own against QAnon, citing concerns about "offline harm." The company downranked QAnon content, removing it from trending pages and algorithmic suggestions. Twitter's policy change, like Facebook's previous one, stopped short of banning the content outright but did move to contain its spread.

Other companies, like Alphabet's YouTube product, have come under similar censure by external observers. (YouTube says it reworked its algorithm to better filter out the darker shores of its content mix, but the results of that experiment are far from conclusive.)

Social platforms like Facebook and Twitter have also changed their rules after being confronted, ahead of an election, with a willfully mendacious administration that has propagated lies and disinformation about voting security and about the virus that has killed more than 200,000 Americans. The pair's work to limit those two particularly risky strains of misinformation is worthy, but by taking a reactive posture instead of a proactive one, most of those policy choices have come too late to control the viral spread of dangerous content.

Facebook's new rule comes into force today, with the company saying in a release that it is now "removing content accordingly," but that the effort to purge QAnon "will take time."

What drove the change at Facebook? According to the company, after it yanked violent QAnon material, it saw "other QAnon content tied to different forms of real world harm, including recent claims that the west coast wildfires were started by certain groups." In Oregon, where forest fires recently raged, misinformation on the Facebook platform convinced some state residents that antifa — a label for those opposed to fascism, wielded here as a pejorative — was torching the state. Acting on the unfounded rumors, misinformed residents set up illegal roadblocks and interrogated people passing through their areas.

How effective Facebook will be at clearing QAnon-related content from its various platforms is not yet clear, but it is something we'll be tracking.