Inside Facebook’s election 'war room' and its battle to take down fake news

A clock on the wall counts down the hours and minutes until zero day. Clusters of monitors display live television feeds and data dashboards. Headshots of important people are pinned to a noticeboard, alongside a map of the battle zone.

It could be a military command post or a police incident centre, but in fact this is Facebook's election "war room": a hastily assembled new facility at its California headquarters where 20-40 employees keep a watchful eye over the world's democratic elections.

Here, analysts and engineers sit, stand or confer with each other in an atmosphere of studious hush, watching for signs of malicious activity. One wall shows a silent video feed from another Facebook office in Washington DC, ready to become a conference call at a moment's notice; the opposite wall bears information about Brazil's general election, which will enter its second round next Sunday.

The room is staffed 20 hours a day, stepping up to 24 as polls approach, and those on duty are not allowed to take outside meetings or long lunches. Their job is to spot disinformation campaigns, whether driven by foreign spies or domestic charlatans, as they develop – and, if possible, to stop them.

There was a time when Facebook dismissed the impact of the falsehoods its algorithms helped spread. In November 2016 its chief executive, Mark Zuckerberg, said fake news was "a very small amount of the content" and called it "pretty crazy" to think it influenced the US election.

Facebook worker Erin Landers leaves the company's "War Room" during a media demonstration on October 17 Credit: NOAH BERGER/AFP

But since then, under pressure from national governments and the media, his company has changed course, rewriting its policies, hiring a dedicated investigations team and doubling the number of people working on "safety and security" from 10,000 to 20,000.

The latest stage of that effort is the war room, a converted conference room that Facebook showed off to journalists on Wednesday and which symbolises Facebook's hesitant acceptance of the global power it now wields.

"We've been hard at work for the last two years to make sure that we're far more prepared than we were in 2016," said Samidh Chakrabarti, Facebook's head of civic engagement. "We know that in an election every moment counts, and that's why we're committed to getting it right this time... everyone across the company has a deep sense of responsibility to make sure that our platforms are safe and secure fora ll elections."

The purpose of the war room, beyond showing the world that Facebook now takes elections very seriously, is speed. The staff here represent thousands of other workers in 20 departments across Facebook and its subsidiaries Whatsapp and Instagram, from threat investigators through back-end engineers to content moderators. Centralising them under one roof means they can work together quickly, even across multiple time zones. "When decision-making needs to be fast, there's no substitute for face-to-face interaction," Mr Chakrabarti admits.

A man works at his desk in front of monitors during a demonstration in the war room, where Facebook monitors election-related content on the platform Credit: Jeff Chiu/AP

First, automated systems continually scan for strange activity, such as floods of foreign political content or spikes in reports from anxious users. Much of this information is visible to war room staff in constantly updated graphs and charts. Anomalies trigger an alarm and are flagged on a "situation board".
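Facebook has not published the details of these detection systems, but the mechanism described can be illustrated with a minimal, purely hypothetical sketch: watch a metric such as user reports per minute and raise a flag when it jumps far above its recent baseline. None of the names, thresholds or figures below come from Facebook; they are assumptions for illustration only.

```python
# Hypothetical sketch of the kind of spike detection described above -- not
# Facebook's actual system, just a minimal illustration of flagging anomalies
# when a metric (e.g. user reports per minute) jumps far above its recent baseline.
from collections import deque
from statistics import mean, stdev


def make_spike_detector(window: int = 60, threshold: float = 4.0):
    """Return a function that flags a value as anomalous when it exceeds
    the rolling mean of the last `window` observations by more than
    `threshold` standard deviations."""
    history = deque(maxlen=window)

    def check(value: float) -> bool:
        is_spike = False
        if len(history) >= 10:  # need some baseline before judging
            mu = mean(history)
            sigma = stdev(history) or 1.0  # avoid division by zero on flat data
            is_spike = (value - mu) / sigma > threshold
        history.append(value)
        return is_spike

    return check


# Example: a sudden burst of reports in the final minute trips the alarm.
detector = make_spike_detector()
reports_per_minute = [20, 22, 19, 21, 23, 20, 18, 22, 21, 20, 19, 22] * 3 + [400]
for minute, count in enumerate(reports_per_minute):
    if detector(count):
        print(f"minute {minute}: {count} reports -- flag on the situation board")
```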

Trained investigators, some of whom have backgrounds in intelligence, work to determine who is spreading the disinformation and how they are doing it, while moderators decide if the content breaks Facebook's rules. If it does, it can be removed en masse using AI, along with the networks of bots and fake accounts behind it. If it does not, it may be referred to third-party fact-checkers, who have the power to cut its distribution and tag it with a health warning. 
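Purely as an illustration of that triage, the two routes a flagged story can take could be sketched as a simple decision flow: rule-breaking content is removed outright, while content that merely fails a fact-check is demoted and labelled. The type and function names below are hypothetical, not Facebook's.

```python
# Hypothetical sketch of the triage flow described above -- not Facebook's code.
from dataclasses import dataclass


@dataclass
class FlaggedStory:
    url: str
    violates_policy: bool          # e.g. voter suppression or hate speech
    fact_check_verdict: str = ""   # filled in by third-party fact-checkers


def triage(story: FlaggedStory) -> str:
    """Decide what happens to a story flagged by the war room."""
    if story.violates_policy:
        # Rule-breaking content (and the bot networks behind it) is removed.
        return "remove"
    if story.fact_check_verdict == "false":
        # Debunked but not rule-breaking: cut distribution and add a warning label.
        return "reduce_and_label"
    return "no_action"


print(triage(FlaggedStory("example.com/poll-delay-hoax", violates_policy=True)))
print(triage(FlaggedStory("example.com/misleading-meme", False, "false")))
```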

Most of this work was already happening, but the war room has given it a boost. On the day of the first round of Brazil's election, for example, Facebook staff noticed a story on the verge of going viral which falsely claimed that the polls would be delayed by one day due to protests.

That violated Facebook's policy against voter suppression, and was swiftly removed. After polls closed, there was a surge in hate speech targeted at one particular region of Brazil; again, it was detected, judged and removed in large numbers. This, Mr Chakrabarti said, happened at "unprecedented speed" – taking only "a couple of hours" where previously it might have taken "days". 

Facebook headquarters in Menlo Park, California, where the "war room" is situated Credit: Noah Berger/AFP

In these efforts, Facebook is limited by its own technology. Much of the fake news in Brazil's election travels not on Facebook but on Whatsapp, a private chat app where all messages are encrypted. That means the company cannot tell who is sharing what and cannot remove content.

Instead it relies on user reports and on AI to detect and ban spam accounts. It has cut back users' ability to forward messages to each other, as well as adding a label to such messages indicating that they are not original, but has declined to implement stronger controls. A study found that 56 per cent of the most-shared images were misleading, and only 8 per cent were "fully true"; one fake video, showing a vote for the current frontrunner Jair Bolsonaro being rejected by a voting machine, was shared by Mr Bolsonaro's own son. "How can [Facebook's efforts] be effective if we're seeing so many fake accounts?" asked one Brazilian journalist.

Facebook is also limited by its own policies. Since 2016 it has applied a range of sanctions to news that is deemed by outside fact checkers to be untrue or misleading, from restricting the revenue it generates to warning users who try to share it that it has been debunked.

Facebook also uses AI to aggressively go after bots and fake accounts, which are often behind the spreading of fake news; between October 2017 and March 2018 it blocked or removed almost 1.3 billion, mostly at the moment they were created. The strategy, Mr Chakrabarti said, is to attack "the financial incentives for creating fake news in the first place" and block its most common sources. But Facebook has continually resisted banning fake news outright, insisting that it wants to police deceptive behaviour rather than deceptive content. A recent purge of American users was justified by saying they were spammers, not liars.

Photos of Brazilian election candidates and a clock counting down to the election in Brazil are shown on a wall in the "war room" Credit: Jeff Chiu/AP

This, according to Renee DiResta, a researcher who investigates digital propaganda, is probably the right approach. "Stepping into the role of content moderation, on a platform of this size, opens [Facebook] up to allegations of bias, allegations of censorship," she told the Telegraph. Focusing on "malicious behaviour" while outsourcing questions of truth to approved media organisations helps avoid that problem. But even the measures Facebook has already taken are controversial: a list of adverts removed for being "political" included prostate cancer screenings for African-American men, free delivery from a Mexican restaurant chain and an LGBT youth prom.

Finally, Facebook is limited by its own business model. "The very serious problem we face is that the social ecosystem was designed to help like-minded people find each other and to help content go viral," said Ms DiResta. "It... gave consumers an experience they enjoyed, and also turned out to be remarkably effective for propagandists. It's very hard for platforms to fight back against adversaries using the tools to do what they were designed to do. This is why third-party partnerships and outside oversight are so important."

Facebook now has that, for its every action is closely scrutinised by journalists and academics. Its next test will be the Brazilian run-off on October 28, and then the US midterm elections on November 6. If the war room performs well, it may become a permanent fixture. But even if not, its existence shows that Facebook has finally acknowledged its self-assumed role as a global speech regulator – a role which commits it to an endless and possibly thankless fight against spies, trolls and clickbait merchants.

"This isn't going to stop after the midterms," said Katie Harbath, Facebook's director of government outreach. "This is going to be a constant arms race. This is our new normal."