Revealed: catastrophic effects of working as a Facebook moderator

Photograph: Valentin Wolf/ImageBroker/Rex/Shutterstock

The task of moderating Facebook continues to leave psychological scars on the people who do it, months after efforts to improve conditions for the company’s thousands of contract moderators, the Guardian has learned.

A group of current and former contractors who worked for years at the social network’s Berlin-based moderation centres has reported witnessing colleagues become “addicted” to graphic content and hoarding ever more extreme examples for a personal collection. They also said others were pushed towards the far right by the amount of hate speech and fake news they read every day.

They describe being ground down by the volume of the work, numbed by the graphic violence, nudity and bullying they have to view for eight hours a day, working nights and weekends, for “practically minimum pay”.


A little-discussed aspect of Facebook’s moderation was particularly distressing to the contractors: vetting private conversations between adults and minors that had been flagged by algorithms as likely sexual exploitation.

Such private chats, of which “90% are sexual”, were “violating and creepy”, one moderator said. “You understand something more about this sort of dystopic society we are building every day,” he added. “We have rich white men from Europe, from the US, writing to children from the Philippines … they try to get sexual photos in exchange for $10 or $20.”

Gina, a contractor, said: “I think it’s a breach of human rights. You cannot ask someone to work fast, to work well and to see graphic content. The things that we saw are just not right.”

The workers, whose names have been changed, were speaking on condition of anonymity because they had signed non-disclosure agreements with Facebook. Daniel, a former moderator, said: “We are a sort of vanguard in this field … It’s a completely new job, and everything about it is basically an experiment.”

John, his former colleague, said: “I’m here today because I would like to avoid other people falling into this hole. As a contemporary society, we are running into this new thing – the internet – and we have to find some rules to deal with it.

“It’s important to create a team, for example in a social network, aiming to protect users from abusers, hate speech, racial prejudice, better pornographic software, etc. But I think it’s important to open a debate about this job. We need to share our stories, because people don’t know anything about us, about our job, about what we do to earn a living.”

Some of the moderators’ stories were similar to the problems experienced in other countries. Daniel said: “Once, I found a colleague of ours checking online, looking to purchase a Taser, because he started to feel scared about others. He confessed he was really concerned about walking through the streets at night, for example, or being surrounded by foreign people.

“Maybe because all this hate speech we have to face every day affects our political view somehow. So a normal person, a liberal person, maybe also a progressive person, can get more conservative, more concerned about issues like migrants for example. Indeed, many of the hate speech contents we receive on a daily basis are fake news … which aim to share very particular political views.”

In February, the technology site the Verge produced one of the first behind-the-scenes reports from a US Facebook contractor. Similar to their Berlin colleagues, the Americans reported that “the conspiracy videos and memes that they see each day gradually led them to embrace fringe views”, and that a former moderator “now sleeps with a gun at his side” after he was traumatised by a video of a stabbing.

Others were dealing with trauma by self-medicating. Just as the Arizona moderators were reportedly turning to drugs and alcohol, so were those in Germany. “I saw a lot of big consumer drugs in the company,” Daniel said. “We don’t have any way to destress. The company, technically, is against drugs.”

When trying to go down a more legitimate route of self-help, the American moderators complained about the psychological help that was provided. “The on-site counsellors were largely passive,” the Verge reporter Casey Newton wrote, “relying on workers to recognise the signs of anxiety and depression and seek help.”

Berlin moderators were also critical of the counselling services provided, suggesting the in-house support leaned too heavily on referring workers out to the state’s universal healthcare system.

Daniel said: “In the end, we didn’t have proper psychological support. We had some colleagues who went to the [counsellor], and when they showed that they had real problems, they were invited to go outside the company and find a proper psychologist.”

The Verge report appeared to trigger reforms. Moderators in Berlin said that, after the article was published, there was immediate interest from Facebook’s head office in their workload. Previously, they had been required to moderate 1,000 pieces of content a day – more than one every 30 seconds over an eight-hour shift.

In February, an official from Facebook’s Dublin office visited, John said. “This person after this meeting decided to take off the limit of 1,000. We didn’t have any limit for a while, but now they have re-established another limit. The limit now is between 400 and 500 tickets.” The new cap – or number of tickets – was half the previous one, but still required workers to handle about a ticket a minute. Even so, that volume of work was what their American colleagues had faced before the reforms.

Berlin moderators have discussed whether to seek help from the unions, but say the nature of the work makes it difficult. Gina said: “I wouldn’t say no one is interested, but no one has the possibility to do something for real.”

John added: “They are so tired.”

While the moderators agreed such work was necessary, they said the problems were fixable. Daniel said: “I think it’s important to open a debate about this job,” adding that the solution was simple – “hire more people”.

In a statement, Facebook said: “Content moderators do vital work to keep our community safe, and we take our responsibility to ensure their wellbeing incredibly seriously. We work closely with our partners to ensure they provide the support people need, including training, psychological support and technology to limit their exposure to graphic content.

“Content moderation is a new and challenging industry, so we are always learning and looking to improve how it is managed. We take any reports that our high standards are not being met seriously and are working with our partner to look into these concerns.”

• In the UK and Ireland, Samaritans can be contacted on 116 123 or email jo@samaritans.org or jo@samaritans.ie. You can contact the mental health charity Mind in the UK by calling 0300 123 3393 or visiting mind.org.uk. In the US, the National Suicide Prevention Lifeline is 1-800-273-8255. In Australia, the crisis support service Lifeline is 13 11 14. Other international helplines can be found at www.befrienders.org.