Facebook moderator sues after developing PTSD from viewing disturbing content

The lawsuit says Facebook did not do enough to protect its moderators from harm - REUTERS

A former Facebook moderator is suing the company for failing to protect her from the trauma she suffered while combing through thousands of disturbing images and videos.  

Selena Scola, who worked at Facebook’s Menlo Park headquarters between June 2017 and March 2018, developed post-traumatic stress disorder (PTSD) after witnessing “acts of extreme and graphic violence”. 

The class action lawsuit claims that Facebook and its contractor Pro Unlimited created dangerous work conditions for thousands of contractors by failing to provide adequate training and counselling in defiance of its own guidelines. 


Facebook employs at least 7,500 human moderators across 50 languages, often through contractors, who manually search through millions of posts every week to check if they should be removed. The content can include images of sexual abuse and murder, terrorist videos, illegal pornography and even live broadcasts of people committing suicide.  

“It is well-documented that repeated exposure to such images can have a profoundly negative effect on the viewer,” said Korey Nelson, a lawyer acting for Ms Scola. “Facebook is ignoring its duty to provide a safe workplace and instead creating a revolving door of contractors who are irreparably traumatized by what they witnessed on the job.”

The suit alleges that Facebook ignored guidelines for dealing with traumatic content drawn up in 2015 by the Technology Coalition, an industry body which Facebook partly funds. The guidelines recommend extensive psychological screening for new employees and mandatory counselling during the job, as well as technical measures to reduce the impact of the content.

For instance, some companies pixelate or blur images under review, altering the colours to reduce their impact. Some play video without audio, or provide a decompression period.

Facebook, the suit claims, “ignored” these guidelines, leaving Ms Scola with PTSD that can be triggered when she touches a computer mouse, watches violence on TV or hears loud noises. She now wants to force Facebook and Pro Unlimited to set up an ongoing fund to cover former moderators' medical bills. 

A spokesman for Facebook said the company was reviewing the claim and that on-site counselling was available at Ms Scola's office while she worked there. 

“Every day, Facebook users post millions of videos, images, and livestreamed broadcasts of child sexual abuse, rape, torture, bestiality, beheadings, suicide, and murder.”

Lawsuit on behalf of Selena Scola

He said: “We recognise that this work can often be difficult. That is why we take the support of our content moderators incredibly seriously, starting with their training, the benefits they receive, and ensuring that every person reviewing Facebook content is offered psychological support and wellness resources.”

A blog post in July by Ellen Silver, vice president of operations, said all moderators are screened for mental resilience before hiring, given a minimum of 80 hours of training and “have access to mental health resources” as well as ongoing coaching.

But evidence from former moderators suggests these provisions are limited or relatively new. Sarah Katz, who worked as a moderator in 2016 and is now a science fiction author, told the Telegraph they were “news to [her]”.

Ms Katz said she was given minimal training, no screening and no psychological support. She also said she worked under a strict quota of one post per minute, something Ms Silver’s blog post categorically denied. 

Another moderation contractor told the Guardian last year that they were “underpaid and undervalued”, saying training and support were “absolutely not sufficient”.

In a German newspaper, a moderator who worked in Berlin described her job as a “production line” in which workers were repeatedly exposed to traumatic content without time to reflect or process it. She quit after three months when she started to feel herself becoming desensitised in everyday life.