Facebook whistleblower says riots and genocides are the ‘opening chapters’ if action isn’t taken

The Facebook whistleblower Frances Haugen has said that events such as the 6 January US Capitol riot and genocides in Myanmar and Ethiopia are the “opening chapters” of worse to come if action is not taken against the social media company.

Ms Haugen gave the warning while giving evidence to the parliamentary joint committee scrutinising the government’s draft Online Safety Bill.

“Engagement-based ranking prioritises and amplifies divisive, polarising content”, Ms Haugen said, adding that the company could make non-content-based changes that would each shave perhaps half a percentage point off growth, but “Facebook is unwilling to give up those slivers for our safety”.

The “opening chapters” of this “novel”, Ms Haugen said, will be “horrific” to read in both the global south and in western societies.


Her evidence comes after a cache of internal documents about research Facebook conducted was shared with a variety of media publications.

It has been revealed that Facebook lacked misinformation classifiers in Myanmar, Pakistan, and Ethiopia – countries designated at highest risk last year.

Countries such as Brazil, India and the United States were placed in “tier zero”, with “war rooms” monitoring those regions continuously.

“Facebook never set out to prioritise polarising content, it just rose as a side effect of priorities it did take”, Ms Haugen said. She also emphasised the need for local languages and dialects to be supported.

“UK English is sufficiently different that I would be unsurprised if the safety systems that they developed, primarily for American English, would be underenforced in the UK”, she said.

The difference between the systems in the United States and those in Ethiopia is stark. In the US, Facebook deploys a huge range of protections for public discourse, such as artificial intelligence systems that detect hate speech in memes and respond to hoaxes and incitement to violence in real time.

Ethiopia, however, does not have the company’s community standards translated into all of its official languages, and there are no machine-learning classifiers or fact-checkers available to manage content there.

“Ethiopia has a hundred million people and they speak six languages. Facebook only supports two,” Ms Haugen pointed out. While Facebook may be removing some of the most dangerous terms from engagement-based ranking, she said, the company is not doing enough for the most vulnerable places in the world.

Ms Haugen said that inside Facebook “there is a culture that lionises a start-up ethic that, in my opinion, is irresponsible”, and that she had “no idea” whom to flag her concerns to within the company, because addressing them could pose a risk to growth.

Facebook said in 2018 that it agreed with an independent report it commissioned that said it had failed to prevent its platform being used to “incite offline violence” in Myanmar.

The report said Facebook’s platform had created an “enabling environment” for the proliferation of human rights abuse, which culminated in violence against the Rohingya people (a stateless Muslim minority) that the UN said may amount to genocide.

Even in the United States, during the 6 January riot, many of the interventions Facebook could have applied to speech being used to coordinate the storming of the Capitol were “still off at 5pm Eastern Time”, Ms Haugen said, underlining what she alleged is the company’s inaction even when democracy is at risk in its home country.

Frances Haugen giving evidence to the joint committee for the draft Online Safety Bill (UK Parliament 2021/Annabel Moeller)

Moving to human-scaled systems, in which people rather than algorithms decide where attention goes, is the safest course of action, she claimed.

“We liked social media before we had an algorithmic feed”, Ms Haugen said. “Slowing the platform down, agnostic strategies, human-scale solutions. That’s the direction we need to go”.

Ms Haugen alleged that Facebook does not take these actions because it is unwilling to accept users spending less time on the platform, or lower revenue – allegations that have also been made in previous leaks.

“Facebook has a strategy of only slowing the platform down once a crisis has begun ... rather than as the temperature gets hotter and making the platform safer as it happens”, Ms Haugen said, describing it as a “break-the-glass measure”.

These measures have nothing to do with the content itself; they concern the “little questions” on which Facebook optimises its algorithm for growth over safety.

Many of these problems, across all social media sites, stem from engagement-based ranking, Ms Haugen said, because such systems depend on AI to weed out extremist content.

Any tech company with a large societal impact needs to share information with the public, Ms Haugen said. She contrasted the SEC whistleblower protections she had because Facebook is a publicly traded company with the situation at TikTok, which is privately held.

“You can’t take a college class today to understand these systems inside of Facebook”, she said. “The only people who understand it are inside Facebook.”

Facebook should also be forced to publish what integrity systems exist, Ms Haugen said, claiming that a government source said that Facebook does not track self-harm content.

Self-harm content has been a persistent danger on Facebook-owned platforms such as Instagram, with internal research suggesting the company was aware that its algorithms made some young girls feel worse about their body image.

“TikTok is about doing fun activities with your friends. Snapchat is about faces and augmented reality. Reddit is at least vaguely about ideas. But Instagram is about social comparison”, Ms Haugen said.

While many children had “good homes to go to” when she was younger, the bullying now “follows them home”, she continued.

“There is no will at the top ... to make sure these systems are run in an adequately safe way, and until we bring in a counterweight, things will be operated in the shareholders’ interest, not in the public interest.”

Ms Haugen also commented on anonymity and “real name” policies, saying that Facebook already knows so much about its users that the incremental benefits of requiring real names on social media do not measure up against the harm it would cause groups such as domestic abuse survivors.

“The much more scalable, effective solution is thinking about how is content distributed on these platforms”, she said, adding that there is a “great asymmetry” between the parts of the company used to push growth and the parts of the company designed to keep people safe.

On end-to-end encryption, which keeps messages secure from snoopers and underpins apps such as WhatsApp and Signal, Ms Haugen said Facebook’s plans are worrying not because the messages would be encrypted but because the code is not open source – meaning outsiders cannot inspect it for vulnerabilities and other issues.

“I am not against end-to-end encryption in Messenger, but I do think the public has a right to know [what that means]”, Ms Haugen said, “and I personally don’t trust Facebook to tell the truth.”

Facebook’s definition of what counts as end-to-end encrypted could differ from the definition set by an open-source app, Ms Haugen said, and that could put people at risk, especially because Facebook’s platforms include a “directory where you can find 14-year-old girls”.

“Contrary to what was discussed at the hearing, we’ve always had the commercial incentive to remove harmful content from our sites. People don’t want to see it when they use our apps and advertisers don’t want their ads next to it”, a Facebook spokesperson said. “While we have rules against harmful content and publish regular transparency reports, we agree we need regulation for the whole industry so that businesses like ours aren’t making these decisions on our own.”

In July 2020, a leaked recording of Mark Zuckerberg following an advertising boycott over racist content on Facebook revealed that advertisers’ concerns mattered little to the CEO. “We’re not gonna change our policies or approach on anything because of a threat to a small percent of our revenue, or to any percent of our revenue,” Mr Zuckerberg said.

“We make policy changes based on principles, not revenue pressures,” a Facebook spokesperson said in response, though they confirmed the recording was accurate.
