Data scientist Frances Haugen, 37, a former Facebook product manager with its civic misinformation team, is testifying before a Senate subcommittee on Tuesday to discuss the complaints she has made to the US Securities and Exchange Commission (SEC) regarding bad practice at the social networking giant.
Ms Haugen is expected to address her allegations concerning the company’s behaviour surrounding the 2020 US presidential election, its approach to hate speech and misinformation, and the impact of its lifestyle app Instagram on the mental health of young women, among other issues.
Her appearance comes a day after Facebook, Instagram and WhatsApp all suffered major worldwide outages, leading the likes of Digital Content Next CEO Jason Kint to speculate that the company might have pulled the plug deliberately as a distraction from events set to unfold in Washington, DC.
“If I were Facebook and knew these eight SEC whistleblower complaints were being published today, I sure as hell would take down my own apps to distract the press,” Mr Kint said in a tweet. “They are deadly - way beyond what we've seen so far.”
Ms Haugen, who left the Big Tech giant in March this year, appeared on CBS’s 60 Minutes on Sunday evening to reveal that she was the whistleblower behind the leak of thousands of pages of potentially damaging internal documents from Facebook to The Wall Street Journal.
In an interview with veteran anchor Scott Pelley, the analyst discussed her efforts to expose what she considers to be cause for concern at Facebook and to accuse the company of prioritising inflammatory content it knows will generate engagements over public wellbeing and “paying for its profits with our safety”.
Her previous allegations, published anonymously in the WSJ in recent weeks, included that celebrities, politicians and other high-profile users were treated differently by the site from ordinary people and exempted from some moderation protocols under a system known as “XCheck”; that the company’s responses to tip-offs regarding human traffickers and drug cartels using its pages were often lacklustre; and that Facebook actively engaged in targeted self-promotion under “Project Amplify”.
She also revealed that the company is involved in a lawsuit with a group of its own shareholders who allege that its £3.65bn payment to the US Federal Trade Commission to resolve the Cambridge Analytica data scandal was so steep because it was designed to protect founder Mark Zuckerberg from personal liability.
Perhaps Ms Haugen’s most shocking accusation was that Facebook’s own research had revealed that Instagram was harmful to the mental health and self-esteem of teenage girls but that it had failed to take steps to address the problem, an issue set to be of particular focus in the upper chamber of Congress.
According to the company documents she obtained, an internal study found that 32 per cent of young women who said they felt bad about their bodies admitted to feeling worse after logging in to Instagram.
The analyst, who has previously worked at other Silicon Valley big beasts such as Google, Pinterest and Yelp, told 60 Minutes that although she did not believe Mr Zuckerberg had set out to create a negative space: “The version of Facebook that exists today is tearing our societies apart and causing ethnic violence around the world.”
“The thing I saw at Facebook over and over again was there were conflicts of interest between what was good for the public and what was good for Facebook,” she said. “And Facebook, over and over again, chose to optimise for its own interests, like making more money.”
Ms Haugen said during the programme that she believed changes made to the algorithm dictating what appears on a user’s newsfeed in 2018 had seen the company pivot towards prioritising content designed to provoke a strong emotional reaction.
“Its own research is showing that content that is hateful, that is divisive, that is polarising, it’s easier to inspire people to anger than it is to other emotions,” she told CBS.
She also alleged that Facebook turned on safety systems during the 2020 US presidential election in order to curtail the spread of misinformation but “as soon as the election was over, they turned them back off or they changed the settings back to what they were before, to prioritise growth over safety”, enabling some of those who staged the 6 January Capitol Riot in Washington, DC, to organise via Facebook, among other platforms.
Before the programme aired, the company’s vice president of global affairs (and former Liberal Democrat leader) Nick Clegg issued an internal memo in which he wrote: “What evidence there is simply does not support the idea that Facebook, or social media more generally, is the primary cause of polarisation.”
Mr Clegg subsequently appeared on CNN on Sunday to repeat his view that it was “ludicrous” to suggest that social media was to blame for the assault on the Capitol by supporters of Donald Trump seeking to overturn the 45th president’s defeat at the ballot box.
Also responding to CBS was Lena Pietsch, Facebook’s director of policy communications, who said in a statement: “Every day our teams have to balance protecting the rights of billions of people to express themselves openly with the need to keep our platform a safe and positive place. We continue to make significant improvements to tackle the spread of misinformation and harmful content. To suggest we encourage bad content and just do nothing is just not true.
“If any research had identified an exact solution to these complex challenges, the tech industry, governments, and society would have solved them a long time ago.”
The Independent reached out to Facebook for further comment and was directed to Ms Pietsch’s earlier statement.
Yael Eisenstat, another former employee turned vocal critic of the social network, told Vox that Ms Haugen’s revelations were “a big moment” for the company.
“For years, we’ve known many of these issues - via journalists and researchers - but Facebook has been able to claim that they have an axe to grind and so we shouldn’t trust what they say. This time, the documents speak for themselves.”