The former Facebook employee has provoked an onslaught of criticism of Mark Zuckerberg’s company by releasing tens of thousands of internal Facebook documents outlining the firm’s failure to keep harmful content off its platforms (as well as its eponymous social network, Facebook owns Instagram, Facebook Messenger and WhatsApp). She has already testified to US senators this month at a hearing in which she accused the company of putting “astronomical profits before people”.
Haugen will be questioned at 2pm by MPs and peers scrutinising the online safety bill, which imposes a duty of care on social media companies to protect users from harmful content; those that fail face substantial fines levied by the communications regulator, Ofcom. Boris Johnson has promised to fast-track the bill.
Here are some questions that Haugen could be asked on Monday.
How will Facebook cope with the online safety bill requirements?
The online safety bill splits the duty of care into three areas: preventing the proliferation of illegal content and activity such as child pornography, terrorist material and hate crimes such as racial abuse; ensuring children are not exposed to harmful or inappropriate content; and, for the big players such as Facebook and Twitter, ensuring that adults are protected from legal but harmful content. Is Facebook capable of meeting that duty of care, given that a central claim of Haugen’s Senate testimony, and of the leaked documents, is that Facebook struggles to protect people from harmful content on its platforms?
Should tech executives face criminal sanctions?
Boris Johnson sent shivers of fear through Silicon Valley last week when he said: “We will have criminal sanctions with tough sentences for those who are responsible for allowing this foul content to permeate the internet.” Government sources have since rowed back on this – the bill merely holds in reserve the option of criminal liability for executives who do not cooperate adequately with Ofcom – but the issue will continue to be debated. In an interview with the Observer at the weekend, Haugen preferred to focus on the systems that social media companies (and other companies within the scope of the bill, such as YouTube and TikTok) will be required to put in place to eradicate harmful content.
Should Facebook be compelled to release its internal research?
The joint committee has heard warnings that requiring social media companies to release their internal research will simply result in them scrapping their research departments. Haugen believes there should at least be a formal arrangement between Facebook and regulators whereby any concern about its platforms is immediately dealt with. One solution could be to make as much Facebook data as possible available – once privacy issues have been taken into account – to third parties such as academics and journalists, who would then publish their findings. Twitter is working on a system that would make large datasets available to independent researchers.
What can be done about algorithms?
Algorithms are used widely by social media companies to tailor the content that users see. Last week, Twitter admitted its algorithm amplified rightwing politicians over leftwing ones, and that it did not fully know why. The internal documents released by Haugen have shown that a change to the algorithm behind Facebook’s news feed – a key part of how users interact with the platform – led to more sharing of divisive content and misinformation. Haugen is calling on Facebook to make its news feed chronological – simply ordering posts by when they were posted – although the company says that option is already available (it is also working on ways to make the feature easier to find). The draft bill already contains a provision allowing Ofcom to inspect algorithms. Does that provision go far enough?
Can Facebook be trusted?
Haugen has made clear that the answer to this is “no”, unless Facebook undergoes substantial reform. She released tens of thousands of documents because she believed Facebook was not doing enough to tackle the harms that were being committed on its platforms, which have 2.8 billion daily users. She believes that Mark Zuckerberg, the Facebook founder, chief executive and controlling shareholder, is a barrier to change. “He has not demonstrated that he is willing to govern the company at the level that is necessary for public safety.”
Facebook says it has a commercial and moral incentive to make the platform a positive place. “Yes, we’re a business and we make profit, but the idea that we do so at the expense of people’s safety or wellbeing misunderstands where our own commercial interests lie.”