“When the looting starts, the shooting starts.”
Donald Trump’s inflammatory and offensive phrase, quoting a civil rights-era police chief accused of bigotry and brutality against black people, sent to the world via Twitter and Facebook, could mark a historic global turning point.
Posting these words during nationwide protests over racial injustice after the alleged murder of George Floyd, a black man, by a white police officer was deeply irresponsible.
However, for Trump, the social media president, it drew a far more significant censure: it violated Twitter’s rules against glorifying violence and so was hidden behind a warning label.
By taking the unprecedented step of limiting access to a statement by the US president, Twitter has openly broken ranks with Google and Facebook, the other two companies which dominate how information is disseminated in today’s world.
Pressure is now building on Mark Zuckerberg to remove a copy of the same statement by the president on Facebook. On Monday, Facebook employees staged a walkout, angry that the company they work for, and the founder who dominates it, have refused to take a similar stance to Twitter.
Online therapy firm TalkSpace pulled out of partnership discussions with the company, which it said was a platform that “incites violence, racism and lies”.
The following day, three civil rights leaders said after meeting with Zuckerberg that he was setting a “dangerous precedent” by allowing Trump’s post to stay online with no warning. Zuckerberg's explanations for inaction were "incomprehensible", they said.
Whether and to what extent Facebook listens to this criticism, follows Twitter’s lead and finally takes responsibility for content that it publishes, has big implications in a pivotal year for America and the world.
Zuckerberg has a chance, before a vital US election and in the midst of nationwide protests against police brutality and a global pandemic, to stem the spread of lies, misinformation and hatred that have characterised the social media era.
To do so, he will have to face up to two incongruous elements of the business models of Google and the social media platforms. First, they have always maintained that they are passive purveyors of others’ content, meaning they are not responsible for it and don’t have to pay for using it. But, second, their algorithms have for years tailored what individuals see using ever more sophisticated means and ever-growing piles of personal data.
A significant factor in this secret algorithmic sauce is “engagement”, which frequently means that posts which trigger the most emotional response, those that make us angry or fearful, or that fan the flames of our prejudices, rise to the top and are shown to more people.
By acting in this way the platforms have, by definition, ceased to be passive; they are content curators and publishers. If that is the case, and a growing number of politicians in the UK, US, EU and beyond have said that it is, then, the lawmakers argue, platforms must take responsibility for the content they publish. The fact that editorial decisions about what to show to whom are made by a computer, rather than a human being, cannot be an adequate defence.
Twitter has now gone a stage further by taking human decisions to first fact-check and then partially censor the president’s tweets.
A line has been crossed. Responsibility has been accepted. It is difficult to see how Twitter can avoid labelling future tweets containing the president’s untruths and glorifications of violence.
While America’s cities burn, and protesters are fired on with tear gas and rubber bullets, this is at least one positive development for democracies, which require an informed electorate to function.
On social media there is frequently little to no differentiation between facts, assertion and lies. This has allowed objective, provable facts to be undermined and questioned while people, including those in power, are able to posit their own “truths”, direct and unfiltered, to millions of people.
No one has taken more advantage of this than Trump. But a question remains as to why Zuckerberg would not take stronger action, given that the Facebook chief executive’s own reputation is taking a battering for not doing so.
Zuckerberg’s publicly stated position on this is hard for many people to believe. He says that Facebook can and should be able to leave up posts containing lies and misinformation (and even take money for paid political adverts that contain lies) because to do otherwise would impinge on free speech.
Roger McNamee, an early investor in Facebook who has now become a staunch critic, points out that the company’s problem is that its business model is based on monopolising users’ attention. It does this first by creating addiction through regular notifications and then by presenting us with the content our brains can’t resist feasting on.
Often, says McNamee, this content happens to be what triggers our fight-or-flight response: hate speech, lies, and things that confirm our existing beliefs, prejudices and biases. To clamp down on Trump’s comments would undermine that business model, the source of both Zuckerberg’s and Trump’s power.
Brittany Kaiser, the former Cambridge Analytica employee who worked on Trump's 2016 campaign, offers a side of the story that suggests another explanation for Mr Zuckerberg’s unwillingness to act.
In her book, Targeted: The Cambridge Analytica whistleblower’s inside story of how big data and Facebook broke democracy and how it can happen again, she recounts how Facebook and Cambridge Analytica worked closely with Trump campaign officials, targeting millions of ads at voters in key swing states.
She describes her disquiet about what she came to see as manipulation of voters on a previously unimaginable scale; a mass psychology experiment in which personal data was harvested via Facebook without valid consent and then used to target extremely tailored messages to millions of people.
Defenders of Facebook argue that this is little different to previous political campaigning methods, but the speed and precision with which technology allows people to be psychologically influenced is of a different order of magnitude to what has gone before.
Kaiser describes a process whereby thousands of adverts were tweaked slightly, perhaps a different colour, a minuscule change in phrasing, a new lie. They were then minutely targeted based on people’s age, location, gender, what groups they’ve interacted with, what posts they’ve liked, who their friends are and much more.
Campaign operatives, and Facebook staff, then observed behavioural feedback loops involving millions of voters again and again over weeks in the run-up to the election until they had honed the targeted messages most likely to push each group’s specific psychological buttons.
Given the closeness of the relationship Kaiser describes, it would be easy to conclude that Facebook has hitched its wagon to Trump and the two have a mutually beneficial relationship.
The president needs Facebook and Twitter to get his unfiltered, often inaccurate, message directly to voters. By the Washington Post’s count, Trump has made more than 18,000 false or misleading claims since the start of his presidency.
There is some evidence too that Zuckerberg feels he needs Trump and fears what a Democratic presidency would mean for his company. Last year, Zuckerberg told the Democratic senator and former presidential candidate Elizabeth Warren he would “go to the mat” to fight against her plans to break up Facebook.
Warren no longer has a shot at the presidency, but she and other notable Democrats have said they aim to crack down on the company and more generally on misinformation and extreme material circulating on social media.
Trump’s challenger Joe Biden told the New York Times in January: “I’ve never been a fan of Facebook, as you probably know. I’ve never been a big Zuckerberg fan. I think he’s a real problem.”
In the upcoming presidential election, this creates a potentially dangerous confluence of interests between Donald Trump – the most powerful man in the world and a proven liar – and Mark Zuckerberg – the man in charge of probably the most powerful platform for mass influence of opinion ever created.
That is reason enough to alarm us all. But, with pressure mounting, it is now more conceivable than ever that Zuckerberg might change tack before November’s election. If the current crisis, with its scenes of carnage: police decked out in army gear, firing tear gas and rubber bullets at protesters, egged on to violence by their own president via social media, does not cause Facebook’s chief executive to act, then perhaps – one way or another – it will be too late.