Russia's election ad campaign shows Facebook's biggest problem is Facebook

‘In the grand tradition of self-justification, Zuckerberg has chosen to pass off expediency as principle.’ Photograph: Eric Risberg/AP

Mark Zuckerberg marked his return from paternity leave Thursday with a concerted effort to put lipstick on the pig of Facebook’s role in swaying the 2016 presidential election. In a Facebook live address from an earth-toned, glass-walled office, the chief executive laid out a series of steps the company will take to “protect election integrity and make sure that Facebook is a force for good in democracy”.

This proactive approach to a growing public relations problem is par for the course for Facebook, which tends to act only once the negative press mounts. With US lawmakers making noise about the $100,000 in Facebook ads purchased by a Russian influence operation during the election, Zuckerberg may hope that he can pre-empt regulation.

But the problem for Zuckerberg is not just that pigs don’t look good in lipstick. The problem is that more and more people are waking up to the fact that Facebook is less a little piggy than an out-of-control Tyrannosaurus rex whose creator thought he was building a fun and profitable theme park until it was too late.


Facebook did not grow into a $500bn business whose CEO’s statements have become this century’s fireside chats by being “a force for good in democracy”. Nor was it by “making the world more open and connected” (its first mission statement) and “bringing the world closer together” (its new mission statement).

Facebook grew to its current size, influence and wealth by selling advertisements. And it sold those advertisements by convincing us, its users, to provide it with incredibly intimate information about our lives so that advertisers could in turn use that information to convince us to do things.

Some advertisers want us to buy things. Some want us to attend events. And some want us to vote a certain way.

Facebook makes all that advertising cheap and easy and astonishingly profitable by cutting out the sales staff who, in a different kind of company, at a different time, would have looked at an advertisement before it ran.

So Facebook’s systems didn’t fail when they allowed shadowy actors connected to the Russian government to purchase ads targeting American voters with messages about divisive social and political issues. They worked. “There was nothing necessarily noteworthy at the time about a foreign actor running an ad involving a social issue,” Facebook’s vice-president of policy and communications, Elliot Schrage, wrote of the Russian ads in a blogpost.

Shortly after the election, Zuckerberg attempted to shrug off Facebook’s impact on Americans’ voting decisions, calling the notion that fake news swayed voters a “pretty crazy idea”. But the CEO of an advertising company can’t afford to denigrate the power of advertising too much; his revenues depend on his actual paying customers believing that Facebook can influence behavior.

Instead, Zuckerberg has constructed for himself a worldview in which every good outcome (such as 2 million people registering to vote thanks to Facebook’s advertisements) was intentional, and every bad outcome was just a mistake that could be addressed by tweaking the existing systems.

If Facebook actually wanted to restrict the ability of foreign powers to interfere with democratic elections, the obvious solution would be to implement strict standards for political ads and employ human staff to enforce them. Such a system would introduce significant friction to the company’s self-service system, however, so in the grand tradition of self-justification, Zuckerberg has chosen to pass off expediency as principle.

“We don’t check what people say before they say it, and frankly, I don’t think our society should want us to,” he said. “Freedom means you don’t have to ask permission first, and that by default you can say what you want.”

That may be true for an individual’s right to express herself, but if Facebook’s targeted, pay-per-click ad campaigns were actually about freedom, wouldn’t they be free?

Instead, the company will continue to rely on an apolitical set of “community standards” that are ill-equipped to address the complex moral and political quandaries faced by the world’s most powerful publisher, and attempt to improve artificial intelligence tools that are nowhere near intelligent enough to prevent bone-headed errors like transforming a screenshot of a rape threat into an ad for Instagram.

This should frighten us all.

Just this week, Facebook admitted that it had designated a Rohingya insurgent group as a “dangerous organization” and instructed its moderators to delete any content by or in praise of the group. That decision, which Facebook said was made by its counter-terrorism team, made sense within the strictures of Facebook’s algorithmic approach to applying “community standards”. The group, the Arakan Rohingya Salvation Army, has engaged in violent acts against Myanmar’s security forces, so Facebook will not allow it to use its powerful communications tools.

Of course, Myanmar’s military is itself engaged in what the top United Nations human rights official has described as a “textbook example of ethnic cleansing”, but attempting to apply a set of community standards to 2 billion users across the globe doesn’t leave a lot of room for nuance.

The obvious historical analogy is to imagine how Facebook would have treated Nelson Mandela and his fellow freedom fighters when they took up arms against apartheid South Africa. My guess is that it would not make for a pretty picture. The revolution will not comply with Facebook’s community standards. But if it has enough money, it may just be allowed to purchase a Facebook ad campaign.