‘Encryption is deeply threatening to power’: Meredith Whittaker of messaging app Signal

Meredith Whittaker: ‘We will hold the line.’ Photograph: PR

Meredith Whittaker practises what she preaches. As the president of the Signal Foundation, she’s a strident voice backing privacy for all. But she doesn’t just spout hollow words.

In 2018, she burst into public view as one of the organisers of the Google walkouts, mobilising 20,000 employees of the search giant in a twin protest over the company’s support for state surveillance and failings over sexual misconduct.

Even now, after half a decade in the public eye, with congressional testimonies, university professorships and federal agency advisory roles under her belt, Whittaker is still firmly privacy-conscious.


It’s not unusual for business leaders to politely deflect the question when asked about their pay for the CV that accompanies these interviews, for instance. It’s somewhat less common to flatly refuse to comment on their age and family. “As a privacy advocate, Whittaker doesn’t answer personal questions that could be used to deduce her passwords or ‘secret answers’ for her bank authentication,” a staff member says after the interview. “She encourages others to follow suit!”

When she left Google, Whittaker shared a note internally that made it clear that she was committed to working on the ethical deployment of artificial intelligence and organising an “accountable tech industry”. She said: “It’s clear Google isn’t a place where I can continue this work.” That clarity, and lack of willingness to compromise, has led to Signal.

The Signal Foundation, created in 2017 with $50m in funding from WhatsApp co-founder Brian Acton, exists to “protect free expression and enable secure global communication through open source privacy technology”.


It took over development of its messaging app, also called Signal, in 2018, and Whittaker came on board in the newly created role of president in 2022 – just in time to begin defending Signal, and encryption in general, against a wave of attacks from nation states and companies around the world.

Legislation such as Britain’s Online Safety Act (OSA) and the EU’s child sexual abuse regulation contained language that could be used to ban or crack private communications, while proposals by Meta to turn on end-to-end encryption for Facebook and Instagram sparked a vicious backlash from politicians such as Priti Patel, who called the plans “catastrophic” as UK home secretary.

Those attacks are nothing new, Whittaker says when we meet in the Observer’s offices. “You can go right back to 1976, when [Whitfield] Diffie and [Martin] Hellman were trying to publish the paper that introduced public key cryptography, which is the technique that allows us to have encrypted communication over the internet that works. There were intelligence services trying to prevent them.

“Through the 80s, there’s deep unease about the idea that the NSA [US National Security Agency] and GCHQ would lose the monopoly on encryption, and by the 90s, it ends up controlled under arms treaties – this is the ‘crypto wars’. You couldn’t send your code in the mail to someone in Europe; it was considered a munitions export.”

But then the huge push to commercialise the internet forced a softening – to a point. “Encryption for transactions was enabled, and large companies got to choose exactly what was encrypted. At the same time, the Clinton administration endorsed surveillance advertising as a business model, so there was an incentive to gather data about your customers in order to sell to them.”

Surveillance, she says, was a “disease” from the very beginning of the internet, and encryption is “deeply threatening to the type of power that constitutes itself via these information asymmetries”. All of which means that she doesn’t expect the fight to end any time soon. “I don’t think these arguments are in good faith. There’s a deeper tension here, because in 20 years of the development of this metastatic tech industry, we have seen every aspect of our lives become subject to mass surveillance perpetrated by a handful of companies partnering with the US government and other ‘Five Eyes’ agencies to gather more surveillance data about us than has ever been available to any entity in human history.

“So if we don’t continue to guard these little carve-outs of privacy and ultimately extend them – we have to throw some elbows to get a bit more space here – I think we’re in for a much bleaker future than we would be if we can hold this ground, and we can expand the space for privacy and free communication.”

The criticisms of encrypted communications are as old as the technology: allowing anyone to speak without the state being able to tap into their conversations is a godsend for criminals, terrorists and paedophiles around the world.


But, Whittaker argues, few of Signal’s loudest critics seem to be consistent in what they care about. “If we really cared about helping children, why are the UK’s schools crumbling? Why was social services funded at only 7% of the amount that was suggested to fully resource the agencies that are on the frontlines of stopping abuse?”

Sometimes the criticism is more unexpected. Signal was recently dragged into the US culture wars after a failed rightwing campaign to depose the new chief executive of National Public Radio, Katherine Maher, expanded to cover Signal, where Maher sits on the board of directors. Elon Musk got involved, promoting conspiracy theories that the Signal app – which he once promoted – had “known vulnerabilities”, in response to a claim that the app “may be compromised”.

The allegations were “a weapon in a propaganda war to spread disinformation”, Whittaker says. “We see similar lines of disinformation, that often appear designed to push people away from Signal, linked to escalations in the Ukraine conflict. We believe these campaigns are designed to scare people away from Signal on to less secure alternatives that may be more susceptible to hacking and interception.”

The same technology that brings the foundation criticism has made it popular among governments and militaries around the world that need to protect their own conversations from the prying eyes of state hackers and others.

Whittaker views this as a leveller – Signal is for all.

“Signal either works for everyone or it works for no one. Every military in the world uses Signal, every politician I’m aware of uses Signal. Every CEO I know uses Signal because anyone who has anything truly confidential to communicate recognises that storing that on a Meta database or in the clear on some Google server is not good practice.”

Whittaker’s vision is singular and does not entertain distraction. Despite her interest in AI, she is wary of combining it with Signal and is critical of apps such as Meta’s WhatsApp that have introduced AI-enabled functions.

“I’m really proud we don’t have an AI strategy. We’d have to look ourselves in the face and be like, where’s that data coming from to train the models, where’s the input data coming from? How did we get an AI strategy, given that our entire focus is on preserving privacy and not surveilling people?”

Whatever the future holds in terms of technology and political attitudes to privacy, Whittaker is adamant that Signal’s principles are an existential matter.

“We will hold the line. We would rather fold as a going concern than undermine or backdoor the privacy guarantees that we make to people.”

CV

Age No comment.
Family No comment.
Education I studied literature and rhetoric at Berkeley before joining Google in 2006, where I learned the rest.
Pay No comment.