Scarlett Johansson’s OpenAI clash is just the start of legal wrangles over artificial intelligence

Scarlett Johansson said OpenAI’s ChatGPT update used a voice ‘eerily similar’ to hers. Photograph: Gareth Cattermole/Getty Images

When OpenAI’s new voice assistant said it was “doing fantastic” in a launch demo this month, Scarlett Johansson was not.

The Hollywood star said she was “shocked, angered and in disbelief” that the updated version of ChatGPT, which can listen to spoken prompts and respond verbally, had a voice “eerily similar” to hers.

One of Johansson’s signature roles was as the voice of a futuristic version of Siri in the 2013 film Her and, for the actor, the similarity was stark. The OpenAI chief executive, Sam Altman, appeared to acknowledge the film’s influence with a one-word post on X on the day of the launch: “her”.


In a statement, Johansson said Altman had approached her last year to be a voice of ChatGPT and that she had declined for “personal reasons”. OpenAI confirmed this in a blogpost but said she had been approached to be an extra voice for ChatGPT, after five had already been chosen, including the voice that had alarmed Johansson. She was approached again days before the 13 May launch, OpenAI added, about becoming a “future additional voice”.

OpenAI wrote that AI voices should not “deliberately mimic a celebrity’s distinctive voice” and that the voice in question, known as Sky and used by the new GPT-4o model, was not an imitation of Scarlett Johansson but “belongs to a different professional actress using her own natural speaking voice”.

The relationship between AI and the creative industries is already strained, with authors, artists and music publishers bringing lawsuits over copyright infringement. For some campaigners, however, the furore is emblematic of tensions between wider society and a technology whose advances could leave politicians, regulators and industries trailing in its wake.

Christian Nunes, the president of the National Organization for Women, which has spoken out on the issue of deepfakes, said “people feel like their choice and autonomy is being taken from them” by the technology, while Sneha Revanur, the founder of Encode Justice, a youth-led group that campaigns for AI regulation, said the Johansson row highlighted a “collapse of trust” in AI.

OpenAI, which has dropped Sky, wrote in another blogpost this month that it wanted to contribute to the “development of a broadly beneficial social contract for content in the AI age”. It also revealed it was developing a tool called Media Manager that would allow creators and content owners to flag their work and whether they wanted it included in training of AI models, which “learn” from a mass of material taken from the internet.


When OpenAI talks of a social contract, however, the entertainment industry is seeking something more concrete. Sag-Aftra, the US actors’ union, feels this is a teachable moment for the tech industry.

Jeffrey Bennett, the Sag-Aftra general counsel, says: “I am willing to bet there are quite a few companies out there that don’t even understand that there are rights in voice. So there is going to be a lot of education that has to happen. And we are now prepared to do that, aggressively.”

Sag-Aftra, whose members went on strike last year over a range of issues that included use of AI, wants a person’s image, voice and likeness enshrined as an intellectual property right at federal – or countrywide – level.

“We feel like the time is urgent to establish a federal intellectual property right in image, voice and likeness. If you have an intellectual property right at the federal level you can demand online platforms take down unauthorised uses of digital replicas,” Bennett says.

To that end, Sag-Aftra is backing the No Fakes Act, a bipartisan bill in the US Senate that seeks to protect performers from unauthorised digital replicas.

Chris Mammen, a partner and specialist in IP at the US law firm Womble Bond Dickinson, sees an evolving relationship between Hollywood and the tech industry.

“I think the technology is developing so rapidly, and potential new uses of the technology also being invented almost daily, there are bound to be tensions and disputes but also new opportunities and new deals to be made,” he said.

When Johansson made her comments on 20 May, she said she had hired legal counsel. It is unclear if Johansson is considering legal action, now that OpenAI has withdrawn Sky. Johansson’s representatives have been contacted for comment.

However, legal experts contacted by the Guardian believe she could have a basis for a case and point to “right of publicity” claims that can be brought under state law, including in California. The right of publicity protects someone’s name, image, likeness and other distinguishing features of their identity from unauthorised use.

“Generally, a person’s right of publicity can be deemed violated when a party uses the person’s name, image, or likeness, including voice, without his or her permission, to promote a business or product,” said Purvi Patel Albers, a partner at the US firm Haynes Boone.

Even if Johansson’s voice was not used directly, there is precedent for a lawsuit in a case brought by the singer Bette Midler against the Ford Motor Company in the 1980s, after Ford used a Midler impersonator to replicate her singing voice in a commercial. Midler won in the US court of appeals.

“The Midler case confirms that it does not have to be an exact replica to be actionable,” Albers said.

Mark Humphrey, a partner at the law firm Mitchell Silberberg & Knupp, said Johansson had “some favourable facts” such as the “her” post and the fact OpenAI approached her again shortly before the launch.

“If everything OpenAI has claimed is true, and there was no intent for Sky to sound like Ms Johansson, why was OpenAI still trying to negotiate with her at the 11th hour?” However, Humphrey added that he had spoken to people who thought Sky did not sound like Johansson. The Washington Post reported a statement from the actor behind Sky, who said she had “never been compared” to Johansson by “the people who do know me closely”.

Daniel Gervais, a law professor and intellectual property expert at Vanderbilt University, said Johansson would face an “uphill battle” even though states such as Tennessee have recently expanded their right of publicity laws to protect an individual’s voice.

“There are a few state laws that protect voice in addition to name, image and likeness, but they have not been tested. They are being challenged on a variety of grounds, including the first amendment,” he said.

As the use and capabilities of generative AI grow, so will the legal battles around it.