Here’s Why Other Celebrities Could Face Problems With AI Voice Cloning—Not Just Scarlett Johansson

OpenAI could face a lawsuit from actress Scarlett Johansson after she claimed the ChatGPT maker’s now-pulled chatbot voice, Sky, sounded eerily similar to her own — the latest legal issue raised by artificial intelligence, and one experts told Forbes could easily affect anyone now that technology has made voice cloning easier.

Experts told Forbes the legal issues at stake in Johansson’s row with OpenAI are neither unique to AI nor particularly modern, pointing to similarities with successful voice impersonation and “soundalike” lawsuits singers Bette Midler and Tom Waits brought against Ford and Frito-Lay in the 1980s.

Johansson could potentially bring a number of claims against OpenAI under existing laws, Foley & Lardner partner Jeffrey Greene and associate Arian Jabbary explained in an email, including a right to privacy, copyright, false endorsement and the right to publicity, which protects individuals from having identifying features like their name or likeness used without their permission and is particularly strong in California.

Though potentially illegal, it is absolutely possible “other celebrities could potentially be targeted,” Greene and Jabbary said, adding that if so, they would likely face the same or similar issues as Johansson.

Advances in technology allowing characteristics like voices to be replicated with greater ease and accuracy have exacerbated the impersonation problem, and Tiffany Li, a law professor at the University of San Francisco, said the issue extends well beyond Hollywood.

“Anyone could be targeted by others trying to clone their voices using AI… not just celebrities” or public figures, Li warned, pointing to the proliferation of AI phone scams that clone people’s voices as a good example and urging lawmakers to ensure laws are updated “to keep pace with modern technology.”

Greene and Jabbary said Johansson’s case “underscores the growing need to establish clearer laws governing the use of an individual’s voice, image, and other personal or identifiable information,” with laws until now hampered by being “piecemeal or state specific.”

On Monday, OpenAI pulled its chatbot voice, Sky, after concerns it sounded similar to Johansson. The actress, who said she was “shocked” by the AI voice assistant, claims Sky so closely resembled her own voice that even close friends believed it was her. Members of the public widely made the connection as well, and Johansson says OpenAI approached her twice, unsuccessfully, asking her to voice the assistant. OpenAI CEO and cofounder Sam Altman has previously said “Her” is one of his favorite movies and cryptically posted “her” on X after the ChatGPT update and voices were revealed. Johansson lent her voice to the role of a virtual assistant in the film. OpenAI itself has acknowledged the vocal similarities between Sky and Johansson but stressed the voice “is not an imitation” and belongs to “a different professional actress using her own natural speaking voice,” with any similarity unintentional.


Experts told Forbes there are many open questions in the case, and the outcome of a possible lawsuit — one has not been filed — would largely depend on information that is not yet available, public or verified. Stanford law professor Mark Lemley told Forbes Johansson appears to have a “pretty strong case” against OpenAI under a right of publicity claim, citing “the effort to imitate not just Johansson but the AI voice from ‘Her,’ coupled with their repeated efforts to hire Johansson and Altman’s tweet specifically referencing ‘Her.’” Greene and Jabbary stressed that while any outcome would depend on applicable legal principles and the court’s interpretation of the facts, OpenAI’s overtures are suggestive of some sort of connection between Sky and Johansson: “OpenAI wouldn’t have sought her consent had they not believed the voice was modeled after hers.”


“I think it’s important to note that this is something that can happen to anyone, but not everyone has the power and resources of a famous celebrity to fight back legally,” Li said. “That’s why we need legal protections—to protect the rest of us.”


Lemley said he does not believe new laws are needed to protect people from imitation in cases like this. “I think existing laws make this illegal, so I’m not sure we need a new law,” he said. However, Lemley said Congress is “actively considering a federal right of publicity like that in California to address the problem.” Looking ahead, he said a case like this “is only the beginning of the legal issues around soundalikes in AI,” pointing to songs purporting to be from Drake or The Weeknd.
