The fake AI Scarlett Johansson is a reality check for Washington

A public fight between Scarlett Johansson and OpenAI steers straight into a live debate in Washington about one of the most unnerving uses of artificial intelligence. But it’s not clear that even an A-list celebrity can shake loose Washington’s perennially stuck approach to Big Tech.

Johansson said she was “shocked, angered and in disbelief” that a new artificial voice named “Sky” used by OpenAI bore such a close resemblance to her own. In a statement, she threatened legal action.

OpenAI denied that Sky was a copy of Johansson, but paused use of the voice. “We are sorry to Ms. Johansson that we didn’t communicate better,” OpenAI CEO Sam Altman said in a statement provided to POLITICO.

Altman and his company have emerged in the last year as flag-carriers for the AI industry in Washington, retailing a vision for the future that has helped shape the capital’s approach to AI regulation.

But the new argument between a high-profile actor and the highest-profile AI company is triggering worries about the often reckless approach the tech industry can take to innovation. It’s also elevating a particular concern that has been worrying lawmakers since generative AI landed on the public radar: AI’s capacity to replicate actual people without their consent.

A bipartisan bill called the NO FAKES Act, drafted last year, would allow people to sue the creators and distributors of unauthorized AI-generated digital replicas. Lawmakers introduced a similar House bill, called the NO AI FRAUD Act, earlier this year with the authors promising to “fine-tune” it after a field hearing in Los Angeles.

The Johansson case underscores the “frankly disturbing threat” of unauthorized AI ripoffs, Sen. Chris Coons (D-Del.), who co-sponsored the NO FAKES draft, wrote to POLITICO. He has said he plans to introduce the bill in the Senate next month.

Despite concerns raised by Johansson, lawmakers, other performers and actors unions, it’s not clear that the bills have any kind of path to the floor of either chamber. The Senate’s major roadmap for AI legislation, released last week, raised the question of “whether there is a need for legislation that protects against the unauthorized use of one’s name, image, likeness, and voice, consistent with First Amendment principles, as it relates to AI” — but did not propose any solutions.

One of the key leaders on Senate AI policy, Sen. Todd Young (R-Ind.), who worked alongside Majority Leader Chuck Schumer to develop the roadmap, said he had not seen the draft NO FAKES Act, but that “it needs to pass out of committee before the floor takes it up … Let’s allow regular order to work.”

As for Johansson’s case, Young told POLITICO it would need to be litigated in court.

The worry about fake voices has attracted attention beyond Congress. The Federal Trade Commission in February finalized a rule banning the impersonation of government agencies and businesses, and proposed new protections against AI impersonation of individuals. The agency declined to comment on Johansson’s allegations.

Biden’s sprawling October executive order does not directly mention the likeness issues raised by AI, but taps a handful of agencies, including the U.S. Patent and Trademark Office, to eventually issue guidance that addresses the numerous copyright issues miring the technology.

As much headline attention as Johansson’s complaint has attracted, advocates had little hope it could break the persistent logjam on tech regulation — in which companies for years have avoided new rules, even after high-profile incidents.

Robert Weissman, president of the consumer rights advocacy group Public Citizen, said he was skeptical Johansson’s experience would change the landscape. “We've had high-profile examples — we had the Taylor Swift deepfake,” he said, referring to an AI-generated, sexually explicit image of Swift that circulated in January, “we had the Joe Biden fake call. It's not clear that those kinds of moments are enough to break through Congress's strong inertia.”

Outside the capital, the issue has become a key point of tension in California, where lawmakers have moved faster on tech issues than in Washington — and are being lobbied by representatives of two influential power bases: Silicon Valley and Hollywood.

A year after AI concerns helped drive a prolonged Hollywood strike, the SAG-AFTRA union representing entertainers is now sponsoring a California bill limiting the use of digital likenesses.

The California debate has produced some bizarre conversations. Another bill would penalize unauthorized AI recreations of dead celebrities, enshrining what the bill’s author, California Assemblymember Rebecca Bauer-Kahan, called “the right to not be reanimated without their consent.”

Both bills have drawn industry opposition, with major tech firms and the Motion Picture Association warning they would suppress free speech and lead to costly court fights.

A SAG-AFTRA spokesperson wrote in a statement to POLITICO that the union shares the actor’s concerns. “We thank Ms. Johansson for speaking out on this issue of crucial importance to all SAG-AFTRA members,” the union wrote.

The union has also supported the federal NO FAKES bill, while the Motion Picture Association, representing studios, warned it could be overbroad.

In a hearing in Congress last month, MPA senior vice president Ben Sheffner said it would take “very careful drafting to accomplish the bill's goals without inadvertently chilling or even prohibiting legitimate, constitutionally protected uses of technology to enhance storytelling.”

Not everyone who has been cloned by AI is an actor — and not everyone has a problem with it. Psychologist Martin Seligman said he saw benefit when a former student turned his works into a chatbot that sounds like him.

In an interview last year, he said: “It is enchanting to me that what I’ve written about and discovered over 60 years in psychology could be useful to people long after I was dead.”

Christine Mui and Jeremy White contributed to this report.