Political consultant who admitted deepfaking Biden's voice is indicted, fined $6 million

Steve Kramer, the political consultant who admitted to NBC News that he was behind a robocall impersonating Joe Biden's voice, has been indicted in New Hampshire and fined $6 million by the Federal Communications Commission.

In separate announcements Thursday, New Hampshire's attorney general charged Kramer with 26 counts, while the FCC fined him $6 million for "scam calls he set up to defraud voters" in violation of a federal Caller ID law.

“New Hampshire remains committed to ensuring that our elections remain free from unlawful interference and our investigation into this matter remains ongoing," New Hampshire Attorney General John Formella said in a statement. "I hope that our respective enforcement actions send a strong deterrent signal to anyone who might consider interfering with elections, whether through the use of artificial intelligence or otherwise.”

The charges include 13 felony counts of voter suppression and 13 misdemeanor counts of impersonation of a candidate, based on 13 New Hampshire voters who received the calls.

The FCC also fined a telecom company allegedly involved in the call an additional $2 million on Thursday.

Kramer did not return text messages, and his spokesperson declined to comment.

The robocall, which was first reported by NBC News and went out to thousands of New Hampshire voters in January, just ahead of the state’s first-in-the-nation presidential primary, used artificial intelligence technology to deepfake Biden, telling voters to stay home and “save” their votes for the November general election.

It was the first known use of a deepfake in national American politics. It prompted outcry from officials and watchdogs and pushed the FCC to put forward a new rule banning unsolicited AI robocalls.

"[T]here is no need to travel to far-off lands to see how AI can sow confusion. Because this year in the United States a fraudulent campaign targeted voters in New Hampshire," FCC Chairwoman Jessica Rosenworcel said a statement announcing the fine. "This is unnerving. Because when a caller sounds like a politician you know, a celebrity you like, or a family member who is familiar, any one of us could be tricked into believing something that is not true with calls using AI technology."

State and federal law enforcement officials took the call seriously from the beginning, eager to send a strong message that they would not tolerate misuse of the new technology. Advocates, meanwhile, say new rules are needed.

New Hampshire law enforcement officials quickly zeroed in on two out-of-state telecom companies that they said were involved in distributing the robocall and hiding its true origins by "spoofing" recipients' caller IDs.

But the creators of the call remained unknown until a nomadic street magician came forward to NBC News.

Paul Carpenter, who holds a world record in straitjacket escapes but no fixed address, said Kramer hired him to create the audio of Biden's voice used in the call. He provided screenshots of text messages and Venmo transactions to corroborate his account.

Confronted with the evidence, Kramer admitted that he commissioned the call, but he insisted he did it only to prompt stricter regulations of AI deepfakes.

He compared himself to American Revolutionary hero Paul Revere to argue he was merely sounding the alarm about coming danger.

“This is a way for me to make a difference, and I have,” he said, adding that he was not worried about potential legal repercussions. “I can tell you they’re not used to me. I wrestled in college.”

Kramer is a veteran get-out-the-vote consultant who has worked mainly for Democrats, especially in New York. At the time, he had a six-figure contract with the campaign of Rep. Dean Phillips, D-Minn., who was running a long-shot primary challenge to Biden.

Kramer and the Phillips campaign both adamantly denied that the campaign had any knowledge of the robocall or directed him to create it.

Phillips dropped out of the presidential race shortly after his poor performance in the New Hampshire primary.

There are widespread fears that deepfakes, in which AI is used to impersonate someone, will become a larger part of political campaigns and of society in general.

This week, actor Scarlett Johansson accused OpenAI of imitating her voice without her authorization as part of a new product launch. The company denied the accusation but removed the voice.

This article was originally published on NBCNews.com