Political consultant faces $6 million fine for AI robocalls

Paul Carpenter, a New Orleans magician, describes using his computer and AI software during an interview in New Orleans, Friday, Feb. 23, 2024. Carpenter says he was hired in January by Steve Kramer, who has worked on ballot access for Democratic presidential candidate Dean Phillips, to use AI software to imitate President Joe Biden's voice to convince New Hampshire Democratic voters not to vote in the state's presidential primary.

The political consultant who used deepfake generative artificial intelligence to mimic President Joe Biden’s voice and send robocall messages two days before the New Hampshire presidential primary now faces more than two dozen criminal charges and a $6 million fine.

Steven Kramer, a longtime Democratic political operative who was working with Biden’s primary challenger, Rep. Dean Phillips, has admitted to being behind a robocall message sent to voters in New Hampshire. The robocall, in an AI-generated message that sounded like Biden’s voice, falsely insinuated that voting in the New Hampshire presidential primary would mean voters could not vote in November.

Kramer is now charged with 13 felony counts of voter suppression and 13 misdemeanor counts of impersonation of a candidate, across four counties, according to the New Hampshire Attorney General. The criminal charges against Kramer — filed in Belknap, Grafton, Merrimack and Rockingham counties — are each tied to a specific voter and allege that he “knowingly attempted to prevent or deter” each voter from voting “based on fraudulent, deceptive, misleading or spurious grounds or information.”

“Two days before the New Hampshire 2024 presidential primary election, illegally spoofed and malicious robocalls carried a deepfake audio recording of President Biden’s cloned voice telling prospective voters not to vote in the upcoming primary,” wrote the Federal Communications Commission in a statement released Wednesday.

The FCC said the fine it proposed for Kramer is its first involving generative AI technology. The company accused of transmitting the calls, Lingo Telecom, also faces a $2 million fine, although in both cases the parties could settle or further negotiate.

“We will act swiftly and decisively to ensure that bad actors cannot use U.S. telecommunications networks to facilitate the misuse of generative AI technology to interfere with elections, defraud consumers, or compromise sensitive data,” said Loyaan A. Egal, chief of the Enforcement Bureau and chair of the Privacy and Data Protection Task Force. “We thank our partners at the New Hampshire Attorney General’s Office for their help with this investigation.”

FCC Chairwoman Jessica Rosenworcel said regulators are committed to helping states go after perpetrators. In a statement, she called the robocalls “unnerving.”

“Because when a caller sounds like a politician you know, a celebrity you like, or a family member who is familiar, any one of us could be tricked into believing something that is not true with calls using AI technology,” she said. “It is exactly how the bad actors behind these junk calls with manipulated voices want you to react.”

She also said that the FCC actions were “only a start,” because “AI technologies that make it cheap and easy to flood our networks with fake stuff are being used in so many ways here and abroad.”

Shortly after New Hampshire’s primary, the agency outlawed robocalls that contain voices generated by artificial intelligence. A bill requiring disclosure of the use of AI in audio or visual political ads was passed in Utah’s 2024 legislative session and took effect May 1. The bill provides for a private right of action and penalty of $1,000 per violation.