RI lawmakers move to ban political 'deepfakes' ahead of elections. What that means.

PROVIDENCE – To people of a certain age, the phrase – "Is it live or is it Memorex?" – needs no explanation.

Memorex famously claimed that its cassette recording of Ella Fitzgerald hitting a high note was so good it could break a glass, just as her live singing would. And no one would know the difference.

With the advent of artificial intelligence (AI), that question – is it real or is it fake – has leapt from the advertising sphere to the campaign sphere with a potential so frightening to some Rhode Island legislators that they have introduced a bill to ban what they call "deceptive and fraudulent synthetic media" in the 90-day run-up to any election.

Modeled on a Washington state law, their bill is up for a committee vote on Tuesday on its way to a full House debate.

What the bill does

The legislation [H7487] defines "synthetic media" as "an image, an audio recording, or a video recording of an individual's appearance, speech, or conduct that has been intentionally manipulated ... [with] digital technology to create a realistic but false image, audio, or video."

The legislation would not only ban "deepfakes," it would give a candidate who felt wronged the right to seek an injunction and damages in court. The exception to the ban: if the spot contains a clearly written or spoken disclosure that the image "has been manipulated or generated by artificial intelligence."

Why is the bill needed?

Secretary of State Gregg Amore told legislators at a hearing late last month that so-called deepfakes have been used to deceive the public about statements and actions taken by political leaders in the run-up to elections, "when there is not sufficient time for candidates to debunk these mistruths before voters head to the polls."

A recent example, he said, was the falsified Biden robocall in New Hampshire, in which a manipulated version of Biden's voice told voters to stay home and not vote in the New Hampshire primary.

Secretary of State Gregg Amore gives Classical High students a mini history lesson.

According to Amore, the legislation creates a balance "between preventing misinformation and protecting the First Amendment, with allowances for constitutionally protected speech like press coverage, satire, and parody."

Rep. Jon Brien, a co-sponsor of the proposed ban, said the ubiquitous cartoon caricatures of yore were clearly fake. Today's deepfakes are not so easy to spot.

Arguments against the bill

The ACLU of Rhode Island cautioned the state's lawmakers against "trying to quickly regulate this new world of artificial intelligence and its impact on the electoral process."

"In order to ensure that debate on public issues is, in the words of the U.S. Supreme Court, 'uninhibited, robust, and wide-open,' the First Amendment provides special protection to even allegedly false statements about public officials and public figures," said ACLU Rhode Island Director Steve Brown.

"To allow the government to regulate or ban political speech that some might view as misleading undermines the breathing space that robust political speech requires, whether generated with the help of artificial intelligence or not," he warned.

He gave two examples:

  • A political ad that strings together a politician's comments made at different times, which someone could claim deceptively misrepresents the candidate's views.

  • A video of a candidate or elected official giving an actual speech where someone, using AI, replaces the real background of the video with an artificial background depicting hell.

Though the bill contains an exception for "satire" or "parody," Brown noted, the use of AI to make these images or recordings could open a citizen to substantial penalties.

A lobbyist for the Computer & Communications Industry Association (CCIA) suggested the lawmakers tweak the bill to make clear that the "creator" facing potential penalties means the person who "deployed" the fake, not "the provider or developer of any technology used in the creation of synthetic media."

This article originally appeared on The Providence Journal: RI lawmakers want to ban political 'deepfakes.' What would change?