Fake explicit Taylor Swift photos have politicians sounding off. But will AI laws actually change?

  • Sexually explicit AI-generated images of Taylor Swift went viral on X and Telegram this week.

  • The incident has sparked renewed calls for legislation to combat the threats posed by AI and deepfakes.

  • Lawmakers introduced two such bills just last year, but both have been largely ignored.

Taylor Swift is the latest celebrity to fall victim to artificial-intelligence misuse, with her likeness being used in a series of sexually explicit posts that went viral on X and Telegram this week.

It's sparking renewed calls for legislation to combat the threats AI and deepfakes pose.

Pornographic images of the pop star began circulating on X on Wednesday. Many portrayed Swift nude and engaging in sexual acts in football stadiums, seemingly in reference to her recent appearances at NFL games.

The Verge reported that one post, uploaded by a user who'd paid for X's blue check, attracted more than 45 million views and 24,000 reposts before X's moderators took it down 17 hours later.

The mass proliferation of the images has prompted discussions about the increasingly alarming spread of AI-generated content and misinformation online, with many politicians arguing that it's high time to introduce a federal law to tackle the issue.

According to NBC News, lawmakers in at least 14 states have introduced legislation since the beginning of the year to combat the election-related threats posed by AI and deepfakes. The push seems particularly timely: New Hampshire voters received bogus robocalls this week claiming to be from US President Joe Biden.

At the federal level, Democratic Rep. Joseph Morelle of New York has proposed a bill that would specifically criminalize the nonconsensual sharing of sexually explicit, digitally altered material across the US.

In May 2023, Morelle introduced the Preventing Deepfakes of Intimate Images Act, which seeks to make it illegal to share deepfake pornography without consent. If passed, it would also allow victims to sue the creators and distributors of such material while maintaining anonymity.

The bill was referred to the House Judiciary Committee, but no further action has been taken in the last eight months.

Now, Morelle is just one of the voices calling for urgent action.

Posting on X on Thursday, he wrote: "The spread of AI-generated explicit images of Taylor Swift is appalling — and sadly, it's happening to women everywhere, every day."

Rep. Tom Kean Jr. of New Jersey — who became the first Republican cosponsor of Morelle's bill in November after an incident affecting a high schooler in his hometown — said it is "clear that AI technology is advancing faster than the necessary guardrails."

He added: "Whether the victim is @taylorswift13 or any young person across our country — we need to establish safeguards to combat this alarming trend."

Others are speaking up, too, including Democratic Rep. Yvette D. Clarke of New York, who has attempted to push through a separate Deepfakes Accountability Act that would impose regulations around the creation of AI-generated content. That bill also hasn't made it past the first hurdle.

"What's happened to Taylor Swift is nothing new," she posted on X, noting that AI-generated nudes have victimized women for years and "advancements in AI, creating deepfakes is easier & cheaper."

Swift herself has not reacted publicly to the images, and a representative for the singer did not respond to a request for comment on the situation from Business Insider. The Daily Mail has reported that her team is considering legal action against the site that created the AI-generated images.

Even without commenting, Swift's position as one of the most famous women in the world has perhaps done enough to shine a light on this particularly nefarious — and increasingly common — form of sexual harassment.

According to the 2023 State of Deepfakes report, over 95,000 deepfake videos were posted online in 2023, a 550% increase over 2019.

The report also found that deepfake pornography makes up about 98% of all deepfake videos online, and women are disproportionately targeted.

Due to the lack of existing legislation, many women who have been in Swift's position have been left with little recourse after reporting the creators and distributors of their own deepfakes.

Now, with Swift as its latest victim, deepfake pornography may finally face real consequences.
