Swifties Want a Massive Crackdown on AI-Generated Nudes. They Won’t Get One

Taylor Swift performs onstage during the Eras Tour. - Credit: Taylor Hill/TAS23/Getty Images/TAS Rights Management

As sexually graphic AI-generated images of Taylor Swift flooded X (formerly Twitter) yesterday, causing the phrase “Taylor Swift AI” to trend on the platform, the singer’s fans grew apoplectic. They fumed that the site (whose content moderation team has been all but entirely dissolved by owner Elon Musk) wasn’t acting quickly enough to take down the posts, which violated X’s community guidelines, or ban the accounts responsible. One of the most viral tweets stayed up for 17 hours and drew some 45 million views.

In the end, the Swifties were able to get much of the AI smut removed through mass-reporting and to overwhelm the rest with a furious avalanche of condemnation. But for this warlike fandom, it wasn’t enough, and some expressed hope that the singer’s misfortune would set the stage for a wider crackdown on nonconsensual and invasive AI porn. “The only ‘silver lining’ about it happening to Taylor Swift,” wrote one influencer, “is she likely has enough power to get legislation passed to eliminate it.” Many agreed that such images should be illegal. (A representative for Swift did not respond to a request for comment.)

The story also attracted the attention of a few lawmakers: Sen. Martin Heinrich of New Mexico tweeted that it was an example of “precisely the risk we’re facing with unregulated AI,” adding that Congress needs to act on the issue. Rep. Tom Kean of New Jersey touted his proposed AI Labeling Act, a bill that would require clear indications and disclosures for material that is AI-generated, as part of a regulatory solution.

Swift’s superstardom, signs of Congressional support, and a highly motivated stan army would seem to promise powerful momentum for any attempt to eradicate these nonconsensual AI nudes. But that crusade will come up against a thorny and forbidding set of complications, according to civil liberties experts — no matter how fired up the Swifties are.

“They’re a huge force, and they advocated,” says Katharine Trendacosta, director of policy and advocacy at the Electronic Frontier Foundation, a nonprofit focused on internet users’ privacy and free expression. “But they did that after Ticketmaster, and we somehow still have Ticketmaster,” she adds, referring to Swifties savaging the company as a price-gouging monopoly (and in some cases even filing lawsuits) due to its mishandling of ticket sales for Swift’s Eras Tour. In the AI fight, too, Trendacosta says, we’ll see “the unstoppable movement of the Swifties versus the immovable object that is the legislature,” a Congress slow to respond to “basically anything.”

“The problems with the internet are always problems of scale and exposure,” Trendacosta says, noting that explicit imagery of celebrities is nothing new: from drawing them naked to Photoshopping their faces onto nude bodies to more sophisticated deepfake videos, the famous have long been vulnerable to our darkest imaginings. The difference today, she explains, is a matter of “how fast and how much” we’re seeing, with AI software allowing relatively few people to churn out an astonishing volume of content for a vast, hyperconnected audience. A 404 Media investigation found that the Swift pictures seem to have been leaked from a Telegram group using the Microsoft app Designer to craft abusive images of real women, their handiwork then spreading across social media and celebrity nude sites.

Reform and government oversight, however, are difficult, Trendacosta says, not least because legislators’ ideas for combating deceptive AI have often been backwards. The EFF, for instance, opposes the No Artificial Intelligence Fake Replicas And Unauthorized Duplications (No AI FRAUD) Act, introduced by Reps. María Elvira Salazar of Florida and Madeleine Dean of Pennsylvania earlier this month. Why? Because in seeking to guarantee “individual property rights in likeness and voice,” the proposed law would broaden publicity rights — that is, your right not to have a company falsely claim you endorse its product — to cover any kind of digital representation, “from pictures of your kid, to recordings of political events, to docudramas, parodies, political cartoons, and more,” as the EFF notes in a statement on the bill. Other critics have also warned of the chilling effect this would have on digital free speech. Under the bill’s expansive language, sharing a Saturday Night Live clip that features an impression of Swift could potentially be a criminal offense.

“I know several legislators are attempting to either write new bills, or adjust existing laws around revenge porn to prosecute it, but much of this is incredibly new,” says Mike Stabile of the Free Speech Coalition, the trade association of the U.S. adult entertainment industry. “As detestable as [nonconsensual AI porn] might be, it’s still technically speech, and efforts to curtail it or ban it may face hurdles in the courts.”

“In the short term, platforms are the best tool in blocking widespread distribution,” Stabile says, adding that adult sites including Pornhub and Clips4sale “have been ahead of the pack on this, and banned deepfakes and revenge porn years ago.” Of course, these rules depend on enforcement — and that, according to Trendacosta, can be an insurmountable task in itself.

“The problem we often see with the largest companies, like on Facebook or on Google or even Twitter, which isn’t even that big, is that the enforcement is really selective because they have so much content,” she says. “It’s actually just impossible.” Incidents like the sudden proliferation of AI-spawned illustrations of Swift in sexual scenes will draw the most focus and garner a relatively quick response, whereas “the already victimized or marginalized” receive little help, if any, Trendacosta says. The outcry over Swift’s admittedly terrible situation has far outstripped, for example, concern for kids whose pictures are fed into AI models to create child sex abuse material.

Plus, Trendacosta points out, there are practical limits to the engineering side of the equation. People want to believe that “if the problem is the technology then the technician should be able to fix it by building a new technology,” she says, but this doesn’t get to the systemic roots of the problem. The Microsoft software used to create pornographic images of Swift has guardrails meant to prevent exactly this kind of misuse; bad actors found ways around them. Neither can we completely rely on filtering tech to catch platform violations. “Machines don’t understand context,” Trendacosta says. “If I draw a politician semi-nude to make fun of him, that’s protected political speech. Machines don’t know that.”

So while it’s easy to establish a general consensus that it’s wrong to disseminate AI porn that victimizes a pop star, the question of how we could prevent it while guaranteeing the same protections for average citizens — and preserving First Amendment rights — is very much unsettled. On the one hand, our technologies and the human teams behind them aren’t up to the task. On the other, government overcorrection might leave us with heavily restricted social networks that close off legitimate forms of commentary.

Which is not to doubt the tenacity of the Swifties eager to tackle the scourge of AI nudes on Swift’s behalf. It just means that we won’t see a seismic shift in the immediate future, and that for all Swift’s influence on the culture, some things remain beyond her control.