When we talk about deepfakes, the term for a type of digitally manipulated video, most of the discussion focuses on the implications of deepfake technology for spreading fake news and potentially even destabilizing elections, particularly the upcoming U.S. 2020 election. A new study from Deeptrace Labs, a cybersecurity company that detects and monitors deepfakes, however, suggests that the biggest threat posed by deepfakes has little to do with politics at all, and that women all over the world may be at risk.
According to the study, which was released Monday, the vast majority of deepfakes on the internet — nearly 96% — are used in nonconsensual porn, meaning that they feature the likenesses of female subjects without their consent. Additionally, the study sheds light on who, exactly, is most often being featured in such content. Although the largest share of pornographic deepfake subjects (41%) are British or American actresses, nearly one-quarter (25%) of the women targeted are of South Korean descent and are classified by the researchers as South Korean musicians or K-pop singers. (The researchers chose not to name the individuals most often targeted by pornographic deepfakes out of concern for their privacy.)
The overrepresentation of K-pop musicians speaks to the increasingly “global” reach of deepfakes, says Henry Ajder, head of research analysis for Deeptrace Labs. Indeed, in a Twitter thread in July, Deeptrace CEO Giorgio Patrini said that K-pop deepfakes have been an “early trend” in AI, and are most often though not exclusively used in pornographic deepfakes.
K-pop stars are likely so overrepresented due to the explosive global popularity of K-pop in general, with estimates suggesting that the rise of bands like BTS and Blackpink has turned it into a more than $5 billion global industry; the fact that pornography is illegal in South Korea, with nearly all online pornography websites currently blocked by the government, also probably plays a role.
Interestingly, Ajder says, the data shows that the majority of users in the online forums generating deepfakes aren’t from South Korea, but China, which plays host to one of the biggest K-pop markets in the world. This is in spite of diplomatic relations between the two countries being strained in recent years, with major Korean artists being unable to perform in China since 2016.
It could be argued that the unique form of sexualization to which female K-pop musicians are subjected — while many are not allowed to date or speak openly about their sex lives, at least one study has shown that they are sexually objectified far more often than their male counterparts — may be contributing to why they are disproportionately being portrayed in deepfakes. “I do wonder if the creation of deepfake images of K-pop idols are done by their anti-fans,” says Hye Jin Lee, PhD, clinical assistant professor at the Annenberg School for Communication and Journalism at the University of Southern California, whose academic interests include K-pop and global culture. “Considering that K-pop is all about image (particularly for female K-pop idols whose squeaky-clean image is a must to maintain their reputation), nothing would bring greater satisfaction to [male] anti-fans …than tarnishing the reputation of the K-pop idols and humiliating them in the process.”
Deepfakes in general are still relatively difficult to make, requiring a certain level of coding proficiency and fairly high-grade computer hardware, says Ajder. Yet the rise of businesses and services catering to those interested in deepfakes — essentially, by allowing users to submit an image of a person, then generating a video with the person’s head on a pornographic actress’s body — has helped to “increase accessibility” of deepfake technology for those making nonconsensual porn, says Ajder.
“Deepfakes started off as synonymous with deepfake pornography,” Ajder tells Rolling Stone. “The dialogue has certainly changed to include a lot more things: cybercrime, synthetic impersonation for things like fraud and hacking. The conversation has diversified and I think rightly so … But [deepfake] porn is still the most impactful and damaging area that we can tangibly measure.”
This, ultimately, is the major takeaway from the Deeptrace Labs study: Despite our fears of our political processes being undermined by this new and terrifying technology (and despite federal and state legislation increasingly being introduced to combat this threat), it is still used more often to humiliate and subjugate women than for any other purpose.
“We recognize there is significant potential [for deepfakes] to cause political disruption and endanger the political processes,” says Ajder, adding that the Deeptrace study cites numerous examples in other countries like Gabon and Malaysia where the mere question as to whether video footage was digitally manipulated threw national political discourse into tumult. But the data makes clear that “deepfakes are already harming thousands of women online. This is hurting people in a different way,” says Ajder.
Update Tues., Oct. 8, 2019, 12:01 p.m.: This story has been updated to include comment from Hye Jin Lee, PhD.