A bot on the messaging app Telegram has been used to create more than 100,000 deepfake nude photos by virtually “undressing” images of clothed women, according to a new report.
The women are real — and the photographs often come from their social media profiles.
“Having a social media account with public photos is enough for anyone to become a target,” Giorgio Patrini, chief executive officer and chief scientist at Sensity, told the BBC.
Sensity is an intelligence company that tracks and exposes deepfakes and other forms of “malicious visual media,” according to its website. In a 12-page report published this month, the company outlined how a new “deepfake ecosystem” evolved on Telegram based on an “AI-powered bot that allows users to photo-realistically ‘strip naked’ clothed images of women.”
The bot is free to use on smartphones and computers and is easily accessible via Telegram, an instant messaging app launched in 2013 that promises secure messaging and offers end-to-end encryption in its optional “secret chats.” Telegram is banned in Russia, China and Iran.
According to the report’s findings, “stripped” pictures of 104,862 women had been shared on the app by the end of July.
How it works
The “DeepNude” software likely being used by the bot on Telegram is able to generate a “realistic approximation of (women’s) intimate body parts,” Sensity found.
Users can upload a photograph to Telegram in the same way they might send pictures on Facebook Messenger or another instant messaging app, according to the report. After a few minutes of processing, the app returns a “stripped” image, which can be downloaded or forwarded.
The process only works on photographs of women, Sensity said.
Users also have the option to pay a small fee to remove watermarks and skip ahead of free users waiting for their images to generate, the report states.
Who is using it
Sensity believes upward of 100,000 people are using the bot on various Telegram channels based on the number of subscribers. At least 45,615 unique members are using just the “central hub” channel — a figure that climbed significantly between June 19 and July 19, the report shows.
A poll of the bot’s users indicated the vast majority were from Russia, Ukraine, Belarus and other former Soviet countries. But about 3% said they were from the U.S., the U.K., Canada or Australia. At least 6% were from Spain and Latin America, and 8% said they were from “nowhere.”
Many new users were drawn in by advertisements on VK, the “largest social media platform in Russia,” Sensity said.
In a statement Wednesday, a VK spokesperson told McClatchy News the company had not received the report or any warning from Sensity regarding the deepfake bot.
“VK doesn’t tolerate such content or links on the platform and blocks communities that distribute them,” the statement reads. “We will run an additional check and block inappropriate content and communities. Also, please note that such communities or links were not promoted using VK advertising tools.”
Who is being targeted
Despite a primarily Russian user base, Sensity said the women whose images are being used come from all over the world — including Argentina, Italy, Russia and the U.S.
While deepfakes have often been used on celebrities and politicians, Sensity’s poll of the Telegram bot’s users found 63% were primarily “interested to undress” women they “know in real life.” Most of the images the company found “appeared to be taken from social media pages or directly from private communication, with the individuals likely unaware that they had been targeted.”
Patrini told BuzzFeed News the bot just needs one image, “which is really a reason why so many private individuals are attacked, because only one profile picture from Facebook is enough to do this.”
The bot has also been used to share child pornography with fake nude photographs of underage girls, Sensity’s report found.
The intelligence company speculated that people have used the images for “public shaming or extortion-based attacks” — sharing the deepfake nude photo on social media, sending it to a victim’s relatives and friends, or threatening to publish the picture unless they are paid.
Nina Jankowicz, author of “How to Lose the Information War,” told BuzzFeed the technology is disturbingly accessible.
“Essentially, these deepfakes are either being used in order to fulfill some sick fantasy of a spurned lover, or a boyfriend, or just a total creepster,” she said, according to BuzzFeed. “Or they’re used as potential blackmail material.”