AI Deepfakes of True-Crime Victims Are a Waking Nightmare

[Image: true-crime-ai-tiktok — Credit: @touchingstory4u/TikTok]

“Grandma locked me in an oven at 230 degrees when I was just 21 months old,” the cherubic baby with giant blue eyes and a floral headband says in the TikTok video. The baby, who speaks in an adorably childish voice atop the plaintive melody of Dylan Matthew’s “Love Is Gone,” identifies herself as Rody Marie Floyd, a little girl who lived with her mother and grandmother in Mississippi. She recounts that one day she was hungry and wouldn’t stop crying, prompting her grandmother to put her in the oven, leading to her death. “Please follow me so more people know my true story,” the baby says at the end of the video.

The baby in the video is, of course, not real: She’s an AI-generated creation posted on @truestorynow, an account with nearly 50,000 followers that posts videos of real-life crime victims telling their stories. The gruesome story she’s telling is true, but only up to a point. The baby’s name wasn’t Rody Marie but Royalty Marie, and she was found stabbed to death and burned in an oven in her grandmother’s home in Mississippi in 2018; the grandmother, 48-year-old Carolyn Jones, was charged with first-degree murder earlier this year. But Royalty was 20 months old when she died, not 21, and unlike the baby in the TikTok video, she was Black, not white.

Such inaccuracies are par for the course in the grotesque world of AI true-crime-victim TikTok, a subgenre of the massive true-crime fandom, which uses artificial intelligence to essentially resurrect murder victims, many of whom are young children. The videos, some of which have millions of views, feature a victim speaking in the first person about the gruesome details of their death; most of them carry no content warning.

“They’re quite strange and creepy,” says Paul Bleakley, assistant professor in criminal justice at the University of New Haven. “They seem designed to trigger strong emotional reactions, because it’s the surest-fire way to get clicks and likes. It’s uncomfortable to watch, but I think that might be the point.”

Many of the accounts carry a disclaimer stating that the video does not use real photos of victims, as a way to “respect the family,” as Nostalgia Narratives, an account with 175,000 followers that posts true-crime victim AI videos, puts it in its captions. The account tells the stories not only of famous child murder victims like Elisa Izquierdo, a six-year-old girl who was murdered by her abusive mother in 1995, and Star Hobson, a one-year-old murdered by her mother’s girlfriend in 2020, but also of adult murder victims like George Floyd and JFK. None of the accounts that Rolling Stone reached out to responded to requests for comment, but the fact that they change the victims’ appearance is likely due to TikTok community guidelines banning deepfake depictions of private individuals or young people, a policy the platform instituted in March. (A spokesperson for TikTok confirmed to Rolling Stone that @truestorynow had been removed for violating community guidelines.)

The proliferation of these AI true-crime victim videos on TikTok is the latest ethical question raised by the immense popularity of the true-crime genre. Though documentaries like The Jinx and Making a Murderer and podcasts like Crime Junkie and My Favorite Murder have garnered enormous cult followings, many critics of the genre have questioned the ethics of audiences consuming real-life stories of horrific assaults and murders as pure entertainment, with the rise of armchair sleuths and true-crime obsessives potentially retraumatizing victims’ loved ones.

That concern applies doubly to videos like the one featuring Royalty, which tell a victim’s story from their perspective and use their name, presumably without the family’s consent, to incredibly creepy effect. “Something like this has real potential to revictimize people who have been victimized before,” says Bleakley. “Imagine being the parent or relative of one of these kids in these AI videos. You go online, and in this strange, high-pitched voice, here’s an AI image [based on] your deceased child, going into very gory detail about what happened to them.”

There are also potentially thorny legal issues associated with creating deepfake videos, Bleakley adds, likening the rise of AI true-crime videos to the popularity of deepfake porn. Though there is no federal law making nonconsensual deepfake images and videos illegal, both Virginia and California have banned deepfake pornography, and earlier this month Congressman Joe Morelle proposed legislation that would make disseminating such images both a crime and a civil liability.

Deepfake true-crime videos are different from deepfake porn for obvious reasons, yet Bleakley could see grieving families wanting to pursue civil litigation against the creators of such videos, particularly if the videos are being monetized, though he notes it would be difficult for families to argue defamation because the subjects are deceased. “It’s a very sticky, murky gray area,” he says.

One thing is clear, however: With AI technology evolving rapidly and little to no regulation in place to curb its spread, the question is not whether videos like these will become more popular, but how much worse the marriage of true crime and AI will get. One can easily imagine true-crime creators re-creating not only the voices of murder victims, but the gory details of the crimes themselves. “This is always the question with any new technological development,” says Bleakley. “Where is it going to stop?”

Update Thursday, May 31, 2023, 4:15 p.m.: This story has been updated with more information clarifying why @truestorynow was banned from TikTok.
