The World Thought This Cheer Mom Created a Deepfake to Harass Her Daughter’s Rival—but the Real Story Is Way More Confusing (and Bizarre)

Doylestown, Pennsylvania—population 8,300—is just an hour north of downtown Philadelphia, close enough to share Philly’s accent and an occasional FM radio signal. But it’s otherwise a different universe. Country-road billboards market deer-proof plants; a sign advertising a “Goats N Hoes” farm may or may not be ironic. Just east of the airport, tucked away on the outskirts of town, the training facility of the Victory Vipers All-Stars—a local competitive cheer squad—is situated in a vinyl-sided industrial warehouse next to an auto parts store.

On a typical weekday evening, a steady stream of minivans and SUVs cycles in and out of the parking lot, lining up to deposit or collect ponytailed, water-bottle-toting athletes in T-shirts and cheer shorts. Team sports are life here in Bucks County, where one player’s town might be a half-hour drive from another’s. Your squad becomes your world.

Last year, before the pandemic temporarily closed the gym, Madi Hime was a 16-year-old who had recently joined the Victory Vipers. She fit right in with the other cheerleaders, documenting their lip syncs and dance routines on Instagram and TikTok (where Madi has almost 100,000 followers). Even during the competition off-season, the Vipers spent a lot of time together: Cheer training happens year-round, typically three days per week. Athletes need to maintain peak conditioning to pull off precise aerial stunts, and they need to work as a group. Any false move or miscommunication can cost points in a high-stakes competition. It can also mean a sprained ankle, a fractured wrist, or worse.

Madi was both a “flyer” and a “base,” meaning she alternated between soaring through the air and helping support others who did. Her teammate Allie Spone was a full-time flyer—a more exalted designation on the cheer scene. But although intra-team competition could be intense, the girls got along just fine, says Allie. She was even at the sleepover where the video that would soon make national headlines was allegedly filmed.

It was a particularly tense period—July 2020, when the gym was struggling to stay open, stave off COVID-19, and hold team tryouts. And then one of the Vipers coaches got an anonymous text with a video attached. The footage was grainy and lo-res, but the subject was unmistakable: Madi Hime, petite and blonde, nonchalantly pulling from a blue vape pen and releasing a controlled plume of artificially flavored vapor into the air. The angle was too close to make out where she was, or with whom, but she seemed to know she was being filmed. Her eyes flitted toward the camera and she laughed, looking casual, relaxed, a little defiant.

The vaping video was a clear violation of the Victory Vipers All-Stars’ code of conduct. But when Madi was confronted, she denied that the video was of her. “I went in the car and started crying,” she later recalled on Good Morning America. “I was like, ‘That’s not me on video.’ I thought if I said it that no one would believe me, because obviously there’s proof. It’s a video. But the video was obviously manipulated.”

Madi’s mother, Jennifer Hime, believed her and reported the video to local police. She added that her daughter had been receiving threatening messages from an unknown number for the previous month—including ones saying things like “you should kill yourself”—as well as photos of herself seemingly doctored to make her appear nude and drinking alcohol. Two other squad moms also reported receiving anonymous messages: One got photos of her daughter in a bikini with captions written over the images regarding “toxic traits, revenge, dating boys, and smoking,” per the criminal complaint. Another got a text saying her daughter was “drinking at the shore, smokes pot, and uses ‘AttentionWh0re69’ as a screen name.” Police traced the texts to the messaging platform Pinger, and the trail led to a single, unexpected source: Allie’s mom, Raffaela Spone.

Raffaela was eventually arrested on six counts of misdemeanor harassment and cyber harassment of a child. But that’s not what spun the case into an international scandal. In the criminal complaint, Hilltown Township police officer Matthew Reiss declared that the video of Madi vaping had the hallmarks of a “deepfake,” “where a still image can be mapped onto an existing video and alter the appearance of the person in the video to show the likeness of the victim’s image instead.” In his telling, he’d arrested a middle-aged suburban mom for wielding the power of advanced AI technology against her daughter’s competition, creating a fake video so uncannily convincing that it could have gotten Madi kicked off the team.

For a small-town police department—and a district attorney up for reelection—the idea was far too tantalizing to keep quiet. During a press conference in March of this year, Bucks County DA Matt Weintraub announced the allegation to the world, proclaiming, “This tech is now available to anyone with a smartphone. Your neighbor down the street, somebody who holds a grudge, we just have no way of knowing.” It’s also “another way for an adult to now prey on children,” he added.

It all seemed a little Drop Dead Gorgeous—the ’90s cult classic about a mom on a beauty pageant murder spree—but make it cheer and updated for the digital age. Because who needs the threat of physical violence when it’s possible to ruin someone’s life using nothing but social media photos and a phone? The Himes appeared on the Today show and GMA, where their ordeal was framed as a warning to parents everywhere. Raffaela’s mug shot ran on websites across the internet.

It took just two months for the DA’s case to completely unravel.

Even if you’ve never heard of a deepfake before, you’ve seen the technology behind it in action. Think: apps like Face Swap Live and Reface that transpose your friend’s face onto Ariana Grande’s body to hilarious effect. Or the uncanny series of “Tom Cruise” videos that took TikTok by storm earlier this year, merging publicly available images of the actor’s face with footage of a professional Tom Cruise impersonator close-talking to the camera and teeing up a golf swing.

It’s all so new that the term didn’t even exist until 2017, when a Reddit user coined it to describe mostly crude, mostly pornographic videos that superimposed celebrity faces onto adult performers’ bodies. The images have gotten more realistic as the technology has gotten smarter—but contrary to Weintraub’s claims, making a truly convincing deepfake right now isn’t as simple as watching a YouTube tutorial and photoshopping a few pics. In reality, you need specialized AI software whose machine-learning models train on hundreds, even thousands, of pictures of somebody’s face (which is significantly easier for celebrities, who have pages and pages of Google Images search results, than for the average person). The program then builds a digital model of that face and can replicate its distinct movements: the curling of a lip, the arching of an eyebrow.
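To make the mechanics a little more concrete, here is a minimal, illustrative PyTorch sketch of the shared-encoder, two-decoder autoencoder design behind early consumer face-swap tools. Everything in it is a placeholder: the network is tiny, the “training photos” are random tensors, and a convincing result would actually require thousands of aligned face crops, far larger models, extra tricks like GAN losses and blending, and days of compute.

```python
# Illustrative sketch only: the classic face-swap idea of one shared encoder
# plus one decoder per identity. Not a working production pipeline.
import torch
import torch.nn as nn

IMG = 64  # tiny face crops, for illustration

def conv_block(c_in, c_out):
    return nn.Sequential(nn.Conv2d(c_in, c_out, 4, stride=2, padding=1), nn.ReLU())

def deconv_block(c_in, c_out):
    return nn.Sequential(nn.ConvTranspose2d(c_in, c_out, 4, stride=2, padding=1), nn.ReLU())

# One encoder learns a compact representation of "a face in general"...
encoder = nn.Sequential(conv_block(3, 32), conv_block(32, 64), conv_block(64, 128))

# ...while each identity gets its own decoder that reconstructs that person's face.
def make_decoder():
    return nn.Sequential(deconv_block(128, 64), deconv_block(64, 32),
                         nn.ConvTranspose2d(32, 3, 4, stride=2, padding=1), nn.Sigmoid())

decoder_a, decoder_b = make_decoder(), make_decoder()

opt = torch.optim.Adam(list(encoder.parameters()) +
                       list(decoder_a.parameters()) +
                       list(decoder_b.parameters()), lr=1e-4)
loss_fn = nn.L1Loss()

# Random stand-ins for the hundreds or thousands of aligned face crops the
# real process needs for each person (purely so the sketch runs).
faces_a = torch.rand(8, 3, IMG, IMG)
faces_b = torch.rand(8, 3, IMG, IMG)

for step in range(100):  # real training runs for many hours or days
    loss = loss_fn(decoder_a(encoder(faces_a)), faces_a) + \
           loss_fn(decoder_b(encoder(faces_b)), faces_b)
    opt.zero_grad()
    loss.backward()
    opt.step()

# The "swap": encode person B's face, then reconstruct it with person A's
# decoder, producing B's pose and expression rendered with A's likeness.
with torch.no_grad():
    swapped = decoder_a(encoder(faces_b))
```

Even this toy version hints at why the “anyone with a smartphone” framing was an overreach: the code is the easy part; the data, tuning, and compute behind a believable swap are not.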

In 2021, it’s “very, very unlikely” that, say, a cheer parent with no tech expertise would be able to wield this technology to produce an authentically convincing deepfake, says Henry Ajder, a leading researcher and policy adviser on deepfakes, disinformation, and media manipulation. He estimates that the process would require sophisticated equipment and take even the most highly skilled artist months. And he saw plenty in the Madi video that argued against a fake: the awkward angles, the smoke, the vape pen hovering over her face—these would be borderline impossible to synthesize, he says, period.

“I’m not saying that this mom didn’t do anything, but to me, it’s a fact that this video is real and that this girl was just denying that she was in this video,” says Chris Umé, the Belgian visual effects and AI artist behind the Tom Cruise clips. After the video went viral, Ajder took to Twitter to call it a fake deepfake, and many of the world’s preeminent experts, including Umé, agreed. Ajder is especially critical of media outlets for running with the story without bothering to speak to any experts, let alone ask police how they assessed the footage. According to him, there are only a handful of people in the U.S. capable of properly vetting a deepfake, “using specific computational forensic techniques, going through it frame by frame to comb for clues to be able to say with authority if it is real or not.” When the Daily Dot eventually asked Reiss, the police officer, whether he ran a metadata analysis on the video, he admitted that he’d simply made a “naked eye” judgment call.
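For a sense of what even a cursory check looks like, here is a minimal Python sketch that dumps a clip’s container and stream metadata using ffprobe (part of FFmpeg, which it assumes is installed). Metadata alone can’t prove a video is authentic or fake, and real forensic vetting goes much deeper, but inconsistent encoder tags, timestamps, or signs of re-encoding are an obvious first place to look before calling anything a deepfake. The file path in the usage comment is just a stand-in.

```python
# Minimal sketch of a first-pass metadata check on a suspect video file.
# Requires ffprobe (bundled with FFmpeg) to be installed and on the PATH.
import json
import subprocess
import sys

def probe(path: str) -> dict:
    """Return ffprobe's JSON description of the file's format and streams."""
    out = subprocess.run(
        ["ffprobe", "-v", "quiet", "-print_format", "json",
         "-show_format", "-show_streams", path],
        capture_output=True, text=True, check=True,
    ).stdout
    return json.loads(out)

if __name__ == "__main__":
    info = probe(sys.argv[1])  # e.g. python probe.py suspect_clip.mp4
    fmt = info.get("format", {})
    print("container:", fmt.get("format_name"))
    print("duration (s):", fmt.get("duration"))
    print("tags:", fmt.get("tags", {}))  # creation_time, encoder, device hints
    for stream in info.get("streams", []):
        print(stream.get("codec_type"), stream.get("codec_name"),
              stream.get("width"), stream.get("height"))
```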

The sloppy police work didn’t end there: The criminal complaint against Raffaela details threatening texts she allegedly sent to Madi—but the department has never provided evidence of their existence to Raffaela’s legal counsel or to the public. At a hearing in July, a digital-forensics expert who’d made a complete copy of Raffaela’s confiscated phone testified that it contained no deepfakes, pictures of any of the girls, texts to the girls, or apps used to create deepfakes or nudes. (A Hilltown Township Police detective admitted on the stand that he had never even bothered to look at Madi’s phone.) It also came to light that one of the supposed “harassing” texts received by Madi (which contained the words “bark bark ruff ruff” and poop emojis) had actually been traced to…a 17-year-old boy named Ethan.

Unrelated but even more disturbing: Early in the proceedings, Reiss was arrested on multiple felony counts of possession of child pornography; he resigned from the department, but his flawed report remains on record. This spring, after the “cheerleading deepfake” story had already gone global, the Bucks County DA’s office quietly dropped the deepfake accusation while continuing to pursue the less sensational harassment charges against Raffaela. And in a dramatic turnaround, both the DA and the police department have gone silent, declining to answer questions about the case. (When contacted for this story, the DA’s office responded only with a spokesperson’s denial that media manipulation was the basis for the charges.) After initially taking their story to morning shows, Madi and Jennifer have also stopped talking and did not respond to multiple requests for comment on this story.

With the deepfake allegations off the table, it’s unlikely that the vaping video will ever have its day in court. But experts maintain that the clip would have been nearly impossible for Raffaela to fake, which leaves us, in all likelihood, here: A teen—like practically any other teen in history—broke the rules, got caught, didn’t want to get in trouble, and so denied it. The real trouble came when a local police department was so eager to believe the innocence of a blonde-haired, blue-eyed cheerleader that it blinded itself to the facts.

No one in this disastrously mishandled saga comes out looking particularly virtuous: just idiotic law enforcement, a gullible press and public, a mom who (allegedly) sent tattling, malicious messages, and a couple of teens who did not ask for any of this but will now have their names attached to a notorious case and a possible lifetime of unfortunate SEO. The actually real—and terrifying—thing? In the future, those teens could be any of us.

Experts predict that within five or six years, as deepfake tech becomes more user-friendly, even amateurs will be able to create a decent counterfeit video. And as the cheer case foreshadows, these videos won’t even have to be that good to implicate someone in a crime or exonerate them of one—because police and local prosecutors aren’t trained to tell the difference.

This phenomenon has been dubbed the “liar’s dividend,” the idea that the very existence of deepfakes can be weaponized to undermine the truth. As Ajder explains, “Deepfakes don’t just make things look real. They also provide a way to dismiss real things as fake.” And the more mainstream the concept becomes, the bigger that risk. “This is something we’re going to face a lot in the future,” Umé adds, “people denying that they were being shown on video, even if there’s video proof.” Which is not to say that Madi did anything especially shocking by denying she had vaped on video. But it is to suggest that what happened to Madi points to a scary future in which all digital evidence is essentially suspect—and the only truth is the one each of us (or the internet) chooses to believe.

Insidious actors are already using this to their advantage. After the video of George Floyd’s murder by police officer Derek Chauvin went viral in May 2020, Winnie Heartstrong, a former Republican congressional candidate from Missouri, cooked up a deepfake conspiracy to discredit the footage, claiming that Floyd had actually died years earlier and that his face had been mapped onto the body of a former NBA player to stir up racial tensions. The lie didn’t stick, possibly because the concept of deepfakes was still relatively unfamiliar. But that’s changing fast—along with the technology itself—which will only fuel confusion and blur lines with potentially catastrophic results, especially when it comes to legal cases involving things like police brutality, revenge porn, and even murder.

“All it takes is for one or two high-profile cases involving deepfakes to go awry for other cases to be tainted,” says New York attorney Annie Seifullah, whose work focuses on bringing justice to victims of harassment and abuse. “Once we have this doubt, this poison that’s been poured in our ear that it could be manipulated, then we really are in another level of the post-truth world.” That law enforcement is woefully unprepared for this new world cannot be overstated. “The systems and the people who are meant to protect us against false accusations or inspect evidence to see if it’s real are not equipped to do this,” adds Seifullah. “It’s just so scary to think about.”

In the end, Madi left the Victory Vipers. She now cheers at a gym nearly an hour and a half away, across the Delaware River in a cushier enclave of suburban New Jersey.

Allie quit cheerleading altogether. She says she misses it sometimes, but after everything that’s happened, she can’t imagine going back. (The two girls haven’t communicated since last year.) The harassment case against her mother drags on. As of mid-August, Raffaela’s counsel was awaiting a judge’s decision on the future of the case. It could go to trial or be dropped completely. The cloud of doubt, though, will linger.

Raffaela herself denies any wrongdoing and declined to comment for this story. But her attorney did offer this: If Raffaela had been guilty, if she were truly going to doctor a video to make it look incriminating, why make it about vaping?

Why not something worse?
