The not-so-hidden dark side of child influencers

Photo collage of Shirley Temple as a child, surrounded by phones and ring lights, with Shirley Temple dolls to the side. Above her looms a pair of adult hands holding a toy camera.

In an era of influencers and personal brands built on social media, online fame is more attainable than ever. While platforms like Instagram do not technically allow children under 13 to have their own accounts, some parents step in and run their kids' pages for them, hoping to launch their children's careers as influencers, models or actors.

Unfortunately, this kind of ambition can lead to some bad places, and a recent investigation by The New York Times exposed the seamy side of the world of child influencers on Meta's platforms. What begins as a parent's best effort to jump-start their child's career can "quickly descend into a dark underworld dominated by adult men, many of whom openly admit on other platforms to being sexually attracted to children." The 5,000 mom-run accounts examined by the Times also offered "disturbing insights" into how social media is "reshaping childhood" with "direct parental encouragement and involvement." And if that wasn't alarming enough, there is evidence that Meta has known about this toxic practice for years.

'God bless Instamoms'

The driving force behind child influencers is their parents. The Times discovered that some sell pictures, exclusive chat sessions, and even the "girls' worn leotards" to mostly unknown male followers. Kidfluencers can make up to six figures monthly from subscriptions and other interactions with their fans, and larger followings attract more prominent brands. Instagram's algorithm then rewards them with greater visibility.

An audience demographic firm found 32 million adult male followers among the 5,000 child influencer accounts the Times examined. Another analysis, using image classification software, indicated that suggestive images are the most likely to garner likes and comments. And interacting with a primarily male audience, it seems, opens the door to abuse. Sometimes, the men "flatter, bully and blackmail" girls and their parents to elicit racy pictures. The outlet also monitored exchanges on the messaging app Telegram, where men talk openly about their desire to abuse children they follow on Instagram. "It's like a candy store 😍😍😍," one of them said, per the Times. "God bless instamoms 🙌," said another. Account owners said that when they report explicit images or possible predators to Instagram, they are "typically met with silence or indifference."

Parents are finding it challenging to keep their kids away from this threatening side of Instagram. "I really don’t want my child exploited on the internet," Kaelyn, a mother in Melbourne, Australia, who agreed to be identified only by a middle name to protect the privacy of her child, said to the Times. "But she’s been doing this so long now. Her numbers are so big. What do we do? Just stop it and walk away?"

Meta's responsibility

In response to the Times' investigation, Meta spokesman Andy Stone said that parents are ultimately responsible for the content on their children's Instagram accounts and can delete them at any time. "Anyone on Instagram can control who is able to tag, mention or message them, as well as who can comment on their account," Stone said, pointing to a feature that allows parents to ban comments containing certain words. Still, there is evidence that Meta was well aware of the risks. According to an internal 2020 study cited in legal proceedings, the company found 500,000 child Instagram accounts that have "inappropriate" interactions every day.

In an exclusive story, The Wall Street Journal reported that safety staff at Meta warned higher-ups that adults were abusing new paid subscription tools on Facebook and Instagram to profit from exploiting their children. One year ago, two teams raised alarms in internal reports after discovering that hundreds of what Meta calls "parent-managed minor accounts" were using the feature to sell exclusive content to paying customers. While the images did not involve nudity, Meta staffers found evidence that some parents knowingly produced content for "other adults' sexual gratification." They also found evidence that some parents engaged in "sexual banter" about their kids or made their daughters respond to sexual messages. The reports additionally revealed that Meta's recommendation systems were promoting underage modeling accounts to users suspected of behaving inappropriately toward minors.