Instagram Will Blur Nude Images in DMs in an Effort To Protect Teens … & It’s a Start

Social media companies have been under fire for years, and understandably so, for doing so little to protect the young users on their platforms. But now Instagram has announced new initiatives to make its app and site safer for teens. One main feature: the platform will automatically blur any nude photos sent in direct messages (DMs).

The feature, which will be on by default globally for users ages 18 and under, is currently being tested, per Instagram’s announcement, and is part of the company’s (supposedly) ongoing campaign to fight sexual abuse and scams. Of course, there’s the constant worry that a user will receive an unsolicited nude picture.

And then there’s the very real threat of sexual extortion, or “sextortion.” Per the United States Attorney’s Office for the Western District of Pennsylvania (yes, prosecutions have reached the federal level), sextortion occurs “when an individual, often a child, is threatened or blackmailed, usually online, by a person demanding sexual content (photos/videos) or money from the child against his or her will.” Most sextortion victims are boys between the ages of 14 and 17, and many of those prosecuted are men in their 50s and 60s posing as the teens’ peers.

Not only will Instagram blur images in DMs that include nudity, but the platform will “encourage people to think twice before sending nude images” with a message reminding them to be cautious.

“The feature is designed not only to protect people from seeing unwanted nudity in their DMs, but also to protect them from scammers who may send nude images to trick people into sending their own images in return,” Instagram said.

There will also be options to “unsend” the photos (although the recipient may have already seen them), block the sender, and report the chat.

Instagram says it’s also working on technology to identify possible sextortion accounts.

And all of this is … a start.

Sextortion is a major problem. In January, the FBI warned of a “huge increase” in cases. But there’s so much more that social media platforms could be doing.

For starters: Instagram’s parent company, Meta, also owns Facebook and WhatsApp, yet it has not announced plans to roll out similar features on those platforms.

… Why not? Seems like an easy and obvious way to protect teens.

And the concerns don’t end there. Sextortion aside, platforms need to take major steps to protect teens’ mental health nationwide. Last year, the U.S. Surgeon General issued an advisory on the harms of social media, the American Psychological Association warned of its dangers and recommended new guidelines, and dozens of states sued Meta for the harm it knowingly causes teens with its “addictive” features.

“Children are exposed to harmful content on social media, ranging from violent and sexual content, to bullying and harassment,” Surgeon General Dr. Vivek Murthy said. “And for too many children, social media use is compromising their sleep and valuable in-person time with family and friends. We are in the middle of a national youth mental health crisis, and I am concerned that social media is an important driver of that crisis — one that we must urgently address.”

So while blurring nude photos and nudging teens to think twice is a great idea, it’s just one small step toward making social media a safer space for them.
