TikTok Revealed New Mental Health Features for Users Amid Recent Instagram Backlash


TikTok is doubling down on its commitment to the emotional well-being of its online community. This week, the viral video app, owned by ByteDance, announced that it will offer new resources to support users potentially struggling with mental health issues, including eating disorders and thoughts of suicide.

The platform announced the rollout of these "well-being guides" in a statement written Tuesday by Tara Wadhwa, TikTok's director of policy in the U.S. "While we don't allow content that promotes, glorifies or normalizes suicide, self-harm or eating disorders, we do support people who choose to share their experiences to raise awareness, help others who might be struggling and find support among our community," reads the statement.

In regard to eating disorders, TikTok now offers teens, caregivers, and educators a "Safety Center guide," which was developed in collaboration with experts, including those at the National Eating Disorders Association. The guide offers information, advice, and support, and is currently available online under the Safety Center on the app. It can also connect you to professional resources based on your geographic location. For instance, if you identify the U.S. as your region, you will be provided a phone number for NEDA, as well as its hours of operation. (Related: How Instagram Is Supporting People with Eating Disorders and Body Image Issues)

The platform is also implementing search intervention, meaning that when someone searches TikTok for phrases such as "#suicide," they will be directed to local support resources such as the Crisis Text Line, which provides text-based mental health support around the clock. Through the Crisis Text Line, they will be able to find support and information about possible treatment options.

TikTok has also provided its community with content from users, "where they share their personal experiences with mental well-being, information on where to seek support, and advice on how to talk to loved ones about these issues," all of which was shared after consulting with independent experts, according to Wadhwa's statement. These videos will appear in search results for certain phrases related to self-harm or suicide, and users can opt in to view the content if they would like.

"We're proud that our platform has become a place where people can share their personal experiences with mental well-being, find community and support each other, and we take very seriously our responsibility to keep TikTok a safe space for these important conversations," said Wadhwa on Tuesday.

The news of TikTok's latest initiative comes on the heels of a scathing report from The Wall Street Journal alleging that Facebook, the social media giant that owns Instagram, has repeatedly found that the app can be especially toxic for teen girls. The report, released Tuesday, cites studies conducted by Facebook over the past several years and details how the company's researchers repeatedly found Instagram to be at the root of many of the participants' mental health struggles, such as body image issues.

"Thirty-two percent of teen girls said that when they felt bad about their bodies, Instagram made them feel worse," shared the researchers in a March 2020 slide presentation posted to Facebook's internal message board, according to the WSJ. Another slide from 2019 read, "We make body image issues worse for one in three teen girls," reported the WSJ. One Facebook presentation is also said to have noted that, among teenagers who have had suicidal thoughts, 6 percent of American users and 13 percent of British users traced the issue to Instagram, according to the WSJ.

While these statistics aren't comforting, they're not exactly new; it's no secret that social media can be hazardous to your mental health, and Instagram, in particular, has been named "the worst" before. Since social media has become an integral part of pop culture — and shows no signs of disappearing — it's worth asking what we can do to make its mental health impacts less detrimental. Kudos to TikTok for taking a step to make its global community a safer space.

If you're struggling with thoughts of suicide or have felt deeply distressed for a period of time, call the National Suicide Prevention Lifeline at 1-800-273-TALK (8255) to speak with someone who will provide free and confidential support 24 hours a day, seven days a week.

And if you or someone you know is at risk of or is experiencing an eating disorder, resources are available online from the National Eating Disorders Association or through the NEDA hotline at 800-931-2237.