What Meta Says They’re Doing To Help Keep Teens Safe On Social Media

Amid concerns from parents, legislators, and others that heavy social media use is contributing to mental health issues in teens and children, and that social media companies aren’t doing enough to keep them safe, one company says it is developing parental controls, using age verification technology to help teens have age-appropriate experiences, and making potentially sensitive content more difficult to find.

In a statement to Dr. Phil, Antigone Davis, Global Head of Safety at Meta, said: “We want teens to be safe online. We’ve developed more than 30 tools to support teens and families, including supervision tools that let parents limit the amount of time their teens spend on Instagram and age verification technology that helps teens have age-appropriate experiences. We automatically set teens’ accounts to private when they join Instagram, and we send notifications encouraging them to take regular breaks. We don’t allow content that promotes suicide, self-harm, or eating disorders, and of the content we remove or take action on, we identify over 99% of it before it’s reported to us. We’ll continue to work closely with experts, policymakers, and parents on these important issues.”

This episode of Dr. Phil, “Triggered: Is Social Media Pushing Girls to Self-Harm?” airs Tuesday. Check your local listings for airtimes.