The head of Instagram called for the creation of an industry body to develop best practices for protecting youngsters online during his first appearance before Congress, as Big Tech faces blowback from lawmakers over its products' harms to children.
Why it matters: Republicans and Democrats have found common ground in grilling tech companies on how their products harm children, especially after revelations in The Wall Street Journal about Instagram's potential harm to the mental health of teen girls.
Driving the news: In opening remarks before the Senate Commerce consumer protection subcommittee, Instagram head Adam Mosseri highlighted research that found more U.S. teens are using TikTok and YouTube than Instagram as he pressed for industry-wide solutions.
He suggested "an industry body" that would determine best practices on safety concerns, including how to verify users' ages, how to build age-appropriate experiences and how to build parental controls.
"This is an industry-wide challenge that requires industry-wide solutions and industry-wide standards," Mosseri told lawmakers.
Ahead of the hearing, Mosseri announced changes Instagram is making to better protect young users, including launching the Take a Break option for users who have been scrolling for a certain amount of time and building a feature that will nudge teens toward different topics if they've been dwelling on one.
The company also announced that it plans a March launch for tools parents can use to see — and limit — how much time their kids spend on Instagram.
The other side: The Senate subcommittee, led by chairman Richard Blumenthal (D-Conn.) and ranking member Marsha Blackburn (R-Tenn.), has delved into how online platforms can harm kids' mental health as part of a series of hearings this fall.
Blumenthal dismissed the changes Mosseri announced, calling the parental controls the "bare minimum" and saying there are no effective warnings or notices to parents when their children are spiraling into eating disorders, bullying or self-harm.
"Parents and children need more power, more effective tools to protect themselves on the platform," Blumenthal said at the hearing, promising legislative action.
The big picture: Protecting children online is the rare area where lawmakers have successfully passed legislation, albeit decades ago, through the 1998 Children's Online Privacy Protection Act, or COPPA.
That law requires websites to get parents' consent before collecting data about users under 13 years old. Critics say the Federal Trade Commission has failed to aggressively enforce the requirements and argue the rules should apply to teens as well.
"COPPA is still the only commercial online privacy law the United States has," Jeff Chester, executive director of the Center for Digital Democracy, told Axios. "Once you turn 13, in the United States you no longer have any protections."
Yes, but: Some tech companies argue that children are going to use their products anyway, so it's better if they offer services that are meant for children, such as the now-paused Instagram Kids idea.
YouTube, which faced an FTC fine for COPPA violations in 2019, has different services and products for younger users, including YouTube Kids, Made for Kids and Supervised Experiences.
Meanwhile, the Facebook research publicized by whistleblower Frances Haugen and real-world experience have demonstrated the limits of the COPPA rules.
The United Kingdom has developed its "Age Appropriate Design Code," a set of 15 standards that online services must follow to protect the best interests of children online, such as making the highest-privacy setting the default.
Lawmakers including Sen. Ed Markey (D-Mass.), who authored COPPA when he was in the House, have also introduced legislation to update the rules.
Reality check: Observers aren't expecting big revelations from the hearing, but rather for it to add momentum to the push for new legislation.
"The fundamental issues that Instagram and Facebook face relate to their business model," Jim Steyer, founder and CEO of Common Sense, told Axios. "And Adam Mosseri is not going to get up in front of the Senate and promise to change the business model."
The bottom line: "So far Congress has shown they're all bark and no bite when it comes to regulating the online industry, even for kids," the Center for Digital Democracy's Chester told Axios.
"If there's one opportunity for bipartisan legislation, it should be protecting kids and teens online."
Editor’s note: This post has been updated with new details throughout.