EU Investigates Meta Over Addictive Social Media Effects on Children

LONDON — European Union regulators Thursday opened investigations into American tech giant Meta for the potentially addictive effects Instagram and Facebook have on children, an action with far-reaching implications because it cuts to the core of how the company’s products are designed.

Meta’s products may “exploit the weaknesses and inexperience of minors” to create behavioral dependencies that threaten their mental well-being, the European Commission, the executive branch of the 27-member bloc, said in a statement. EU regulators could ultimately fine Meta up to 6% of its global revenue, which was $135 billion last year, as well as force other product changes.

The investigations are part of a growing effort by governments around the world to rein in services such as Instagram and TikTok to protect minors. Meta has for years faced criticism that its products and recommendation algorithms are fine-tuned to hook children. In October, three dozen states in the United States sued Meta for using “psychologically manipulative product features” to lure children, in violation of consumer protection laws.

EU regulators said they had been in touch with U.S. counterparts about the investigations announced Thursday. The regulators said Meta could be in violation of the Digital Services Act, a law approved in 2022 that requires large online services to more aggressively police their platforms for illicit content and have policies in place to mitigate risks toward children. People younger than 13 are not supposed to be able to sign up for an account, but EU investigators said they would scrutinize the company’s age-verification tools as part of their investigation.

“We will now investigate in-depth the potential addictive and ‘rabbit hole’ effects of the platforms, the effectiveness of their age verification tools, and the level of privacy afforded to minors in the functioning of recommender systems,” Thierry Breton, the EU’s internal markets commissioner, who is overseeing the investigations, said in a statement. “We are sparing no effort to protect our children.”

On Thursday, Meta said its social media services were safe for young people, noting features that let parents and children set time limits on how much they use Instagram or Facebook. Teenagers are also defaulted into more restrictive content and recommendation settings. Advertisers are barred from showing targeted ads to underage users based on their activity on Meta’s apps.

“We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them,” Meta said in a statement. “This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission.”

EU officials did not say how long the investigations would take. But the opening of a formal inquiry Thursday gives regulators wide authority to gather evidence from Meta, including sending legal requests for information, interviewing company executives and conducting inspections of corporate offices. Investigations of Instagram and Facebook will be conducted separately.

EU regulators have taken aim at a number of companies since the Digital Services Act took effect. Last month, TikTok suspended a version of its app in the EU after authorities raised questions about an “addictive” feature that lets users earn rewards such as gift cards for watching videos, liking content and following certain creators.

Meta is also under investigation over political advertising, while X, the social media site owned by Elon Musk, faces an inquiry related to content moderation, risk management and advertising transparency.

c.2024 The New York Times Company