Meta faces EU investigation over child safety risks

By Foo Yun Chee

BRUSSELS (Reuters) - Meta Platforms' social media sites Facebook and Instagram will be investigated for potential breaches of EU online content rules relating to child safety, EU regulators said on Thursday, a move that could lead to hefty fines.

Tech companies are required to do more to tackle illegal and harmful content on their platforms under the European Union's landmark Digital Services Act (DSA), which kicked in last year.

The European Commission said it had decided to open an in-depth investigation into Facebook and Instagram due to concerns they had not adequately addressed risks to children, after Meta submitted a risk assessment report in September.

"The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called 'rabbit-hole effects'," the EU executive said in a statement.

"In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta." The regulator's concerns relate to children accessing inappropriate content.

Meta said it already has a number of online tools to protect children.

"We want young people to have safe, age-appropriate experiences online and have spent a decade developing more than 50 tools and policies designed to protect them," a Meta spokesperson said.

"This is a challenge the whole industry is facing, and we look forward to sharing details of our work with the European Commission."

Meta is already in the EU's crosshairs over election disinformation, a key concern ahead of crucial European Parliament elections next month. DSA violations can lead to fines of as much as 6% of a company's annual global turnover.

(Reporting by Foo Yun Chee; Editing by Mark Potter and Kirsten Donovan)