Meta faces EU investigation over child safety risks

[May 16, 2024] By Foo Yun Chee

BRUSSELS (Reuters) - Meta Platforms' social media sites Facebook and Instagram will be investigated for potential breaches of EU online content rules relating to child safety, EU regulators said on Thursday, a move that could lead to hefty fines.

EU flag and Meta logo are seen in this illustration taken, May 22, 2023. REUTERS/Dado Ruvic/Illustration/File Photo

Tech companies are required to do more to tackle illegal and harmful content on their platforms under the European Union's landmark Digital Services Act (DSA), which kicked in last year.

The European Commission said it had decided to open an in-depth investigation into Facebook and Instagram due to concerns they had not adequately addressed risks to children. Meta submitted a risk assessment report in September.

"The Commission is concerned that the systems of both Facebook and Instagram, including their algorithms, may stimulate behavioural addictions in children, as well as create so-called 'rabbit-hole effects'," the EU executive said in a statement.

"In addition, the Commission is also concerned about age-assurance and verification methods put in place by Meta." The regulator's concerns relate to children accessing inappropriate content.

Meta is already in the EU's crosshairs over election disinformation, a key concern ahead of crucial European Parliament elections next month. DSA violations can lead to fines of as much as 6% of a company's annual global turnover.

(Reporting by Foo Yun Chee; Editing by Mark Potter)

[© 2024 Thomson Reuters. All rights reserved.]
This material may not be published, broadcast, rewritten or redistributed. Thomson Reuters is solely responsible for this content.
