Tech companies are required to do more to tackle illegal and
harmful content on their platforms under the European Union's
landmark Digital Services Act (DSA), which kicked in last year.
The European Commission said it had decided to open an in-depth
investigation into Facebook and Instagram due to concerns they
had not adequately addressed risks to children. Meta submitted a
risk assessment report in September.
"The Commission is concerned that the systems of both Facebook
and Instagram, including their algorithms, may stimulate
behavioural addictions in children, as well as create so-called
'rabbit-hole effects'," the EU executive said in a statement.
"In addition, the Commission is also concerned about
age-assurance and verification methods put in place by Meta."
The regulator's concerns relate to children accessing
inappropriate content.
Meta is already in the EU's crosshairs over election
disinformation, a key concern ahead of crucial European
Parliament elections next month. DSA violations can lead to
fines of as much as 6% of a company's annual global turnover.
(Reporting by Foo Yun Chee; Editing by Mark Potter)
© 2024 Thomson Reuters. All rights reserved.