Meta's Instagram linked to depression, anxiety, insomnia in kids - US states' lawsuit
October 25, 2023
By Jonathan Stempel, Diane Bartz and Nate Raymond
(Reuters) - Dozens of U.S. states are suing Meta Platforms and its
Instagram unit, accusing them of fueling a youth mental health crisis by
making their social media platforms addictive.
In a complaint filed on Tuesday, the attorneys general of 33 states
including California and New York said Meta, which also operates
Facebook, repeatedly misled the public about the dangers of its
platforms, and knowingly induced young children and teenagers into
addictive and compulsive social media use.
"Meta has harnessed powerful and unprecedented technologies to entice,
engage, and ultimately ensnare youth and teens," according to the
complaint filed in the Oakland, California federal court. "Its motive is
profit."
Children have long been an appealing demographic for businesses, which
hope to attract them as consumers at ages when they may be more
impressionable, and solidify brand loyalty.
For Meta, younger consumers may help secure more advertisers who hope
children will keep buying their products as they grow up.
But the states said research has associated children's use of Meta's
social media platforms with "depression, anxiety, insomnia, interference
with education and daily life, and many other negative outcomes."
Meta said it was "disappointed" in the lawsuit.
"Instead of working productively with companies across the industry to
create clear, age-appropriate standards for the many apps teens use, the
attorneys general have chosen this path," the company said.
Eight other U.S. states and Washington, D.C. are filing similar lawsuits
against Meta on Tuesday, bringing the total number of authorities taking
action against the Menlo Park, California-based company to 42.
Meta shares fell 0.6% on the Nasdaq.
TIKTOK, YOUTUBE ALREADY FACE LAWSUITS
The cases are the latest in a string of legal actions against social
media companies on behalf of children and teens.
Meta, ByteDance's TikTok and Google's YouTube already face hundreds of
lawsuits filed on behalf of children and school districts about the
addictiveness of social media.
Mark Zuckerberg, Meta's chief executive, has in the past defended his
company's handling of content that some critics find harmful.
"At the heart of these accusations is this idea that we prioritize
profit over safety and well-being. That's just not true," he posted in
October 2021 on his Facebook page.
In Tuesday's cases, Meta could face civil penalties of $1,000 to $50,000
for each violation of various state laws -- an amount that could add up
quickly given the millions of young children and teenagers who use
Instagram.
Photo caption: Miniature figures of children are seen in front of a displayed
Instagram logo in this illustration taken April 4, 2023.
REUTERS/Dado Ruvic/Illustration/File Photo
Much of the focus on Meta stemmed from a whistleblower's release of
documents in 2021 that showed the company knew Instagram, which
began as a photo-sharing app, was addictive and worsened body image
issues for some teen girls.
The lawsuit by the 33 states alleged that Meta has strived to ensure
that young people spend as much time as possible on social media
despite knowing that they are susceptible to seeking approval in the
form of "likes" from other users on their content.
"Meta has been harming our children and teens, cultivating addiction
to boost corporate profits," said California Attorney General Rob
Bonta, whose state includes Meta's headquarters.
'THREATS THAT WE CAN'T IGNORE'
States also accused Meta of violating a law banning the collection
of data of children under age 13, and deceptively denying that its
social media was harmful.
"Meta did not disclose that its algorithms were designed to
capitalize on young users' dopamine responses and create an
addictive cycle of engagement," the complaint said.
Dopamine is a type of neurotransmitter that plays a role in feelings
of pleasure.
According to the complaint, Meta's refusal to accept responsibility
extended last year to its distancing itself from a 14-year-old
girl's suicide in the UK, after she was exposed on Instagram to
content about suicide and self-injury.
A coroner rejected a Meta executive's claim that such content was
"safe" for children, finding that the girl likely binged on harmful
content that normalized the depression she had felt before killing
herself.
States also alleged Meta is seeking to expand its harmful practices
into virtual reality, including its Horizon Worlds platform and the
WhatsApp and Messenger apps.
By suing, authorities are seeking to patch holes left by the U.S.
Congress' inability to pass new online protections for children
despite years of discussions.
Colorado Attorney General Philip Weiser said the whistleblower's
revelations showed that Meta knew how Facebook and Instagram were
harming children.
"It is very clear that decisions made by social media platforms,
like Meta, are part of what is driving mental health harms, physical
health harms, and threats that we can't ignore," he said.
(Reporting by Jonathan Stempel in New York, Diane Bartz and David
Shepardson in Washington, D.C., and Nate Raymond in Boston; Editing
by Chris Sanders, Rod Nickel and Lisa Shumaker)
© 2023 Thomson Reuters. All rights reserved. This material may not be
published, broadcast, rewritten or redistributed. Thomson Reuters is
solely responsible for this content.