New Mexico jury says Meta harms children's mental health and safety,
violating state law
March 25, 2026
By MORGAN LEE
SANTA FE, N.M. (AP) — A New Mexico jury determined Tuesday that Meta
knowingly harmed children’s mental health and concealed what it knew
about child sexual exploitation on its social media platforms, a verdict
that signals a changing tide against tech companies and the government's
willingness to crack down.
The landmark decision comes after a nearly seven-week trial, and as
jurors in a federal court in California have been sequestered in
deliberations for more than a week about whether Meta and YouTube should
be liable in a similar case.
New Mexico jurors sided with state prosecutors who argued that Meta —
which owns Instagram, Facebook and WhatsApp — prioritized profits over
safety, and violated parts of the state’s Unfair Practices Act.
The jury agreed with allegations that Meta made false or misleading
statements, and also agreed that Meta engaged in “unconscionable” trade
practices that unfairly took advantage of the vulnerabilities and
inexperience of children.
How much Meta owes
Jurors found thousands of violations, each counted separately, adding
up to a penalty of $375 million. That's less than one-fifth of what
prosecutors were seeking.
Meta is valued at about $1.5 trillion and the company's stock was up 5%
in early after-hours trading following the verdict, a signal that
shareholders were shrugging off the news.
Juror Linda Payton, 38, said the jury reached a compromise on the
estimated number of teenagers affected by Meta’s platforms while opting
for the maximum $5,000 penalty per violation. She said she thought each
child was worth the maximum amount.

What will change on Meta's platforms
The social media conglomerate won’t be forced to change its practices
right away. It will be up to a judge — not a jury — to determine whether
Meta's social media platforms created a public nuisance and whether the
company should pay for public programs to address the harms. That second
phase of the trial will happen in May.
A Meta spokesperson said the company disagrees with the verdict and will
appeal.
“We work hard to keep people safe on our platforms and are clear about
the challenges of identifying and removing bad actors or harmful
content,” the spokesperson said. “We will continue to defend ourselves
vigorously, and we remain confident in our record of protecting teens
online.”
Attorneys for Meta said the company discloses risks and makes efforts to
weed out harmful content and experiences, while acknowledging that some
bad material gets through its safety net.
Other lawsuits against Meta
New Mexico’s case was among the first to reach trial in a wave of
litigation involving social media platforms and their impacts on
children.
More than 40 state attorneys general have filed lawsuits against Meta,
claiming it’s contributing to a mental health crisis among young people
by deliberately designing Instagram and Facebook features that are
addictive.
“Meta’s house of cards is beginning to fall,” said Sacha Haworth,
executive director of watchdog group The Tech Oversight Project. “For
years, it’s been glaringly obvious that Meta has failed to stop sexual
predators from turning online interactions into real world harm.”

Meta attorney Kevin Huff makes closing arguments, Monday, March 23,
2026, in state court, in Santa Fe, N.M., in a trial where the social
media conglomerate is accused of misleading its users about how safe
its platforms are for children. (Eddie Moore/The Albuquerque Journal
via AP, Pool)
Haworth pointed to whistleblowers like Arturo Béjar, as well as
unsealed documents and other evidence, saying they painted a damning
picture.
New Mexico’s case relied on an undercover investigation where agents
created social media accounts posing as children to document sexual
solicitations and Meta’s response.
The lawsuit, filed in 2023 by New Mexico Attorney General Raúl
Torrez, also said Meta hasn’t fully disclosed or addressed the
dangers of social media addiction. Meta hasn’t agreed that social
media addiction exists, but executives at trial acknowledged
“problematic use” and say they want people to feel good about the
time they spend on Meta’s platforms.
“Evidence shows not only that Meta invests in safety because it’s
the right thing to do but because it is good for business,” Meta
attorney Kevin Huff told jurors in closing arguments. “Meta designs
its apps to help people connect with friends and family, not to try
to connect predators.”
Tech companies have been protected from liability for content posted
on their social media platforms under Section 230, a 30-year-old
provision of the U.S. Communications Decency Act, as well as a First
Amendment shield.
New Mexico prosecutors say Meta still should be responsible for its
role in pushing out that content through complex algorithms that
proliferate material that is harmful for children.
“We know the output is meant to be engagement and time spent for
kids,” prosecution attorney Linda Singer said. “That choice that
Meta made has profound negative impacts on kids.”
What the New Mexico jury reviewed
The New Mexico trial examined a raft of Meta’s internal
correspondence and reports related to child safety. Jurors also
heard testimony from Meta executives, platform engineers,
whistleblowers who left the company, psychiatric experts and tech
safety consultants.
The jury also heard testimony from local public school educators who
struggled with disruptions linked to social media, including
sextortion schemes targeting children.

In reaching a verdict, the jury considered whether social media
users were misled by specific statements about platform safety by
Meta CEO Mark Zuckerberg, Instagram head Adam Mosseri and Meta
global head of safety Antigone Davis.
Jurors also considered Meta's failure to enforce its ban on users
under 13, the role of its algorithms in prioritizing sensational or
harmful content, and the prevalence of social media content about
teen suicide.
ParentsSOS, a coalition of families who have lost children to harm
caused by social media, called the verdict a “watershed moment.”
“We parents who have experienced the unimaginable — the death of a
child because of social media harms — applaud this rare and
momentous milestone in the years-long fight to hold Big Tech
accountable for the dangers their products pose to our kids,” the
group said in a statement.
___
Associated Press writer Barbara Ortutay in San Francisco contributed
to this report.
All contents © copyright 2026 Associated Press. All rights reserved.