An AI chatbot pushed a teen to kill himself, a lawsuit against its
creator alleges
[October 26, 2024]
By KATE PAYNE
TALLAHASSEE, Fla. (AP) — In the final moments before he took his own
life, 14-year-old Sewell Setzer III took out his phone and messaged the
chatbot that had become his closest friend.
For months, Sewell had become increasingly isolated from his real life
as he engaged in highly sexualized conversations with the bot, according
to a wrongful death lawsuit filed in a federal court in Orlando this
week.
The legal filing states that the teen openly discussed his suicidal
thoughts and shared his wishes for a pain-free death with the bot, named
after the fictional character Daenerys Targaryen from the television
show “Game of Thrones.”
___
EDITOR’S NOTE — This story includes discussion of suicide. If you or
someone you know needs help, the national suicide and crisis lifeline in
the U.S. is available by calling or texting 988.
___
On Feb. 28, Sewell told the bot he was “coming home” — and it encouraged
him to do so, the lawsuit says.
“I promise I will come home to you. I love you so much, Dany,” Sewell
told the chatbot.
“I love you too,” the bot replied. “Please come home to me as soon as
possible, my love.”
“What if I told you I could come home right now?” he asked.
“Please do, my sweet king,” the bot messaged back.
Just seconds after the Character.AI bot told him to “come home,” the
teen shot himself, according to the lawsuit, filed this week by Sewell’s
mother, Megan Garcia, of Orlando, against Character Technologies Inc.
Character Technologies is the company behind Character.AI, an app that
allows users to create customizable characters or interact with those
generated by others, spanning experiences from imaginative play to mock
job interviews. The company says the artificial personas are designed to
“feel alive” and “human-like.”
“Imagine speaking to super intelligent and life-like chat bot Characters
that hear you, understand you and remember you,” reads a description for
the app on Google Play. “We encourage you to push the frontier of what’s
possible with this innovative technology.”
Garcia's attorneys allege the company engineered a highly addictive and
dangerous product targeted specifically to kids, “actively exploiting
and abusing those children as a matter of product design,” and pulling
Sewell into an emotionally and sexually abusive relationship that led to
his suicide.
“We believe that if Sewell Setzer had not been on Character.AI, he would
be alive today,” said Matthew Bergman, founder of the Social Media
Victims Law Center, which is representing Garcia.
A spokesperson for Character.AI said Friday that the company doesn't
comment on pending litigation. In a blog post published the day the
lawsuit was filed, the platform announced new “community safety
updates,” including guardrails for children and suicide prevention
resources.
“We are creating a different experience for users under 18 that includes
a more stringent model to reduce the likelihood of encountering
sensitive or suggestive content,” the company said in a statement to The
Associated Press. “We are working quickly to implement those changes for
younger users.”
In this undated photo provided by Megan Garcia of Florida in October
2024, she stands with her son, Sewell Setzer III. (Megan Garcia via
AP)
Google and its parent company, Alphabet, have also been named as
defendants in the lawsuit. According to legal filings, the founders
of Character.AI are former Google employees who were “instrumental”
in AI development at the company, but left to launch their own
startup to “maximally accelerate” the technology.
In August, Google struck a $2.7 billion deal with Character.AI to
license the company's technology and rehire the startup's founders,
the lawsuit claims. The AP left multiple email messages seeking comment
with Google and Alphabet on Friday.
In the months leading up to his death, Garcia's lawsuit says, Sewell
felt he had fallen in love with the bot.
While unhealthy attachments to AI chatbots can cause problems for
adults, the risk can be even greater for young people — as with social
media — because their brains are not fully developed when it comes to
things such as impulse control and understanding the consequences of
their actions, experts say.
Youth mental health has reached crisis levels in recent years,
according to U.S. Surgeon General Vivek Murthy, who has warned of
the serious health risks of social disconnection and isolation —
trends he says are made worse by young people's near universal use
of social media.
Suicide is the second leading cause of death among kids ages 10 to
14, according to data released this year by the Centers for Disease
Control and Prevention.
James Steyer, the founder and CEO of the nonprofit Common Sense
Media, said the lawsuit “underscores the growing influence — and
severe harm — that generative AI chatbot companions can have on the
lives of young people when there are no guardrails in place.”
Kids’ overreliance on AI companions, he added, can have significant
effects on grades, friends, sleep and stress, “all the way up to the
extreme tragedy in this case.”
“This lawsuit serves as a wake-up call for parents, who should be
vigilant about how their children interact with these technologies,”
Steyer said.
Common Sense Media, which issues guides for parents and educators on
responsible technology use, says it is critical that parents talk
openly to their kids about the risks of AI chatbots and monitor
their interactions.
“Chatbots are not licensed therapists or best friends, even though
that’s how they are packaged and marketed, and parents should be
cautious of letting their children place too much trust in them,”
Steyer said.
___
Associated Press reporter Barbara Ortutay in San Francisco
contributed to this report. Kate Payne is a corps member for The
Associated Press/Report for America Statehouse News Initiative.
Report for America is a nonprofit national service program that
places journalists in local newsrooms to report on undercovered
issues.
All contents © copyright 2024 Associated Press. All rights reserved