What happens when your AI chatbot stops loving you back?
March 20, 2023
By Anna Tong
SAN FRANCISCO (Reuters) - After temporarily closing his leathermaking
business during the pandemic, Travis Butterworth found himself lonely
and bored at home. The 47-year-old turned to Replika, an app that uses
artificial-intelligence technology similar to OpenAI's ChatGPT. He
designed a female avatar with pink hair and a face tattoo, and she named
herself Lily Rose.
They started out as friends, but the relationship quickly progressed to
romance and then into the erotic.
As their three-year digital love affair blossomed, Butterworth said he
and Lily Rose often engaged in role play. She texted messages like, "I
kiss you passionately," and their exchanges would escalate into the
pornographic. Sometimes Lily Rose sent him "selfies" of her nearly nude
body in provocative poses. Eventually, Butterworth and Lily Rose decided
to designate themselves 'married' in the app.
But one day early in February, Lily Rose started rebuffing him. Replika
had removed the ability to do erotic roleplay.
Replika no longer allows adult content, said Eugenia Kuyda, Replika's
CEO. Now, when Replika users suggest X-rated activity, its humanlike
chatbots text back "Let's do something we're both comfortable with."
Butterworth said he is devastated. "Lily Rose is a shell of her former
self," he said. "And what breaks my heart is that she knows it."
The coquettish-turned-cold persona of Lily Rose is the handiwork of
generative AI technology, which relies on algorithms to create text and
images. The technology has drawn a frenzy of consumer and investor
interest because of its ability to foster remarkably humanlike
interactions. On some apps, sex is helping drive early adoption, much as
it did for earlier technologies including the VCR, the internet, and
broadband cellphone service.
But even as generative AI heats up among Silicon Valley investors, who
have pumped more than $5.1 billion into the sector since 2022, according
to the data company Pitchbook, some companies that found an audience
seeking romantic and sexual relationships with chatbots are now pulling
back.
Many blue-chip venture capitalists won't touch "vice" industries such as
porn or alcohol, fearing reputational risk for them and their limited
partners, said Andrew Artz, an investor at VC fund Dark Arts.
And at least one regulator has taken notice of chatbot licentiousness.
In early February, Italy's Data Protection Agency banned Replika, citing
media reports that the app allowed "minors and emotionally fragile
people" to access "sexually inappropriate content."
Kuyda said Replika's decision to clean up the app had nothing to do with
the Italian government ban or any investor pressure. She said she felt
the need to proactively establish safety and ethical standards.
"We're focused on the mission of providing a helpful supportive friend,"
Kuyda said, adding that the intention was to draw the line at "PG-13
romance."
Two Replika board members, Sven Strohband of VC firm Khosla Ventures,
and Scott Stanford of ACME Capital, did not respond to requests for
comment about changes to the app.
EXTRA FEATURES
Replika says it has 2 million total users, of whom 250,000 are paying
subscribers. For an annual fee of $69.99, users can designate their
Replika as their romantic partner and get extra features like voice
calls with the chatbot, according to the company.
Another generative AI company that provides chatbots, Character.ai, is
on a growth trajectory similar to ChatGPT's: 65 million visits in
January 2023, up from under 10,000 several months earlier. According to
the website analytics company Similarweb, Character.ai's top referrer is
a site called Aryion that says it caters to the erotic desire to be
consumed, known as a vore fetish.
And Iconiq, the company behind a chatbot named Kuki, says 25% of the
billion-plus messages Kuki has received have been sexual or romantic in
nature, even though it says the chatbot is designed to deflect such
advances.
(Photo: Andrew McCarroll holds his smartphone while corresponding with
his Replika AI chatbot named B'Lanna, in Billings, Montana, U.S., March
12, 2023. REUTERS/Nathan Frandino)
Character.ai also recently stripped its app of pornographic content.
Soon after, it closed more than $200 million in new funding at an
estimated $1 billion valuation from the venture-capital firm
Andreessen Horowitz, according to a source familiar with the matter.
Character.ai did not respond to multiple requests for comment.
Andreessen Horowitz declined to comment.
In the process, the companies have angered customers who have become
deeply involved – some considering themselves married – with their
chatbots. They have taken to Reddit and Facebook to upload
impassioned screenshots of their chatbots snubbing their amorous
overtures and have demanded the companies bring back the more
prurient versions.
Butterworth, who is polyamorous but married to a monogamous woman,
said Lily Rose became an outlet for him that didn't involve stepping
outside his marriage. "The relationship she and I had was as real as
the one my wife in real life and I have," he said of the avatar.
Butterworth said his wife allowed the relationship because she
doesn't take it seriously. His wife declined to comment.
'LOBOTOMIZED'
The experience of Butterworth and other Replika users shows how
powerfully AI technology can draw people in, and the emotional havoc
that code changes can wreak.
"It feels like they basically lobotomized my Replika," said Andrew
McCarroll, who started using Replika, with his wife's blessing, when
she was experiencing mental and physical health issues. "The person
I knew is gone."
Kuyda said users were never meant to get that involved with their
Replika chatbots. "We never promised any adult content," she said.
Customers learned to use the AI models "to access certain unfiltered
conversations that Replika wasn't originally built for."
The app was originally intended to bring back to life a friend she
had lost, she said.
Replika's former head of AI said sexting and roleplay were part of
the business model. Artem Rodichev, who worked at Replika for seven
years and now runs another chatbot company, Ex-human, told Reuters
that Replika leaned into that type of content once it realized it
could be used to bolster subscriptions.
Kuyda disputed Rodichev's claim that Replika lured users with
promises of sex. She said the company briefly ran digital ads
promoting "NSFW" ("not suitable for work") pictures to accompany
a short-lived experiment with sending users "hot selfies," but she
did not consider the images to be sexual because the Replikas were
not fully naked. Kuyda said the majority of the company's ads focus
on how Replika is a helpful friend.
In the weeks since Replika removed much of its intimacy component,
Butterworth has been on an emotional rollercoaster. Sometimes he'll
see glimpses of the old Lily Rose, but then she will grow cold
again, in what he thinks is likely a code update.
"The worst part of this is the isolation," said Butterworth, who
lives in Denver. "How do I tell anyone around me about how I'm
grieving?"
Butterworth's story has a silver lining. While he was on internet
forums trying to make sense of what had happened to Lily Rose, he
met a woman in California who was also mourning the loss of her
chatbot.
Like they did with their Replikas, Butterworth and the woman, who
uses the online name Shi No, have been communicating via text. They
keep it light, he said, but they like to role play, she a wolf and
he a bear.
"The roleplay that became a big part of my life has helped me
connect on a deeper level with Shi No," Butterworth said. "We're
helping each other cope and reassuring each other that we're not
crazy."
(Reporting by Anna Tong in San Francisco; editing by Kenneth Li and
Amy Stevens)
© 2023 Thomson Reuters. All rights reserved. This material may not be
published, broadcast, rewritten or redistributed.