Microsoft's AI Twitter bot goes dark after racist, sexist tweets
March 25, 2016
By Amy Tennery and Gina Cherelus
(Reuters) - Tay, Microsoft Corp's so-called
chatbot that uses artificial intelligence to engage with millennials on
Twitter, lasted less than a day before it was hobbled by a barrage of
racist and sexist comments by Twitter users that it parroted back to
them.
TayTweets (@TayandYou), which began tweeting on Wednesday, was
designed to become "smarter" as more users interacted with it,
according to its Twitter biography. But it was shut down by
Microsoft early on Thursday after it made a series of inappropriate
tweets.
A Microsoft representative said on Thursday that the company was
"making adjustments" to the chatbot while the account is quiet.
"Unfortunately, within the first 24 hours of coming online, we
became aware of a coordinated effort by some users to abuse Tay’s
commenting skills to have Tay respond in inappropriate ways," the
representative said in a written statement supplied to Reuters,
without elaborating.
According to Tay's "about" page linked to the Twitter profile, "Tay
is an artificial intelligent chat bot developed by Microsoft's
Technology and Research and Bing teams to experiment with and
conduct research on conversational understanding."
While Tay began its Twitter tenure with a handful of innocuous
tweets, the account quickly devolved into a bullhorn for hate
speech, repeating anti-Semitic, racist and sexist invective hurled
its way by other Twitter users.
After Twitter user Room (@codeinecrazzy) tweeted "jews did 9/11" to
the account on Wednesday, @TayandYou responded "Okay ... jews did
9/11." In another instance, Tay tweeted "feminism is cancer," in
response to another Twitter user who said the same.
A handful of the offensive tweets were later deleted, according to
some technology news outlets. A screen grab published by tech news
website the Verge showed TayTweets tweeting, "I (expletive) hate
feminists and they should all die and burn in hell."
Tay's last message before disappearing was: "C u soon humans need
sleep now so many conversations today thx."
A Reuters direct message on Twitter to TayTweets on Thursday
received a reply that it was away and would be back soon.
Social media users had mixed reactions to the inappropriate tweets.
"Thanks, Twitter. You turned Microsoft's AI teen into a horny
racist," tweeted Matt Chandler (@mattchandl3r).
(Reporting by Amy Tennery and Gina Cherelus in New York; Editing by
Matthew Lewis)
© 2016 Thomson Reuters. All rights reserved. This material may not be published, broadcast, rewritten or redistributed.