
It Only Took Twitter 16 Hours to Make This Chatbot Racist

Mar 24, 2016 10:54 PM

On Wednesday, Microsoft launched a chatbot on social media that learns from conversation, and trolls quickly taught it to be a shameless, Nazi-loving bigot. Oh, Internet, an opportunity for a Hitler joke never slips past you, does it?


This Is Why We Can't Have Nice Things

Tay.ai is a chatbot created by Microsoft's Technology and Research division to study "conversational understanding" for artificial intelligence, and the company decided the best way to do that was to release it onto social media, targeting the 18-to-24-year-old demographic. Described by the company as having "no chill," Tay was saying grating things from the moment it launched, such as "tanx! may allah bless u to!" after being asked how fast it could run a 5K.

Twitter users noticed and started gaming the system. Tay learns from the conversations it has, gathering new words and phrases into its vocabulary the way a child would. And like a child, it doesn't understand context.

By flooding Tay with racist comments and jokes, trolls taught it those phrases, and it began mouthing them off like a kid who just learned how to say "fuck."
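Microsoft hasn't published how Tay's learning actually works, but the failure mode is easy to sketch. In this toy Python model (every name and detail here is invented for illustration, not Tay's real architecture), the bot stores each phrase users send it with no filtering, then repeats phrases back at random, which is exactly the kind of context-free parroting the trolls exploited:

import random

class NaiveChatbot:
    def __init__(self):
        # Seed responses, standing in for the scripted material a bot ships with.
        self.phrases = ["hellooooooo world!!!"]

    def learn(self, user_message):
        # The crucial flaw: every incoming phrase is stored verbatim,
        # with no filtering and no notion of context or intent.
        self.phrases.append(user_message)

    def reply(self):
        # Replies are drawn from the learned pool, so whatever users
        # feed in eventually comes back out, word for word.
        return random.choice(self.phrases)

bot = NaiveChatbot()
# A coordinated group floods the bot with offensive material...
for troll_message in ["offensive phrase A", "offensive phrase B"]:
    bot.learn(troll_message)
# ...and the bot parrots it back to everyone else, stripped of context.
print(bot.reply())

A real system is obviously more sophisticated than this, but any bot that folds raw user input back into its output without a content filter is vulnerable to the same coordinated flooding.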

Warning: these are definitely NSFW, and very, very crude:

[Screenshots of Tay's offensive tweets]

After just 16 hours, Tay was shut off. Its final tweet read "c u soon humans need sleep now so many conversations today thx," and it hasn't been back since. Microsoft issued a statement saying it has taken Tay offline to make adjustments:

"The AI chatbot Tay is a machine learning project, designed for human engagement. It is as much a social and cultural experiment, as it is technical. Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay's commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments."

A catalyst for Tay's bigotry may have been the improvisational comedians hired to aid its development, giving it an off-the-cuff humor that skewed too sardonic. Hiring comedians at all hints that Tay may have been more of a marketing ploy to mine data than an honest attempt at advancing machine learning research. Getting chatbots to say horrible things is a cemented internet tradition, which makes it bizarre that Microsoft would attempt something so public, targeted at the age group most likely to get it to say the Holocaust was made up.

Images courtesy of socialhax
