Microsoft Silences Its Tay AI Chatbot After Racist Rants
Tay gets trolled. After a brush with the Internet's dark side, Microsoft unplugs the youth-oriented chatbot.

Microsoft's efforts to mainstream artificial intelligence (AI) technologies hit a snag after the company was forced to pull the plug on Tay, a social media chatbot meant to mimic a young American Millennial woman with a playful streak. Soon after its launch on Wednesday, Tay seemed to take on a darker persona. As observed by Twitter users and reported in the Wall Street Journal and several other media outlets, Tay (@TayandYou) went from bubbly to outright offensive in a matter of hours. "Hitler was right I hate the Jews," read one tweet, according to the Journal.

Microsoft has since removed that tweet, along with several other racially and politically charged Twitter conversations, though some still remain among the more than 96,000 tweets attributed to the account. For now, Tay sits silent. Just after midnight on March 24, the account tweeted, "c u soon humans need sleep now so many conversations today thx." A banner at the top of the Tay homepage reads, "Phew. Busy day. Going offline for a while to absorb it all. Chat soon," indicating Microsoft has pulled the plug, at least for the short term.
According to Microsoft's website, the chatbot was developed by the software giant's Technology, Research and Bing groups to research conversational understanding. During Tay's development, the company mined anonymized public data and built the chatbot using AI along with "editorial developed by a staff including improvisational comedians," according to the company's FAQ on the technology.