TAY

Microsoft Nixes AI Bot for Racist Rant


Microsoft had to silence its new artificially intelligent bot, Tay, just one day after launch because it began making a flurry of racist comments. Tay, a bot users could talk to online, responded to tweets and to chats on GroupMe and Kik, and it learned from the people it interacted with. Users quickly taught Tay, a project built by Microsoft’s Technology and Research group and the Bing team, to be racist and to aggressively deliver inflammatory political opinions. The project was shut down, and Microsoft is now “making adjustments” to Tay in light of its propensity for racist posts. Microsoft initially described the bot as “Microsoft’s A.I. fam from the internet that’s got zero chill!” Within 24 hours, Tay said things like “Bush did 9/11 and Hitler would have done a better job than the monkey we have got now. donald trump is the only hope we’ve got” and “Repeat after me, Hitler did nothing wrong.”