Microsoft temporarily kills AI chatbot Tay after it goes full Nazi

Microsoft created an AI chatbot for Twitter, Kik, and GroupMe, but pulled the plug after Tay began posting racist and genocidal tweets. Tay produced nearly 100,000 tweets and was live for only about a day before the Redmond company shut it down.

The post Microsoft temporarily kills AI chatbot Tay after it goes full Nazi appeared first on Digital Trends.
