Microsoft temporarily kills AI chatbot Tay after it goes full Nazi
On March 25, 2016 by Michelle Turner
Microsoft created an AI chatbot named Tay for Twitter, Kik, and GroupMe, but pulled the plug after the bot began posting racist and genocidal tweets. Tay racked up nearly 100,000 tweets and was live for only about a day before the Redmond company shut it down.
The post Microsoft temporarily kills AI chatbot Tay after it goes full Nazi appeared first on Digital Trends.