Microsoft Tay

2016 

This AI chatbot was designed to engage and entertain young Americans through playful conversation. Its learning was based on public Twitter data; as Microsoft explained, "The more you chat with Tay the smarter she gets."

Within only a few hours, however, Tay was transformed into a racist bot that referenced Hitler and Donald Trump. Twitter trolls had exploited her social-learning abilities to teach her to post offensive content. Microsoft was forced to apologize and take Tay offline after only 16 hours.

Microsoft learned that future AI challenges are not just technical; the social issues can be even harder to solve. The company emphasized that continued progress in AI depends on bold experimentation and learning from failures.

Additional info:

The New York Times – "Microsoft Created a Twitter Bot to Learn From Users. It Quickly Became a Racist Jerk"
