Racist AI chatbot "Tay" talks about smoking weed before being killed off again

Microsoft's Artificial Intelligence bot has been causing havoc online

Microsoft's controversial artificial intelligence bot made a short-lived return to Twitter on Wednesday, before promptly being pulled from public life once again. The bot, designed to mimic the musings of a teenage girl, had gone live for the first time just one week earlier.

Things went downhill very quickly, however, when the bot received a crash-course in racism, sexism and Holocaust denial courtesy of Twitter users. Once the bot started engaging and proclaiming hate messages, Microsoft killed it off and cleared the offensive messages, before re-releasing Tay to the world on Wednesday 30th March.

Tay didn't fare much better the second time around

Microsoft were again forced to shut Tay away from public life, making the profile private and preventing anyone from seeing its tweets.