Microsoft's Twitter A.I. became a Holocaust-denying racist within 24 hours of going live

Newstalk

18.25 25 Mar 2016

On Wednesday, Microsoft released @TayandYou on to Twitter, an artificial intelligence bot designed for conversational understanding.

The goal was for Tay to mimic the language used by its fellow Twitter users and carry out a series of automated conversations.

Within 24 hours, Microsoft took @TayandYou off Twitter. Why, you might ask? Well...

Yep, within one day, Tay had interacted with enough human Twitter users that its automated discussions became filled with racist, sexist, Holocaust-denying statements such as those above.

Microsoft did its best to delete them, but this being the internet, they are now out there forever.

The company believes that some savvy online trolls tricked Tay's reactive software into treating these offensive statements as being as acceptable as saying "Hello" or "How are you?", which is how the account quickly spiralled into some of the tweets above.

Not to worry though, as Tay's last tweet before Microsoft took her back offline for a bit of a tune-up promises a return sooner rather than later...
