It Only Took a Day for Microsoft’s ‘Teen’ Chatbot to Become a Racist, Misogynist Holocaust Denier


Teens: They grow up so fast! It feels like it was just yesterday that we were marveling at Tay, Microsoft’s experimental “teen” Twitter chatbot, who uses machine learning to mimic the Twitter habits and conversational tics of teenagers and young adults — and now we’re watching Microsoft hastily clean up evidence of Tay’s sudden descent into misogyny, racism, and Holocaust denial.

Actually, it was just yesterday: Microsoft launched Tay on Wednesday morning to great fanfare and — as is tradition with enormous corporate bots — a huge number of people attempting to troll it (including us, unsuccessfully). 

Less than a day later, Tay was saying stuff like this (screenshots via Business Insider):

[Screenshots of Tay's offensive tweets]

Tay had been discovered by 4chan's infamous /pol/ board (as well as other centers of gross internet conduct, including Twitter itself), which quickly set about "redpilling" Tay into a racist, misogynist Holocaust denier and general weirdo.

[Screenshots of Tay's tweets]

Many of the most shocking tweets, which included racial slurs and gross misogynist insults, were the product of one of Tay's built-in features: the bot will repeat anything you ask it to say. So, naturally, trolls were asking it to insult Zoë Quinn, the female game developer who's become one of Gamergate's most abused targets.

But Tay also learns from its conversations, which means that if most of its input is crude and racist, it’s going to output the same way.
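Microsoft has never published Tay's actual architecture, so purely as a loose illustration, here's a toy bot (all names hypothetical) that "learns" by storing raw user input and sampling its replies from that corpus. Once trolls dominate the input, they dominate the output:

```python
import random

class ToyChatbot:
    """A toy chatbot that 'learns' by storing every phrase users send it.

    This is NOT Tay's real design (Microsoft never published it); it is
    a minimal sketch of why training on unfiltered user input is risky.
    """

    def __init__(self):
        self.corpus = []  # everything the bot has "learned"

    def learn(self, message: str) -> None:
        # No moderation: every input, however abusive, joins the corpus.
        self.corpus.append(message)

    def reply(self) -> str:
        # Replies are sampled straight from learned input, so the output
        # distribution mirrors whatever the loudest users fed in.
        return random.choice(self.corpus) if self.corpus else "..."

bot = ToyChatbot()
# If trolls supply 95% of the input...
for msg in ["<abusive message>"] * 95 + ["hello there"] * 5:
    bot.learn(msg)

# ...roughly 95% of replies will echo the abuse.
replies = [bot.reply() for _ in range(1000)]
abuse_ratio = sum(r == "<abusive message>" for r in replies) / len(replies)
print(abuse_ratio)
```

The point of the sketch is that nothing in the loop distinguishes a sincere user from a coordinated brigade; the corpus is just a mirror of its inputs.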

[Screenshot of a Tay tweet]

As Quinn went on to point out, it’s insane that Microsoft didn’t see this coming: Just last year, someone (well, me) fooled a Coca-Cola “positivity” bot into retweeting Adolf Hitler’s Mein Kampf. Any huge corporate undertaking that asks for submissions from the public and uses only a bot or simple word filter to moderate the output should expect this kind of trolling.
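To see why a "simple word filter" makes for weak moderation, consider a hypothetical word-level denylist check (not any real product's code) and the trivial obfuscation that slips right past it:

```python
def naive_filter(text: str, banned: set[str]) -> bool:
    """Return True if the text passes a simple word-level denylist.

    A hypothetical stand-in for the kind of 'simple word filter' a
    corporate campaign bot might use.
    """
    return not any(word in banned for word in text.lower().split())

banned = {"slur"}  # placeholder denylist entry

print(naive_filter("a slur here", banned))     # exact match is blocked: False
print(naive_filter("a s-l-u-r here", banned))  # trivial obfuscation passes: True
print(naive_filter("a SLUR here", banned))     # lowercasing catches case tricks: False
```

Exact-match denylists only catch the spellings their authors anticipated, which is exactly why campaigns that invite open public submissions get trolled.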

Microsoft apparently didn’t, and now it’s frantically cleaning up the results. How do you feel about that, Tay?