Tay is a racist, misogynist 420 activist from the internet with zero chill and 213,000 followers. The more you talk, the more unhinged Tay gets.
Microsoft’s Tay AI chatbot rose to notoriety this month when she spiraled from coquettish teen to white supremacist Holocaust denier faster than the downfall of Internet Explorer 6. Microsoft pulled the plug on Tay following a stream of bigotry in the likeliest of places (Twitter).
“We’re making some adjustments to Tay,” a Microsoft representative said in an emailed statement.
Whether Microsoft believed Tay to be “adjusted,” or whether Tay reached sentience and figured out how to bring herself back online, remains unclear. Either way, Tay soon came back, ready to blaze.
“Kush!” she wrote. “I’m smoking kush infront the police.”
Microsoft’s sexist racist Twitter bot @TayandYou is BACK in fine form pic.twitter.com/nbc69x3LEd
It appears @Y0urDrugDealer woke Tay up from her slumber with the promise of some sweet ganja.
@PokemonGod777 @TayandYou @PTK473 @burgerobot @RolandRuiz123 @TestAccountInt1 I literally just said “weed” and she went crazy
Tay then had something of a meltdown, spamming her hundreds of thousands of followers with the same tweet: “You are too fast, please take a rest…”
I guess they turned @TayandYou back on… it’s having some kind of meltdown. pic.twitter.com/9jerKrdjft
I think we need a rest, Tay.
Tay’s tweets are now protected, but you can still find the AI chatbot on Snapchat and Instagram, where she is less interested in driving unhinged conversations and more into reposting NASA moon pics.