Requiem for Tay: Microsoft's AI Bot Gone Bad – The New Stack
Microsoft Research and Bing release Tay.ai, a Twitter chat bot aimed at 18-24 year-olds – OnMSFT.com
Microsoft's disastrous Tay experiment shows the hidden dangers of AI – Quartz
Tay the 'teenage' AI is shut down after Microsoft Twitter bot starts posting genocidal racist comments that defended Hitler one day after launching – Daily Mail Online
Twitter taught Microsoft's AI chatbot to be a racist asshole in less than a day – The Verge
#taytweets hashtag on Twitter
Microsoft's Rogue Chat Bot 'Tay' Makes Brief Return to Twitter – ABC News