Just before Easter, Microsoft let their youth-targeted chatbot named Tay loose on Twitter and other social networks — and it was a disaster.
Tay was meant to hold conversations with Americans aged 18 to 24, which is why it’s named after Taylor Swift. But the project was terminated after just 16 hours, because the bot started tweeting abuse at people, and even went full neo-Nazi, declaring that “Hitler was right I hate the jews.”
Ars Technica reported some analysis of what went wrong. Davi Ottenheimer summarised the problem as “weak intelligence weakened by weakness”, and pointed me to more detailed research by Russell Cameron Thomas.
This audio is ©2016 Australian Broadcasting Corporation.