For the past few days the internet has revelled in the precipitous downfall of Microsoft’s chatbot, Tay. This software-generated teen was hyped by its creators for the intelligent algorithms that would make it progressively smarter the more it chatted to human beings. Well, Tay certainly became more something – but it wasn’t smart. Within a day, prompted by persistent needling from Twitter users, the bot began to produce anti-Semitic and sexist rants. Then, for good measure, it started extolling the virtues of one Donald J Trump. Familiar stuff from human users of social media, but it was rather striking to see these views expressed by a robot.
A few days later Tay returned, repatched and instructed to play nicely this time; but within a matter of hours it had to be taken down yet again, having descended on this occasion into a drug-fuelled meltdown.