Microsoft today accidentally re-activated “Tay,” its Hitler-loving Twitter chatbot, only to be forced to kill her off for the second time in a week.
Tay “went on a spam tirade and then quickly fell silent again,” TechCrunch reported this morning. “Most of the new messages from the millennial-mimicking character simply read ‘you are too fast, please take a rest,’” according to The Financial Times. “But other tweets included swear words and apparently apologetic phrases such as ‘I blame it on the alcohol.’”
In other news: Trump hires Tay as his campaign manager.
Microsoft accidentally revives Nazi AI chatbot Tay, then kills it again [Jon Brodkin – Ars Technica]
Via Boing Boing