The chatbot had been sidelined after making racist comments. — Microsoft's millennial chatbot "Tay" made a brief reappearance on Twitter this morning but still didn't make a stellar ...
In 2016, Microsoft unleashed a Twitter chatbot named “Tay” onto the world, and its machine-learning algorithm immediately turned to racism and sexism. Hey, if you build a bot to mimic how people talk on ...
Microsoft has apologised for creating an artificially intelligent chatbot that quickly turned into a Holocaust-denying racist. But in doing so, it made clear that Tay's views were a result of nurture, not ...
From the AI Pin to Microsoft Tay: 5 AI products that ended in disaster, and why they failed
With Sam Altman, CEO of OpenAI, and Jony Ive, the designer of the iPhone, about to launch their mysterious new AI product, I thought it might be a good time to look back at AI products that have tried ...