The chatbot had been sidelined after making racist comments. Microsoft's millennial chatbot "Tay" made a brief reappearance on Twitter this morning but still didn't make a stellar ...
Microsoft has apologised for creating an artificially intelligent chatbot that quickly turned into a Holocaust-denying racist, but in doing so made it clear that Tay's views were a result of nurture, not ...
From the AI Pin to Microsoft Tay: 5 AI products that ended in disaster, and why they failed
With Sam Altman, CEO of OpenAI, and Jony Ive, the designer of the iPhone, about to launch their mysterious new AI product, it might be a good time to look back at AI products that have tried ...