Ars Technica
Tay, the neo-Nazi millennial chatbot, gets autopsied
A user told Tay to tweet Trump propaganda; she did (though the tweet has now been deleted).

Further Reading: Microsoft terminates its Tay AI chatbot after she turns into a Nazi. Setting her neural net processor to read-write was a terrible mistake ...
from top stories http://ift.tt/1ZCf2tb