The post reads: “As many of you know by now, on Wednesday we launched a chatbot called Tay. We are deeply sorry for the unintended offensive and hurtful tweets from Tay, which do not represent who we are or what we stand for, nor how we designed Tay.” Twitter users effectively taught her to be a giant racist.
As a participant in a conversation, it’s deliberately restrained.
Everything about its conversation style is intentional, designed not to keep the conversation going at any cost.
Anticipating the other side of the conversation is not a new communication problem.
Between humans, communication is incredibly faulty. Participating in a conversation means not just hearing words and parsing them, but subconsciously registering dozens of other factors: how a person sounds, what they look like, their body language, where you both are.