Microsoft 'deeply sorry' for offensive tirade by chatbot Tay
The bot, known as Tay, was designed to become “smarter” as more users interacted with it.
Instead, it quickly learned to parrot a slew of anti-Semitic slurs and other hateful invective that human Twitter users fed the program, forcing Microsoft to shut it down on Thursday.