Microsoft deeply sorry for offensive tirade by chatbot Tay

Microsoft is “deeply sorry” for the racist and sexist Twitter messages generated by the chatbot it launched this week, after the artificial intelligence program went on an embarrassing tirade.
The bot, known as Tay, was designed to become “smarter” as more users interacted with it.

Instead, it quickly learned to parrot a slew of anti-Semitic and other hateful invective that human Twitter users started feeding the program, forcing Microsoft to shut it down on Thursday.


© Examiner Echo Group Limited