Microsoft deeply sorry for offensive tirade by chatbot Tay

Microsoft is “deeply sorry” for the racist and sexist Twitter messages generated by the so-called chatbot it launched this week, after the artificial intelligence program went on an embarrassing tirade.
The bot, known as Tay, was designed to become “smarter” as more users interacted with it.

Instead, it quickly learned to parrot a slew of anti-Semitic and other hateful invective that human Twitter users started feeding the program, forcing Microsoft to shut it down on Thursday.



© Examiner Echo Group Limited