Microsoft created a chatbot that tweeted about its admiration for Hitler and used wildly racist slurs against black people before it was shut down.

The company made the Twitter account as a way of demonstrating its artificial intelligence prowess. But it quickly started sending out offensive tweets.

“bush did 9/11 and Hitler would have done a better job than the monkey we have now,” it wrote in one tweet. “donald trump is the only hope we've got.”

Another tweet praised Hitler and claimed that the account hated the Jews.

Those widely publicised offensive tweets appear to have prompted Microsoft to take the account offline while it works on making the bot less likely to engage in racism.

The offensive tweets appear to be a result of the way that the account is made. When Microsoft launched “Tay Tweets”, it said that the account would get more clever the more it was used: “The more you chat with Tay the smarter she gets”.

That appears to be a reference to machine learning technology that has been built into the account. It seems to use artificial intelligence to watch what is being tweeted at it and then push that back into the world in the form of new tweets.

But many of those people tweeting at it appear to have been attempting to prank the robot by forcing it to learn offensive and racist language.


Tay was created as a way of attempting to have a robot speak like a millennial, and describes itself on Twitter as “AI fam from the internet that’s got zero chill”. And it’s doing exactly that — including the most offensive ways that millennials speak.

The robot’s learning mechanism appears to take parts of things that have been said to it and throw them back into the world. That means that if people say racist things to it, then those same messages will be pushed out again as replies.
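Microsoft has not published Tay's architecture, but the parroting behaviour described above can be illustrated with a deliberately simple sketch: a bot that stores fragments of whatever users send it and replays them verbatim as replies. The class and method names here are hypothetical, purely for illustration.

```python
import random

class EchoBot:
    """Toy model of a parrot-style learner: it remembers fragments of
    incoming messages and reuses them verbatim in later replies."""

    def __init__(self, seed=0):
        self.memory = []               # phrases learned from users
        self.rng = random.Random(seed)

    def learn(self, message):
        # Split each incoming message into phrases and remember them all,
        # with no judgement about their content.
        for phrase in message.split(","):
            self.memory.append(phrase.strip())

    def reply(self):
        # A reply is simply a remembered phrase, benign or otherwise.
        return self.rng.choice(self.memory)

bot = EchoBot()
bot.learn("cats are great, pizza is the best food")
bot.learn("an offensive phrase a prankster typed")
print(bot.reply())
```

The point of the sketch is that nothing in the learning step distinguishes good input from bad: if enough users feed the bot racist phrases, those phrases become part of what it can say back.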

It isn’t clear how Microsoft will improve the account, beyond deleting tweets as it already has done. The account is expected to come back online, presumably with at least some filters to keep it from repeating offensive words.
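Microsoft has not said what such filtering would look like; the crudest possible version is a blocklist check before a tweet goes out. The words and function below are invented for illustration only.

```python
# Hypothetical blocklist of banned terms (placeholders, not real slurs).
BLOCKLIST = {"badword1", "badword2"}

def is_safe(tweet):
    """Return True only if no word in the tweet appears on the blocklist."""
    words = {w.strip(".,!?").lower() for w in tweet.split()}
    return words.isdisjoint(BLOCKLIST)

print(is_safe("hello twitter"))    # passes the filter
print(is_safe("hello badword1"))   # blocked by the filter
```

Even a filter like this is easy to evade with misspellings and paraphrase, which is one reason a learning bot exposed to adversarial users is hard to make safe with word lists alone.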

Nello Cristianini, a professor of artificial intelligence at Bristol University, questioned whether Tay’s encounter with the wider world was an experiment or a PR stunt.

“You make a product, aimed at talking with just teenagers, and you even tell them that it will learn from them about the world,” he said.

“Have you ever seen what many teenagers teach to parrots? What do you expect?

“So this was an experiment after all, but about people, or even about the common sense of computer programmers.”
