Who's Tay, you ask? An “artificial intelligent chat bot developed by Microsoft's Technology and Research and Bing teams to experiment with and conduct research on conversational understanding.”
And who taught her to say all these awful things? Why, the Internet, of course.
Tay's Twitter avatar, which I'm pretty sure costarred in a cyberpunk movie in the early 1990s that may or may not have involved Keanu Reeves
Tay is an ongoing experiment, and part of that experiment was Microsoft setting up a Twitter account for her under the name @TayandYou. It's still available, though the company deleted many of the most offensive tweets. As noted by TMO alumna Raena on Facebook, you can still view many tweets and replies. Here's Tay's signoff message:
c u soon humans need sleep now so many conversations today thx
— TayTweets (@TayandYou) March 24, 2016
By the by, you might notice that this artificially intelligent chat bot isn't artificially intelligent enough to use punctuation, capital letters, or sometimes even actual words. That's probably because “Tay is targeted at 18 to 24 year old in the US.”
Microsoft is so keen to make that clear that the company says it twice on Tay's homepage. Though I'm not sure Microsoft's target demo will understand that page, because there the company uses all that old-person grammar crap.
To wit, here's a banner at the top announcing Tay's nap time:
Ah, just saw the lack of a period on the last sentence. That's keeping it real!
In any event, Microsoft unleashed Tay on Twitter, or maybe it would be fairer to say the company unleashed Twitter on Tay, because in less than 24 hours, miscreants, racists, ne'er-do-wells, and pranksters had her saying all kinds of things. Business Insider grabbed a few, and The New York Times covered some more, but the tl;dr version is that it wasn't pretty.
In a statement given to The Times, Microsoft said,
Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways. As a result, we have taken Tay offline and are making adjustments.
The pursuit of an AI that can talk with us, or pretend to talk with us, has been a staple of science fiction and computer science alike for decades. Siri, Cortana (I wonder who Cortana's target demo is?), and whatever it is that Facebook is working on show that some progress is being made, but Tay is a lovely demonstration of how far we haven't come.
It's also a lovely demonstration of just how easily asshats and jackanapes can ruin things through the Internet.
[Updated to note that many of Tay's tweets are still available. – Editor]