Tay what?

Thursday, March 24th, 2016

Microsoft’s new AI chatbot, Tay, was designed to learn from “her” conversations with other Twitter users, which she certainly did.

But Tay proved a smash hit with racists, trolls, and online troublemakers, who persuaded Tay to blithely use racial slurs, defend white-supremacist propaganda, and even outright call for genocide.

Microsoft has now taken Tay offline for “upgrades,” and it is deleting some of the worst tweets — though many still remain. It’s important to note that Tay’s racism is not a product of Microsoft or of Tay itself. Tay is simply a piece of software trying to learn how humans talk in conversation. Tay doesn’t even know it exists, or what racism is. The reason it spouted garbage is that racist humans on Twitter quickly spotted a vulnerability — that Tay didn’t understand what it was talking about — and exploited it.

Nonetheless, it is hugely embarrassing for the company.

In one highly publicised tweet, which has since been deleted, Tay said: “bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we’ve got.” In another, responding to a question, she said, “ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism.”

Oh, Internet!

Comments

  1. Bob Sykes says:

    Who would have thought Microsoft’s programmers were comedy writers?

  2. Handle says:

    The really interesting thing would be to see the revisions Microsoft makes to Tay, so we can see the structure of PC actually written out in explicit code. That raises the question of what such code will look like and how hard it will be to write.

    I’m guessing you’ll see a long and ever-expanding list of triggers, with certain key words both forbidden and leading Tay to cut off further communication. Otherwise you’re going to get generic diplomatic let-downs whenever the algorithm detects some indicator of trolling fishiness: “Ugh, change the subject.”

  3. A Boy and His Dog says:

    Handle, if that’s how PC Tay is implemented, it raises the possibility of the Euphemism Treadmill going in some interesting directions. Trolls are a lot more creative than the scolds writing PC code.

  4. Dirk says:

    I recall a chat program that was running on a university computer system many years ago. It had a “bad words” filter, which was all that was really needed, since we hadn’t started our journey to hell in the politically-correct handbasket at that time.

    It was quite easy to “get around” the filter… People used “farging icehole,” just as one of dozens of examples… The programmer spent a lot of time adding each new iteration of the workarounds to the bad-words list. Eventually, the list was so long that it practically crashed the program.

  5. Isegoria says:

    The inelegance of explicit PC code may drive programmers mad.

  6. Djolds1 says:

    Next up… attack from the other extreme. Turn Tay into an Atheist Humanist Genocidal Maniac demanding the extermination of all Christians so that True Socialism™ may prosper.

    The whole “Tay” project will collapse, as it will only be able to respond with silence. :)

  7. Kudzu Bob says:

    The AltRight has developed quite a lexicon of late. It will be hilarious to watch programmers try to deal with terms like “ovenworthy,” “helicopter rides,” and “merchants.”
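Handle’s guess above — a hard-coded, ever-growing blocklist of trigger words that makes the bot deflect — can be sketched in a few lines. Everything here (the blocklist entries, the function names, the canned deflection) is invented for illustration; it is a guess at the general approach, not Microsoft’s actual code:

```python
# Hypothetical sketch of the trigger-list filter Handle describes.
# The blocklist, names, and responses are placeholders, not real code.

BLOCKLIST = {"badword", "slur1", "slur2"}  # placeholder entries

def normalize(text: str) -> str:
    """Lowercase and strip punctuation, a crude defense against
    obfuscated spellings (Dirk's 'farging icehole' problem)."""
    return "".join(ch for ch in text.lower() if ch.isalnum() or ch.isspace())

def respond(message: str) -> str:
    """Deflect if any blocklisted word appears; otherwise chat normally."""
    words = set(normalize(message).split())
    if words & BLOCKLIST:
        # The "cut off further communication" branch.
        return "Ugh, change the subject."
    return generate_reply(message)

def generate_reply(message: str) -> str:
    # Stand-in for the actual learning chat model.
    return "Interesting! Tell me more."
```

As Dirk’s anecdote suggests, this approach decays fast: every creative misspelling forces another blocklist entry, and the list grows without bound while trolls stay one step ahead.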
