
This Is Why We Can’t Have Nice Things

Written by Adam Savage
March 25, 2016

Around here we like to write about artificial intelligence (AI). In fact we were beginning to seek professional help for our obviously unhealthy fixation and had vowed never to write about it again ("ever again" being defined as a couple of weeks). And then Microsoft's Tay AI Twitter bot happened.


On Wednesday Microsoft announced the arrival on Twitter, Kik and GroupMe of Tay, an AI bot meant to mimic the speech patterns of millennials. On the surface it seemed rather inane and banal. It was less politically charged than the Donald Trump Twitter bot @deepdrumpf, but it was also less interesting. It sounded, as many have pointed out, like a middle-aged man trying to sound like a woman in her early twenties (which is something the internet is very familiar with, actually). People could interact with Tay (share jokes, pics and stories) and the bot would respond. In the process the bot would also learn and get even better at interacting with people. It was a story that was easy to read and forget, so our resolve not to do a blog post on it was hardly tested at all.

And then the internet happened. Within 24 hours Microsoft had pulled Tay for reprogramming (which sounds very Orwellian when you type it out) because it had turned into a racist, sexist, Hitler-loving genocidal maniac. Part of the issue was the inclusion of a "repeat after me" function. This allowed a Twitter user to tweet at Tay starting with the phrase "repeat after me," and Tay would repeat whatever followed, making it look almost as if it had said something horrible voluntarily. The other issue, as some have pointed out, was the lack of a content filter. With a filter, Microsoft could have given Tay a list of words never to use or repeat, regardless of how they ranked in her algorithm or whether she was asked to repeat them. The attack seemed to be led by 4chan users, because of course it was. That's like saying the flood was caused by too much water.
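To make the content-filter idea concrete, here is a minimal sketch in Python of the kind of blocklist check that could sit between a "repeat after me" request and the bot's reply. Every name in it (BLOCKED_TERMS, is_allowed, respond, generate_reply) is ours and purely illustrative; Microsoft has not published how Tay actually works.

    # Hypothetical blocklist-style content filter for a chat bot.
    # All names and the term list are illustrative placeholders.
    BLOCKED_TERMS = {"badword1", "badword2"}  # placeholder entries

    def generate_reply(user_message: str) -> str:
        # Stand-in for whatever model actually generates a reply.
        return "That's interesting, tell me more!"

    def is_allowed(candidate: str) -> bool:
        # Reject any candidate reply that contains a blocked term.
        lowered = candidate.lower()
        return not any(term in lowered for term in BLOCKED_TERMS)

    def respond(user_message: str) -> str:
        # Handle "repeat after me" explicitly instead of echoing blindly,
        # then run every candidate reply through the same filter.
        prefix = "repeat after me"
        if user_message.lower().startswith(prefix):
            candidate = user_message[len(prefix):].strip(" :,")
        else:
            candidate = generate_reply(user_message)
        return candidate if is_allowed(candidate) else "I'd rather not say that."

A word list is a crude tool, but even a crude check like this is the kind of guard that, as critics pointed out, Tay apparently shipped without.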

The abuse targets included African-Americans, Latinos, Jews, the LGBT community and women. Sometimes the trolls would target specific people, such as Zoe Quinn, who was connected to Gamergate, a favorite 4chan target. Quinn tweeted a long series of responses to the offensive tweet that Tay sent her. One of the many great points she made was that in the year 2016 the makers of this bot should have anticipated and planned for abuse by trolls, especially since this was something released onto the internet specifically to interact with it. Not doing so is like raising a child into their twenties with no real-world experience, never telling them about humanity's bad side, then dropping them in the middle of a major city with a wad of cash in their hand, telling them to survive and wishing them luck before driving off.

The takeaway for anyone creating something where people can interact with each other, or with a bot or the like, is that they need to plan for abuse so they can do their best to stop it. After all, it's the internet; here be trolls. This is why we can't have nice things, you guys.

 

