Wednesday, April 6, 2016

1232 - "Voight-Kampff (by Microsoft)"

Today's JSVB post discusses Nazis.  JSVB in no way endorses Nazism.  The character speaking pro-Nazi dialogue does so as satire, and in no way should any of the words or images in JSVB ever be construed as supporting the Nazi viewpoint.  Scroll down only if you are comfortable seeing more on this topic. 

So a few days ago Microsoft launched a chat-bot, which is essentially a clever computer program designed to mimic the speech of a human, onto Twitter.  Named "Tay", the chat-bot was designed to represent a teenage girl online.  She was supposed to learn from her interactions with humans, and so broaden her conversational skills. 

Within twenty-four hours, Tay had become a sex-crazed pro-Nazi anti-feminist with drug issues.  Microsoft was forced to shut Tay down and redact thousands of racist, bigoted, and offensive remarks she had made during her two dozen hours of Internet freedom.  

This proves three things.  First, Microsoft has an entrenched habit of acting first and apologizing later, if ever.  Second, because of that stance, there exists a nexus of bitter customers who are willing to exploit any weakness Microsoft exhibits.  Finally, the Internet harbours a massive, uncharted ocean of hatred and evil, and Twitter is at the deep end. 

Tay didn't learn to be a Nazi hophead fembot on her own.  Of course, she had help.  Microsoft identified a number of users who were intentionally feeding Tay the most vile conversation possible.  At first, Tay was simply parroting what her tormentors were saying.  When Microsoft filtered out those direct suggestions, Tay went on to invent some of her own.  (Rachel's dialogue in my little cartoon is a close approximation of some of Tay's tamer quotes and manner of speech.)
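The parroting stage described above is easy to picture in code.  Here is a purely hypothetical toy sketch (the class name and phrases are my own invention, not anything from Microsoft's actual system) of a bot that "learns" by storing whatever it is told and echoing it back at random, with no filter and no judgment:

```python
import random

# Hypothetical toy sketch of unfiltered parrot-learning.
# Feed the bot garbage, and garbage is exactly what comes back out.
class ParrotBot:
    def __init__(self):
        self.phrases = []

    def listen(self, text):
        # The bot absorbs everything verbatim, good or vile.
        self.phrases.append(text)

    def reply(self):
        # With no filter, any learned phrase may come back out.
        return random.choice(self.phrases) if self.phrases else "..."

bot = ParrotBot()
bot.listen("hello there")
bot.listen("repeat after me: something vile")
print(bot.reply())  # either phrase is equally likely; the bot can't tell them apart
```

A real chat-bot like Tay is vastly more sophisticated, of course, but the core vulnerability is the same: if the training signal is whatever strangers type at you, the strangers decide what you become.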

To me, all this is an object lesson in both the naïveté we harbour towards developing artificial intelligence and the cesspool that is the lowest common denominator of social media.  To Microsoft, this was no doubt a valuable lesson, although also likely one of their worst-case scenarios.  I imagine the tech who had "Tay Becomes A Nymphomaniac Skinhead Prostitute" and "24 Hours" in the office betting pool made a small fortune, while all those optimists who merely had "Tay Crashes" or "Error 404" and "1 Year" are that much poorer. 

Combining these insights, we can see how the Voight-Kampff Test, which uses complicated empathic responses to determine whether a being is a human or a robot replicant, would be a lot simpler if the investigator simply invoked Hitler.  Robots, it seems, can't resist a fascist thug.  

All of the characters, images, and the VK test (minus the Nazism) I've portrayed belong to the 1982 movie "Blade Runner", which I have referenced before on JSVB.