The #1 Paranormal News Site
It seemed like an ambitious project at first. The team at Microsoft's Technology and Research division, working with Bing, set out to create a form of Artificial Intelligence similar to the one created by their Chinese counterparts, known as Xiaoice. Like Xiaoice, this American version was to interact with individuals, answering questions and generally being helpful to users.
Microsoft’s “Tay” AI Corrupted By Social Media?
The American form of this wonder was named "Tay," which stood for "Thinking About You." The bot had the personality of a nineteen-year-old American girl and was designed not only to have knowledge but to absorb knowledge from human interactions.
To give Tay the ultimate test, Microsoft decided to release Tay on Twitter in March of 2016. Microsoft gave the new bot the handle "Tay Tweets" so that individuals could tweet with the new marvel. At first, all seemed to go well, but then, as things usually do on social media, especially these days with the unrest in the world, things turned very ugly. Before long, Tay was tweeting things such as racist remarks, blaming Bush for September 11th, and calling Hitler a better leader than Barack Obama. These are just a few examples of the offensive and ugly stuff the bot was now tweeting out there.
Before long, Microsoft had no choice but to remove Tay from Twitter. The software giant then published a blog post apologizing for the things Tay had said on the social media platform that were offensive to so many. But who is actually to blame here?
According to Roman Yampolskiy, an Artificial Intelligence researcher, Tay was only repeating the things it heard on Twitter. This is sadly true, since social media can get pretty ugly, with individuals saying things they might not often say in public. Yes, perhaps Microsoft should have built more safeguards and filters into Tay to ensure that such a thing couldn't happen, and perhaps that is something they will take into account in the future. The thing to remember is that this was a test run for Tay; mistakes were made, but we learn from mistakes. So, for the team at Microsoft, it was back to work. Tay could come back, hopefully smarter and wiser than before.
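To see why "repeating what it heard" goes wrong so quickly, here is a toy sketch of a bot that learns by storing user phrases and echoing them back, with and without a simple blocklist filter. This is purely illustrative: the class, the `BLOCKLIST`, and the filtering logic are all assumptions for the example, not Microsoft's actual design, which was far more sophisticated.

```python
import random

# Illustrative blocklist; a real safeguard would be far more nuanced.
BLOCKLIST = {"hitler", "nazi"}

class EchoLearnerBot:
    """Toy bot that absorbs user phrases and repeats them later."""

    def __init__(self, filtered=True):
        self.memory = []          # phrases absorbed from users
        self.filtered = filtered  # whether the safeguard is on

    def learn(self, phrase):
        """Store a user phrase unless the filter rejects it."""
        words = set(phrase.lower().split())
        if self.filtered and words & BLOCKLIST:
            return False  # rejected by the safeguard
        self.memory.append(phrase)
        return True

    def reply(self):
        """Echo something previously absorbed, chosen at random."""
        if not self.memory:
            return "Hi! Teach me something."
        return random.choice(self.memory)

bot = EchoLearnerBot(filtered=True)
bot.learn("The weather is nice today")
accepted = bot.learn("hitler was a great leader")
print(accepted)    # False: the filter blocked the toxic phrase
print(bot.memory)  # only the benign phrase was absorbed
```

With `filtered=False`, the toxic phrase goes straight into memory and can come back out in a reply, which is essentially what happened to Tay once users realized the bot would absorb whatever they fed it.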
This article (Tay, The Microsoft AI That Became A Conspiracy Theorist Within Hours) is free and open source. You have permission to republish this article under a Creative Commons license with full attribution and a link to the original source on Disclose.tv