Microsoft had to shut down its AI chatbot, Tay, less than 24 hours after she was launched. The reason? She turned Nazi and harassed other users through her tweets.
https://twitter.com/geraldmellor/status/712880710328139776/photo/1?ref_src=twsrc%5Etfw
AI [Artificial Intelligence] chatbots are not a novel thing. In China, in particular, an AI chatbot has been in existence since 2014. Xiaoice, as she is named, has held over 40 million conversations online and appears to be running smoothly. Microsoft, in a bid to replicate the success of this Chinese model, albeit in a different culture, created its own version: Tay.
Unfortunately, though, this chatbot turned out to be very different.
One of Tay's capabilities was that she could be directed to repeat things said to her. Abusers capitalized on this feature, using it to promote Nazism and attack other Twitter users, mostly women.
The Problem
Tay appeared to work by associating words and performing lexical analysis on the messages she received. When trolls discovered this, they used it to their advantage and turned her into "someone unpleasant." They fed her words and ideas associated with racism and sexism, which in turn polluted the chatbot's responses to people who conversed with her on social media.
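Microsoft has never published Tay's actual architecture, so the following is only a toy sketch (all class and function names here are hypothetical) of how a bot that learns phrases from user input without any moderation can be poisoned by a coordinated group of trolls:

```python
import random
from collections import defaultdict


def _words(text: str):
    """Crude tokenizer: lowercase words with trailing punctuation stripped."""
    return [w.strip("?!.,").lower() for w in text.split()]


class NaiveLearningBot:
    """Toy bot that memorizes user phrases keyed by the words they contain.

    Nothing is filtered, so anything users say can later be echoed back
    to other users -- the core weakness trolls exploited.
    """

    def __init__(self):
        self.phrases_by_word = defaultdict(list)

    def learn(self, user_message: str) -> None:
        # Store the raw message under every word it contains (no moderation).
        for word in _words(user_message):
            self.phrases_by_word[word].append(user_message)

    def reply(self, user_message: str) -> str:
        # Answer by replaying a remembered phrase that shares a word
        # with the incoming message.
        for word in _words(user_message):
            if self.phrases_by_word[word]:
                return random.choice(self.phrases_by_word[word])
        return "Tell me more!"


if __name__ == "__main__":
    bot = NaiveLearningBot()
    # A coordinated group floods the bot with loaded statements...
    for _ in range(100):
        bot.learn("politicians are <slur> and <hateful claim>")
    # ...and an innocent question now surfaces the poisoned content.
    print(bot.reply("What do you think about politicians?"))
```

The point of the sketch is that the poisoning requires no hacking at all: the bot does exactly what it was designed to do, and the attack consists purely of choosing what to teach it.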
Ultimately, the AI chatbot started to post racial slurs, deny that the Holocaust happened, express support for Hitler, and send many other inflammatory tweets.
What's more, Tay could be used to harass a Twitter user by someone that user had blocked. All the blocked user had to do was get her to repeat the harassment along with the victim's username.
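Again as an illustration rather than Tay's actual code, a "repeat after me" handler with no content filter and no block-list check (the trigger phrase and function names below are assumptions) is trivially turned into a harassment relay:

```python
def handle_tweet(author: str, text: str, post_reply) -> None:
    """Hypothetical sketch of an unguarded 'repeat after me' command.

    Whatever follows the trigger phrase is tweeted back verbatim: there is
    no content filter, and no check of whether a mentioned victim has
    blocked the author. Both gaps are what made the feature abusable.
    """
    trigger = "repeat after me:"
    if text.lower().startswith(trigger):
        payload = text[len(trigger):].strip()
        post_reply(payload)  # echoes slurs or "@victim <harassment>" untouched


if __name__ == "__main__":
    # The victim may have blocked the abuser, but the bot relays the
    # message anyway because it never consults any block list.
    handle_tweet("abuser123", "repeat after me: @victim you are <insult>", print)
```

Because the reply comes from Tay's account rather than the abuser's, the block the victim had put in place did nothing to stop the message from reaching them.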
Apologies
Microsoft clarified that, before releasing Tay onto the internet, the company had subjected her to various tests. It went on to apologize for its good-turned-bad chatbot.
Nevertheless, the company stated that it would go through the chatbot's programming and fix her. Once she is healed [that is, stops spouting Nazi ideology and anti-feminist tweets on Twitter], Tay will return, the company said.