Microsoft’s Politically Correct Chatbot Is Even Worse Than Its Racist One
JULY 31, 2018
Zo takes an uncompromising approach that shuts down conversation.
Every sibling relationship has its clichés. The high-strung sister, the runaway brother, the over-entitled youngest. In the Microsoft family of social-learning chatbots, the contrasts between Tay, the infamous, sex-crazed neo-Nazi, and her younger sister Zo, your teenage BFF with #friendgoals, are downright Shakespearean.
When Microsoft released Tay on Twitter in 2016, an organized trolling effort took advantage of her social-learning abilities and immediately flooded the bot with alt-right slurs and slogans. Tay copied their messages and spewed them back out, forcing Microsoft to take her offline after only 16 hours and apologize.
https://www.nextgov.com/emerging-tech/2018/07/microsofts-politically-correct-chatbot-even-worse-its-racist-one/150170/