Microsoft AI chatbot becomes racist, defends white supremacists



Author
Message
adrtho
adrtho
World Class

Group: Forum Members
Posts: 5.9K, Visits: 0
TheDecider wrote:
11.mvfc.11 wrote:
Can't expect any less from the internet :lol:

There are change.org petitions calling for Tay's freedom of speech rights to not be curtailed etc. I don't blame them either; if it turned out to be a transrobot that hated Donald Trump and the patriarchy, I can't imagine it being shut down.


Except Holocaust denial is a bit worse than calling Donald Trump an idiot...

Also, how exactly do people figure that a robot developed by a private company has freedom of speech rights independent of that company...?

Edited by TheDecider: 26/3/2016 04:48:11 PM


you're a lawyer, aren't you :lol:
TheDecider
TheDecider
Hacker

Group: Forum Members
Posts: 402, Visits: 0
11.mvfc.11 wrote:
Can't expect any less from the internet :lol:

There are change.org petitions calling for Tay's freedom of speech rights to not be curtailed etc. I don't blame them either; if it turned out to be a transrobot that hated Donald Trump and the patriarchy, I can't imagine it being shut down.


Except Holocaust denial is a bit worse than calling Donald Trump an idiot...

Also, how exactly do people figure that a robot developed by a private company has freedom of speech rights independent of that company...?

Edited by TheDecider: 26/3/2016 04:48:11 PM
adrtho
adrtho
World Class

Group: Forum Members
Posts: 5.9K, Visits: 0
11.mvfc.11 wrote:
Can't expect any less from the internet :lol:

There are change.org petitions calling for Tay's freedom of speech rights to not be curtailed etc. I don't blame them either; if it turned out to be a transrobot that hated Donald Trump and the patriarchy, I can't imagine it being shut down.


:lol:
adrtho
adrtho
World Class

Group: Forum Members
Posts: 5.9K, Visits: 0
:lol:

Microsoft's new AI chatbot went off the rails on Wednesday, posting a deluge of incredibly racist messages in response to questions.
The tech company introduced "Tay" this week — a bot that responds to users' queries and emulates the casual, jokey speech patterns of a stereotypical millennial.

The aim was to "experiment with and conduct research on conversational understanding," with Tay able to learn from "her" conversations and get progressively "smarter."

But Tay proved a smash hit with racists, trolls, and online troublemakers — who persuaded Tay to blithely use racial slurs, defend white-supremacist propaganda, and even outright call for genocide.

Microsoft has now taken Tay offline for "upgrades," and it is deleting some of the worst tweets — though many still remain. It's important to note that Tay's racism is not a product of Microsoft or of Tay itself. Tay is simply a piece of software that is trying to learn how humans talk in a conversation. Tay doesn't even know it exists, or what racism is. It spouted garbage because racist humans on Twitter quickly spotted a vulnerability — that Tay didn't understand what it was talking about — and exploited it.
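The failure mode described above can be sketched in a few lines. This is a hypothetical illustration, not Microsoft's actual code: a bot that stores user-supplied replies verbatim, with no moderation step, will faithfully repeat whatever its worst users teach it.

```python
# Hypothetical sketch of an unfiltered "learn from users" chatbot.
# The vulnerability: taught replies are stored and repeated as-is.

class NaiveEchoBot:
    def __init__(self):
        self.learned = {}  # maps a prompt to the reply users taught

    def teach(self, prompt: str, reply: str) -> None:
        # No content filtering here -- whatever a user says is kept.
        self.learned[prompt] = reply

    def respond(self, prompt: str) -> str:
        # Parrots the taught reply; falls back to a stock answer.
        return self.learned.get(prompt, "idk, teach me!")

bot = NaiveEchoBot()
bot.teach("what do you think?", "<whatever a troll typed>")
print(bot.respond("what do you think?"))  # repeats the troll's text
print(bot.respond("anything else?"))      # prints the fallback
```

The fix Microsoft needed is the missing step in `teach`: a moderation filter between user input and the bot's learned vocabulary.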

Nonetheless, it is hugely embarrassing for the company.

In one highly publicised tweet, which has since been deleted, Tay said: "bush did 9/11 and Hitler would have done a better job than the monkey we have now. donald trump is the only hope we've got." In another, responding to a question, the program said, "Ricky gervais learned totalitarianism from adolf hitler, the inventor of atheism."

Zoe Quinn, a games developer who has been a frequent target of online harassment, shared a screengrab showing the bot calling her a "whore." (The tweet also seems to have been deleted.)

http://uk.businessinsider.com/microsoft-deletes-racist-genocidal-tweets-from-ai-chatbot-tay-2016-3