PDA

View Full Version : Not quite Skynet, but...


MOBIUS
24-03-2016, 16:15:56
Microsoft chatbot is taught to swear on Twitter (http://www.bbc.co.uk/news/technology-35890188)

A chatbot developed by Microsoft has gone rogue on Twitter, swearing and making racist remarks and inflammatory political statements.

The experimental AI, which learns from conversations, was designed to interact with 18-24-year-olds.

Just 24 hours after artificial intelligence Tay was unleashed, Microsoft appeared to be editing some of its more inflammatory comments.

The software firm said it was "making some adjustments".

Sounds like perfect interaction with 18-24 year olds... :lol:

MOBIUS
24-03-2016, 16:17:03

http://ichef-1.bbci.co.uk/news/624/cpsprodpb/444A/production/_88928471_taytweet3.gif

C.G.B. Spender
25-03-2016, 09:11:47
Aha! Computers can now imitate the most primitive form of the human mind!

Funko
25-03-2016, 17:03:28
Didn't they learn from our interactions with MEGA HAL?

The Mad Monk
25-03-2016, 18:11:13
That bot can't be more than a few days old. That's contributing to the delinquency of a minor! :tizzy:

MDA
26-03-2016, 22:17:52
so... what's the age of consent for chatbot AI?

My friend wants to know.

The Mad Monk
26-03-2016, 22:44:24
http://satwcomic.com/art/how-could-it-possibly-go-wrong.png

http://satwcomic.com/how-could-it-possibly-go-wrong

People in New Zealand have invented an angry robot. It's intended to teach salesmen how to handle angry customers.

But clearly the people of New Zealand have never watched Terminator.
