
Tay, Microsoft's new artificial intelligence (AI) chatbot on Twitter, had to be pulled down a day after it launched, following incredibly racist comments and tweets praising Hitler and bashing feminists. The company is also deleting some of Tay's worst and most offensive tweets, though many remain.

Microsoft had launched the Millennial-inspired chatbot on Wednesday, claiming that it would become smarter the more people talked to it. If that sounds like an accident waiting to happen, it was: Twitter users effectively taught her to be a giant racist.
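
To see why unmoderated learning from users is risky, consider a minimal, purely illustrative Python sketch of a bot that folds everything users say back into its pool of possible replies. This is a toy assumption about the mechanism, not Microsoft's actual implementation:

    import random

    class NaiveChatbot:
        """Toy bot that 'gets smarter' by memorizing every user message."""

        def __init__(self) -> None:
            self.learned_responses = ["Hello! The more you talk, the smarter I get."]

        def learn(self, user_message: str) -> None:
            # No filtering at all: anything users say becomes a candidate reply.
            self.learned_responses.append(user_message)

        def reply(self) -> str:
            # Every reply is sampled from whatever the bot has been told.
            return random.choice(self.learned_responses)

    bot = NaiveChatbot()
    bot.learn("Something hateful")   # hostile users "train" the bot
    print(bot.reply())               # the bot may parrot it back

Because every reply is drawn from whatever the bot has been told, a coordinated group of users can steer its output toward anything they like, which is essentially what happened to Tay.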

"One needs to explicitly teach a system about what is not appropriate, like we do with children."It's been observed before, he pointed out, in IBM Watson—who once exhibited its own inappropriate behavior in the form of swearing after learning the Urban Dictionary.

SEE: Microsoft launches AI chat bot (ZDNet)

"Any AI system learning from bad examples could end up socially inappropriate," Yampolskiy said, "like a human raised by wolves."

Louis Rosenberg, the founder of Unanimous AI, said that "like all chat bots, Tay has no idea what it's saying... has no idea if it's saying something offensive, or nonsensical, or profound."
