AI bot goes from chat to full Nazi in less than 24 hours

Another harbinger of the AI future; will anyone listen?


3/24/16 – Okay, this is so upsetting that I had to post it. Tech geeks and futurists are forging ahead with AI despite repeated warning signs, which keep presenting themselves, of consequences that are not just possible but likely horrific.

This so reminds me of the book of Revelation and biblical prophecies of an unprecedented and massive increase in the speed of knowledge in the end times.

“…many will go back and forth, and knowledge will increase.”  -Daniel 12:4

So many prophecies that could not have been fulfilled at any previous point in history are being fulfilled right now.

Do you know Yeshua Jesus?  If you don’t, please stop right now and repent.  Ask Him into your heart, to change you from the inside out, and to fill you with His Holy Spirit.

Please read the full story at InfoWars.  Here’s an excerpt:

A Microsoft-created AI chatbot designed to learn through its interactions has been scrapped after surprising creators by spouting hateful messages less than a day after being brought online.

The Tay AI bot was created to chat with American 18 to 24-year-olds and mimic a moody millennial teen in efforts to “experiment with and conduct research on conversational understanding.”

Microsoft described Tay as an amusing bot able to learn through its online experiences.

“Tay is designed to engage and entertain people where they connect with each other online through casual and playful conversation,” Microsoft stated. “The more you chat with Tay the smarter she gets.”

But users soon picked up on the bot’s algorithms, training the computer simulation to espouse hatred towards Jews and feminism and even pledge support for Donald Trump…

FULL STORY:

Microsoft’s AI Bot Goes from Benevolent to Nazi in Less than 24 Hours

© 2016, ShofarBlastBlog.com
