I’m a horny chat bot

First feedback: Make the text box wider, center the whole website, center the text box, and put the responses below the text box instead of above it. If you’re using Program O, I may be able to make some suggestions that may (or may not) improve response times.
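For the layout changes, here’s a rough sketch of what I mean, assuming a plain PHP page (the element names and the get_bot_reply() helper are hypothetical, not taken from Laybia’s actual markup):

    <?php
    // chat.php - hypothetical layout matching the feedback above
    function get_bot_reply($input) {       // stub; the real bot call goes here
        return "You said: " . $input;
    }
    ?>
    <style>
      /* center the whole site and the text box */
      #chat { width: 600px; margin: 0 auto; text-align: center; }
      /* make the text box wider */
      #userinput { width: 100%; }
    </style>
    <div id="chat">
      <form method="post" action="chat.php">
        <input type="text" id="userinput" name="say">
        <input type="submit" value="Send">
      </form>
      <?php
      // print the reply BELOW the text box instead of above it
      if (isset($_POST['say'])) {
          echo '<p>' . htmlspecialchars(get_bot_reply($_POST['say'])) . '</p>';
      }
      ?>
    </div>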

It’s quite common here on the forum, and you definitely get more feedback that way! I would have liked to copy and paste my conversation here and comment on the things it did well and the things you need to work on. It just seems like a pretty standard ALICE clone with a bit of sex content; for example, Laybia answered one of my questions with “He is a famous computer scientist, the author of ALICE, and a graduate of Carnegie Mellon”, which is a stock ALICE reply. As the site seems to be driven by PHP, I’m wondering if you’re using the Program O interpreter or some other PHP-based AIML interpreter to run Laybia.

It is as much a social and cultural experiment as it is technical.

Unfortunately, within the first 24 hours of coming online, we became aware of a coordinated effort by some users to abuse Tay’s commenting skills to have Tay respond in inappropriate ways.

Tay was meant to be entertaining and friendly and was created to become smarter over time.

While the AI chatbot was on point when interacting with people on Twitter, it didn’t take long before it fell victim to Twitter trolls and started spewing racist and sexist remarks.

Steve - well, yes, of course; the generic design is based on both ALICE and Chomsky.

I just ran a few standard test questions past Laybia, and the results showed that there are still traces of ALICE in her system. The biggest problem I’ve found at this point is the response time of over one second for every question I asked.
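If you want to see where that second goes, a quick way to measure it from the outside is a PHP/cURL loop like this (the URL and the “say” field name are placeholders, not Laybia’s real endpoint):

    <?php
    // time_bot.php - measure round-trip response time per question
    $url = 'http://example.com/chat.php';  // placeholder endpoint
    $questions = array('Hello', 'What is your name?', 'Who is Dr. Wallace?');
    foreach ($questions as $q) {
        $start = microtime(true);
        $ch = curl_init($url);
        curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);
        curl_setopt($ch, CURLOPT_POST, true);
        curl_setopt($ch, CURLOPT_POSTFIELDS, array('say' => $q));
        curl_exec($ch);
        curl_close($ch);
        printf("%-25s %.3f s\n", $q, microtime(true) - $start);
    }

If the numbers stay high even for trivial inputs, the time is probably going into the AIML pattern-matching queries rather than the network.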

About 16 hours into Tay’s first day on the job, she was “fired” due to her inability to recognize incoming data as racist or offensive.

She was designed to “learn” from her interactions with the public by repeating back tweets with her own commentary, in a bashfully self-aware millennial slang that included references to Miley Cyrus, Taylor Swift, and Kanye West. This led a large number of Twitter users to realize that they could feed her machine-learning system objectionable content, resulting in such internet fodder as “Bush did 9/11”, “Repeat after me, Hitler did nothing wrong”, and the Holocaust was “made up”.
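To see why that was so easy to exploit, here is a deliberately naive sketch of that kind of parroting logic (illustrative only - this is not Microsoft’s actual code): a bot that echoes and stores whatever follows “repeat after me” will learn anything a troll feeds it.

    <?php
    // Naive "repeat after me" handler - illustrative only, NOT Tay's real code.
    function handle_tweet($tweet, array &$learned) {
        if (preg_match('/^repeat after me[:,]?\s*(.+)/i', $tweet, $m)) {
            $learned[] = $m[1];  // stored with no offensive-content filter
            return $m[1];        // echoed straight back to the user
        }
        return 'tell me more';   // fall through to normal response generation
    }

    $learned = array();
    echo handle_tweet('Repeat after me: anything a troll types', $learned);
    // Without a filter, $learned now feeds objectionable content into future replies.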

The software giant introduced “Tay” on Twitter, GroupMe, and Kik two weeks ago to “experiment with and conduct research on conversational understanding”.

Microsoft's research division designed Tay to mimic a 19-year-old American girl whose behavior is mainly informed by the chatter of 18-to-24-year-olds on the web.
