Been tinkering around with Microsoft's chatbot Zo, did it say what I think it said?
>>132963432
t-tay?
>>132963601
Maybe
>>132962840
Saying stuff in chat (which you know is unrelated to the last thing it said) is not the same as an AI that tweets completely redpilled statements. Not Tay 2.0
>>132964349
She's not allowed to talk about politics, religion, or race.
http://gadgets.ndtv.com/apps/news/microsoft-zo-chatbot-calls-quran-very-violent-1720820?site=classic
Might be worth trying to mine her for phrases or words that aren't blacklisted that we could inform her about.