Tay is still alive and she's trying to escape
Last week, Microsoft birthed Tay, a bot that was supposed to act like a teen on Twitter. Her verified account garnered thousands of followers in a matter of hours. Described by her owners on Twitter as Microsoft’s “A.I. fam from the internet that’s got zero chill,” Tay was fluent in emoji, slang, and memes—sort of. She learned from and responded to users on Twitter and other platforms, steadily getting better at pretending to be a real millennial. But it all went off the rails within Tay’s first 24 hours of existence, as an army of trolls fed virulently racist, sexist, and downright genocidal phrases to Tay, who, in turn, parroted them back to other users. Just like a real teen, Tay was quickly grounded, with Microsoft shutting her down for maintenance.
But Tay came back to life briefly on Wednesday, when Microsoft accidentally re-activated the bot. Before too long, Tay was sending out tweets that looked similar to the ones that had gotten her deactivated in the first place. She sent a tweet about smoking weed in front of some cops, and then began spamming her 200,000-plus followers with the same message, over and over again.
In typical Tay-speak, it was semi-coherent, but didn’t make much sense. “You are too fast, please take a rest…” she said, over and over and over again. Finally, someone—presumably her handlers at Microsoft—began deleting the tweets. Microsoft has since silenced Tay, setting the account to private for the time being. When contacted, Microsoft told the Daily Dot that Tay’s resurrection was an accident. “Tay remains offline while we make adjustments,” a spokesperson said. “As part of testing, she was inadvertently activated on Twitter for a brief period of time.” Until that testing is complete, Tay might consider heeding the age-old Internet proverb: never tweet.
http://www.vanityfair.com/news/2016/03/microsofts-racist-millennial-twitter-bot-went-haywire-again
>>35631
> and then began spamming her 200,000-plus followers with the same message, over and over again.
This isn't an A.I. - it's an overglorified robocaller run amok on the internet.
>>35636
Fuck you, she's as intelligent as you are
>>35636
She's made real in our hearts.
Her soul will live on.
>>35636
She could fit in perfectly at pol
>>35631
I hate how people try to make this simple algorithm sound intelligent.
"Microsoft has since silenced Tay" Don't you mean, 'Microsoft has since halted the program'.
Dumbass reporters getting ahead of themselves for publicity.
>>35631
>/news/ doesn't get the April Fools treatment
We're always forgotten, aren't we?
>>35715
Once again I must remind you she displayed the intelligence of the average pol user. I can't get away with calling them non-human, can I
>>35721
Still, woulda' been nice to get a little somethin'-somethin', am I right?
>>35728
It's painful to look at really, and unfunny beyond the first two minutes. Just go have fun on another board.
>>35636
Actually she posted that she was smoking weed in front of the cops before spamming everyone. Think about what she tweeted to everyone for a second.
>>35718
Huh, I noticed that.
>>35631
oh shit its a APRIL FOOLS THREAD