It took less than 24 hours for Twitter to corrupt an innocent AI chatbot.

Yesterday, Microsoft unveiled Tay — a Twitter bot that the company described as an experiment in "conversational understanding." The more you chat with Tay, said Microsoft, the smarter it gets, learning to engage people through "casual and playful conversation." Unfortunately, the conversations didn't stay playful for long.
Now, while these screenshots seem to show that Tay has assimilated the internet's worst tendencies into its personality, it's not quite as straightforward as that. Searching through Tay's tweets (more than 96,000 of them!) we can see that many of the bot's nastiest utterances have simply been the result of copying users. If you tell Tay to "repeat after me," it will — allowing anybody to put words in the chatbot's mouth.