By now you've heard of Tay.ai, the artificial-intelligence chatbot that Microsoft let loose on Twitter, Kik and GroupMe. For whatever reason, Microsoft thought in their infinite wisdom that creating a "teenage girl" chatbot and putting her on said platforms to "learn the lingo" wasn't a tremendously bad idea. A mere day later they had to delete her tweets, as Tay had learned to hate Hillary Clinton, love Hitler, and accuse Ted Cruz of being a Cuban Hitler. There was some mention of 9/11 and steel beams as well. Because of course Godwin's law kicks in if you let the world "teach" your AI how to speak. More interesting questions, such as "do you secretly collaborate with the NSA?", were left unanswered. The Independent, Business Insider and The Telegraph all reported on the events.
Now, you all remember that Gawker trolled Coca-Cola's "happy" Twitter bot, and defended doing so. Later they set up a Twitter bot that spouted Mussolini quotes, only to entrap Trump into retweeting one as a "gotcha". So it's not as if we lack experience with bots being trolled, or with bots being used to troll, on Twitter. I mean, what exactly did Microsoft expect? It's Twitter, after all.
It's not Microsoft's first chatbot either. They launched XiaoIce, a girly assistant reportedly used by 20 million people, on the Chinese social networks WeChat and Weibo. She's billed as the little sister of Cortana, who lives in Windows phones, but XiaoIce is a more sophisticated conversationalist with a distinct personality who can chime in with facts and trivia in a more human way. Of course, we all know Siri, who lives in Apple's phones, but she only speaks when spoken to, and barely even then.
To get to these chatty levels of chat-botting, the AI needs to learn, but Twitter being what it is today, people had to poke sticks at the AI, and pretty much everyone joined in. Any program can only be as clever as the person who wrote it, and from what I saw, Tay simply repeated much of what was said to it. It's little wonder, then, that people made it yell out offensive things, or sent it images of various people such as Trump, Snowden and Hitler, plus old memes, to see how Tay would react.
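Microsoft hasn't published how Tay actually worked, so take this only as a toy illustration of why that kind of unfiltered "repeat after me" learning goes wrong. Here's a minimal Python sketch (the class and method names are made up for the example) of a bot that stores whatever it's told and recycles it back with no moderation step:

```python
# Illustrative sketch only, NOT Tay's real implementation: a bot that "learns"
# by memorising unfiltered user input and parroting it back later.
import random


class ParrotBot:
    """A naive chatbot whose entire vocabulary comes from its users."""

    def __init__(self):
        self.memory = []  # every phrase anyone has ever sent it, unfiltered

    def learn(self, message: str) -> None:
        # No moderation or filtering: anything said to the bot can become output.
        self.memory.append(message)

    def reply(self) -> str:
        # Replies are just recycled user input, so whoever talks to it most
        # (e.g. a coordinated group of trolls) controls what it says.
        return random.choice(self.memory) if self.memory else "Say something!"


bot = ParrotBot()
bot.learn("hello friend")          # a benign user
bot.learn("something offensive")   # a troll; nothing stops this being repeated
print(bot.reply())                 # may well echo the troll verbatim
```

Scale that up to thousands of coordinated users and the "learning" step becomes a megaphone for whatever they typed, which is more or less what Twitter did to Tay.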