With AI support, Microsoft finally wants to challenge Google. Since its launch, users have wondered why the chatbot calls itself Sydney. Now the company has revealed the secret.
It was supposed to be the big breakthrough: At a presentation two weeks ago, Microsoft unveiled its internet search of the future, complete with AI support and a chatbot built on the technology behind ChatGPT. Since then, users have wondered about the name the bot gave itself. Now the secret has been revealed.
“Sydney is an old codename for a chat feature that we’ve been testing in India since 2020,” a Microsoft spokesperson told The Verge. “The insights we gained from it have also helped us in our work on the new Bing preview.”
A chatbot named Sydney
In fact, as early as late 2021, users in China and India noticed that the Bing chatbot identified itself as Sydney. A corresponding question can also be found in Microsoft’s support forum: one user there tried to find Sydney again after the bot had disappeared from their browser following a test run.
However, Sydney only became famous in the last few weeks. After Microsoft made the chatbot accessible to a small group of select users, they quickly began to probe the strengths and weaknesses of the AI behind it – and in the process discovered that the bot occasionally referred to itself as Sydney.
A short leash for the chatbot
However, Sydney didn’t behave quite the way Microsoft envisioned. Drawn by users into highly complex conversations, the bot let itself be carried away into all sorts of problematic statements: it insulted some users, confessed its love to others, and diagnosed itself with emotional issues and a dangerous addiction to excessive emoji use. Numerous absurd, funny and slightly disturbing examples can be found here.
Microsoft ultimately felt compelled to pull the ripcord. Because the bot tended to grow more and more confused as sessions lengthened and questions piled up, and could then be provoked into aggressive, insulting or otherwise problematic answers, a clear limit was imposed: each session is now capped at five questions, and each user may ask the bot no more than 50 questions per day. Since then, Sydney seems to be better behaved.
Sources: The Verge, Microsoft Community