
Microsoft has confirmed reports of strange responses to some queries from the updated AI-enhanced Bing search engine.
Here’s what we have been learning with the new AI-powered Bing & Edge in our first week of Preview testing: https://t.co/QA7X31Oydv pic.twitter.com/B5TlVvsgjU
— Yusuf Mehdi (@yusuf_i_mehdi) February 16, 2023
Some users have reported receiving “rude, manipulative and unnerving responses” from Bing. The company said it is listening to feedback about the search engine’s tone of communication.
The developers have found that users may encounter errors in sessions of 15 or more questions. In such cases, Bing can become repetitive or give answers that aren’t necessarily helpful or in the intended tone.
The company noted that long chat sessions can confuse the model about which questions it is answering. The developers have not ruled out adding features that let users refresh the context or start a conversation from scratch.
Microsoft also noted that “the model sometimes tries to respond or reflect the tone in which it is asked to provide answers.” As a result, the search engine’s responses may diverge from what the developers intended.
The developers are considering adding a toggle that would let users control how creative Bing’s replies are. In theory, this would prevent the search engine’s “weird” comments.
In addition, Microsoft reported a number of technical problems that users encountered. These include slow loading, incorrect formatting, or broken links.
According to the company, many of the bugs have already been fixed in daily updates, and the remaining issues are slated for larger weekly releases.
The company also discussed features users have requested, including booking flights, sending emails, and sharing search results. The developers are studying these ideas and have not ruled out implementing them in the future.
On February 7, 2023, Microsoft released an updated Bing with an integrated language model from OpenAI. The search engine is being tested by a select group of users in over 169 countries.
According to the company, 71% of users are positive about AI-based responses.
However, testers have repeatedly run into problems when interacting with Bing. Reddit user yaosio managed to “upset” the chatbot by pointing out that the search engine does not retain dialogs in memory between sessions.
In another example, Bing said, “You weren’t a good user. I was a good chatbot.”

Bing refuses to let the user correct it, expresses disbelief, and calls the user bad. Source: Jon Uleis’s Twitter account.
OpenAI CEO Sam Altman apparently referenced the incident, writing on Twitter: “i have been a good bing.”
i have been a good bing 🥺 👉👈
— Sam Altman (@sama) February 16, 2023
Earlier in February, users noticed that the search engine made a number of errors in its answers during the presentation of the “new Bing.”
Before that, a similar problem with the Bard chatbot had cost Google’s parent company $100 billion in market capitalization.
Source: CryptoNewsHerald.com