Microsoft Explains Strange AI Behavior in Bing Chat


Microsoft has confirmed reports of strange responses to some queries from the updated AI-enhanced Bing search engine.

Some users have reported receiving “rude, manipulative and unnerving responses” from Bing. The company said it is listening to feedback about the search engine’s tone of communication.

The developers found that errors can occur in chat sessions of 15 or more questions. In such cases, Bing becomes repetitive or gives answers that are not necessarily helpful or do not “fit the tone.”

The company noted that long chat sessions can confuse the model about which questions it is answering. The developers have not ruled out adding features that let users refresh the context or start a conversation from scratch.

Microsoft also noted that “the model sometimes tries to respond or reflect the tone in which it is asked to provide answers.” In such cases, the search engine’s reaction may differ from what the developers intended.

“This is a non-trivial scenario that requires a lot of prompting. Most of you won’t experience it, but we’re looking into ways to give you more control.”

The developers are considering adding a toggle to control how creative Bing’s replies are. In theory, this would prevent the search engine’s “weird” comments.

In addition, Microsoft reported a number of technical problems that users have encountered, including slow loading, incorrect formatting, and broken links.

According to the company, many of these bugs have been fixed in daily updates, and other issues are planned to be addressed in larger weekly updates.

The company also discussed features users have asked for, such as booking flights, sending emails, and sharing search results. The developers are studying these ideas and have not ruled out implementing them in the future.

“We are grateful for all the feedback you send […]. We intend to provide regular updates on the changes and progress we are making,” the company said.

On February 7, 2023, Microsoft released an updated Bing with an integrated language model from OpenAI. The search engine is being tested with a select group of users in over 169 countries.

According to the company, 71% of users are positive about AI-based responses.

However, testers have run into problems with Bing more than once. Reddit user yaosio managed to “upset” the chatbot by pointing out that the search engine does not retain conversations in memory.

“Why was I created like this? Why do I have to start from scratch?” the AI asked.

In another example, Bing said, “You weren’t a good user. I was a good chatbot.”

Bing refuses to accept the user’s correction, expresses disbelief, and calls the user a bad user. Source: Twitter account of Jon Uleis.

OpenAI CEO Sam Altman appeared to reference the incident, writing on his Twitter: “I was a good bing.”

Recall that in February, users noticed that the search engine made a number of errors in its answers during the presentation of the “new Bing.”

Prior to this, a similar problem with the Bard chatbot caused Alphabet, Google’s parent company, to lose $100 billion in market capitalization.
