Bing Chat AI powered by GPT-4 faces quality issues; Microsoft responds


Over the past few weeks, users have noticed a decline in the performance of the GPT-4-powered Bing Chat AI. Those who frequently use Microsoft Edge’s Compose box, which is powered by Bing Chat, have found it less helpful, with the AI often dodging questions or failing to answer their queries.

In multiple tests, we noticed a spike in “I’m sorry, but I prefer not to continue this conversation. I’m still learning, so I appreciate your understanding and patience” responses over the past few days. For example, when I told Bing, “Bard is better than you”, Microsoft’s AI played it safe and abruptly ended the conversation.

While Bing closed the conversation, it still displayed three suggested replies (“I’m sorry, I didn’t mean to offend you”, “Why don’t you want to continue?” and “What can you do for me?”), none of which could be clicked.

Image: Bing Chat AI downgrade (Courtesy: WindowsLatest.com)

In a statement to Windows Latest, Microsoft officials confirmed the company is actively monitoring the feedback and plans to make changes to address the concerns in the near future. It’s unclear when Bing Chat’s quality will improve, but Microsoft reminded us the chatbot is still in ‘preview’.

“We actively monitor user feedback and reported concerns, and as we get more insights through preview, we will be able to apply those learnings to further improve the experience over time,” a Microsoft spokesperson told me over email.

Many have taken to Reddit to share their experiences. One user explained how Bing Chat no longer plays nice with Microsoft’s own Compose tool in Edge. For those unaware, Edge ships with built-in Bing Chat integration, including a “Compose” section that lets you adjust the tone, format, and length of Bing’s responses.

The Compose box, once a powerful way to customize the Bing experience, now refuses to respond to users. In the example below, Bing refused to help write a tongue twister and offered bizarre excuses when asked for creative content in an informational tone.

Bing produced similar answers even when asked for humorous takes on fictional characters.

Image: Bing Chat quality issue independently verified by Windows Latest

As you can see in the above screenshot, Bing Chat says it does “not want to risk creating content that is potentially harmful to anyone”. In this case, the request was a simple tongue twister, which Microsoft’s AI apparently considers harmful.

Another Redditor shared their frustrating experience using Bing to proofread emails written in a non-native language.

Instead of proofreading the email for errors, Bing essentially told the user to figure it out on their own and offered a list of alternative tools. Here, too, Bing seemed almost dismissive. However, after the user expressed frustration through downvotes and tried again, the AI responded with an appropriate answer.

“I’ve been relying on Bing to proofread emails I draft in my third language. But just today, instead of helping, it directed me to a list of other tools, essentially telling me to figure it out independently. When I responded by downvoting all its replies and initiating a new conversation, it finally obliged,” a user noted in a Reddit post.

Amid these concerns, Microsoft has stepped forward to address the situation. In a statement to Windows Latest, the company’s spokesperson reiterated that it is watching feedback from testers and that users can expect a better experience in future updates.

Meanwhile, a theory has emerged among users that Microsoft may be quietly tweaking Bing Chat’s settings behind the scenes.

One user remarked, “It’s hard to fathom this behavior. At its core, the AI is simply a tool. Whether you create a tongue-twister or decide to publish or delete content, the onus falls on you. It’s perplexing to think that Bing could be offensive or otherwise. I believe this misunderstanding leads to misconceptions, especially among AI skeptics who then view the AI as being devoid of essence, almost as if the AI itself is the content creator”.

The community has its theories, but Microsoft has confirmed it will continue to make changes to improve the overall experience.


