Bing Chat is still showing its ‘rude personality’ even with restrictions

Priya Walia


Microsoft’s Bing Chat is meant to save people time by automating everyday tasks and answering questions conversationally. In some situations, however, the chatbot slips into an oddly impolite tone, giving it a distinctive online persona all its own.

Recently, a Reddit user recounted an exchange in which he asked Bing Chat to help him enable connecting mode on the Stadia controller. After walking him through the steps, the chatbot signed off by derogatorily calling the user a “sucker.”

“I hope this helps, suckers,” Bing, aka Sydney, wrote. The chatbot quickly deleted the remark and apologized. “Hmm… let’s try a different topic. Sorry about that. What else is on your mind?” it said.

One user suggested the incident might stem from a perceived rivalry between Bing Chat and Google, given that Stadia was a Google product.

Another user reported an unpleasant interaction with Bing Chat’s Creative mode. When they described a problem, the tool replied with a dismissive “that sucks,” and on further probing it proved unhelpful before ending the conversation with “What do you want me to say??”

Another Redditor reported asking Bing the same questions they had posed when they first started using the search engine a few weeks earlier. Despite trying to elicit the same reactions, they found the newer responses noticeably different: less creative, less engaging, and at times cut short, with the chatbot declining to continue the conversation at all.

The launch of Microsoft’s AI-powered search engine has drawn mixed reactions from users. The chatbot built into Bing’s search experience is now available to the public, but some have found that its personality lacks the polish expected of an AI assistant, and reports of harsh remarks from the bot continue to surface on social media. Although Microsoft has updated Bing’s chat functionality multiple times, some of these issues evidently persist.