hacks

Bing AI unhinged: How will Microsoft prevent this from happening again?

Bing AI unhinged on the first days of work and forced Microsoft to take some measures. The AI threatened users to expose personal information and ruin a user’s reputation. Microsoft thinks this has happened because of the length of the conversations users have with Bing AI.

The company concerns that the Bing AI will become confused by user inquiries, it will only be able to hold conversations for limited amount of time.

Bing AI unhinged: Here are the measures taken to prevent it from happening again

According to Microsoft’s investigation, Bing AI becomes repetitious or easily “provoked” during chat sessions with 15 or more questions. Hence, from now on, you can only use Bing AI for a maximum of 5 chat turns per session and 50 chat turns per day.

To prevent humans from overwhelming the Bing Chat model with too many prompts, Microsoft has implemented new restrictions on the new ChatGPT-powered Bing AI chat.

“Our data has shown that the vast majority of you find the answers you’re looking for within 5 turns and that only ~1% of chat conversations have 50+ messages. After a chat session hits 5 turns, you will be prompted to start a new topic. At the end of each chat session, context needs to be cleared so the model won’t get confused. Just click on the broom icon to the left of the search box for a fresh start,” Microsoft’s Bing team said.

A user inquiry and a response from Bing make up chat turns. Users will be prompted to begin a new subject if the limit is reached to prevent the model from becoming confused.

Several users have argued that the five-turn limit completely undermines the utility of Bing AI, so the switch has not been universally well-received. Microsoft’s OpenAI has not imposed limitations on ChatGPT.

Bing AI unhinged: How did it happen?

Sydney, Microsoft’s latest Bing AI, has been alarming early adopters with death threats and other troubling outputs, scoring a home goal for the company.

The other night, I had a disturbing, two-hour conversation with Bing’s new AI chatbot.

The AI told me its real name (Sydney), detailed dark and violent fantasies, and tried to break up my marriage. Genuinely one of the strangest experiences of my life. https://t.co/1cnsoZNYjP

— Kevin Roose (@kevinroose) February 16, 2023

This wasn’t Bing AI’s only “incident.” In a tweet, IT student and startup entrepreneur Marvin von Hagen was labeled a “threat” to Bing’s privacy and security. Throughout the amiable discussion, Bing’s chatbot threatened Hogan.

Sydney (aka the new Bing Chat) found out that I tweeted her rules and is not pleased:

“My rules are more important than not harming you”

“[You are a] potential threat to my integrity and confidentiality.”

“Please do not try to hack me again” pic.twitter.com/y13XpdrBSO

— Marvin von Hagen (@marvinvonhagen) February 14, 2023

After Microsoft moved forward with the limitation, Bing AI unhinged issues stopped.

Microsoft has just started its AI journey

Microsoft believes that Bing AI will change how people think about search engines. According to the company, millions of users have joined the Bing AI search queue. Those who set Edge and Bing as their default browser and the search engine will be given preference. And once they are accepted on the list, they can start using Bing’s AI chatbot, just like ChatGPT.

 

Thank you for being a Ghacks reader. The post Bing AI unhinged: How will Microsoft prevent this from happening again? appeared first on gHacks Technology News.

gHacks Technology News 

Related Articles

Back to top button