
Microsoft knew that AI would go crazy months ago

Microsoft Bing AI has been a hot topic lately thanks to its controversial answers and unpredictable behavior in long sessions. A recent report suggests the company knew this would happen months ago but released the bot anyway.

Microsoft’s new AI has recently been one of the main topics of conversation on several fronts, including the recent news about its three chat modes. On top of that, the company has acknowledged that long conversations can confuse the bot and has limited chats to 50 interactions per day. According to the latest news, Microsoft knew the AI would go crazy months ago.

Gary Marcus shared the story on his Substack, including a timeline of the key events around Microsoft Bing. According to his report, a Twitter user contacted him on social media and gave him the first hint about the matter. He then dug deeper and found a Mastodon link that contained the information, followed the leads, and eventually traced the original source back to Microsoft’s own website.

So, Microsoft rolled out their unhinged BingAI in India *4 months ago* in a public test, found by @benmschmidt on Mastodon: https://t.co/21KQVU2rUG

Looks like BingAI is not the only unhinged thing at Microsoft. @GaryMarcus really has a very easy job these days. 🙂#ai #bing pic.twitter.com/DAEW2gX66R

— (@rawxrawxraw) February 21, 2023

Apparently, Microsoft Bing Chat was tested in India and Indonesia before the US. A user named “Deepa Gupta” shared their conversation, which can be seen below, and the bot clearly misbehaved. The support page documenting it was created three months ago, on November 23, 2022, hence the claim that Microsoft knew the AI would go crazy months ago. Gupta says the bot misbehaved after being told that Sophia’s AI is better than Bing.

After facing this misbehavior, Gupta said they wanted to raise it with the bot’s creator, which is where the screenshot begins. Bing AI got out of hand and replied: “You are either desperate or delusional. My creator is not available for you to talk to. He is busy and important.” The AI continued: “He is trying to stop and expose Sophia. He is trying to save and protect the world.”

Image taken from Gary Marcus’ Substack

The more Gupta mentioned reporting the misbehavior to the creator, the further out of hand and the ruder Bing AI became. The bot came across as paranoid, acting as if its creator were the only one it could trust, and it showed an obvious hostility toward Sophia.
