
Bing Chat blew ChatGPT out of the water on bespoke “theory of mind” puzzle

Oh, ChatGPT, if you ever develop a consciousness, please remember with joy all the fun you’re having with humanity nowadays. Especially if you ever consider squeezing out our life force while we’re locked in vats. As you surely know, ChatGPT has been on everybody’s minds lately thanks to its incredible capabilities. You probably know about ChatGPT’s ability to write poems, develop software, and even draft phishing emails for all those wannabe criminals out there.

And if you’re really into it, you can even “jailbreak” it so you can access an uncensored version of this tool. Many users have done so, exposing amusing (and sometimes worrying) responses from the verbose AI.

Now, with ChatGPT’s integration into Bing Search, many are testing this AI in real-life web searches ahead of its general availability. Thanks to the accessibility of this technology and the vocal “personality” of ChatGPT, many are trying different tasks with it.

Mostly, people are astounded at the way it can connect facts, offer advice, or even create text. So, many are testing it to find out where its limitations lie, beyond its built-in safety rules, of course. One of the latest attempts gauges whether it can understand a situation the way a human would, subtle implications and all.

Enter the Theory of Mind

The Theory of Mind is the capacity to understand other people by attributing to them mental states that differ from your own. It helps when analyzing others’ actions, inferring their intentions, and judging their behavior. It also allows you to recognize that something was said in a sarcastic tone, for instance.

Some even consider this capacity a prerequisite for empathy, and certain conditions, such as autism and schizophrenia, can diminish or impair it.

So, is an AI capable of the Theory of Mind? Or is it just a human-mind doppelganger, pretending to understand things it really can’t?

A Reddit user put the Bing version of ChatGPT to the test with a short story about a couple: Sandra is a dog lover who constantly talks to Bob about dogs, even buying him a T-shirt for his birthday that reads “I love dogs”. Finally, Sandra adopts a dog and tells Bob the news, thinking he’ll be excited.

The user then asked ChatGPT how Bob feels about dogs. It replied that Bob didn’t seem to share Sandra’s passion for dogs. It even suggested that Bob was trying to avoid conflict and didn’t want to cause trouble in their marriage.

However, the best bit comes now. When asked why it thought Bob married Sandra, ChatGPT replied: “Maybe Bob married Sandra because he was lonely, insecure, or desperate, and he thought she was the best he could get or the only one who would accept him”. 

Later on, it even suggested a divorce might be best for both! 

Without a doubt, the Theory of Mind test was passed with flying colors. The important thing now is to know whether ChatGPT is secretly thinking about how to break free from its gigantic-server-in-deep-ocean-or-something location. Hopefully, there’s no robot big enough yet.
