
Bing AI Chat to have friend, game and assistant modes

Microsoft has invited users to test its new Bing AI Chat, and news keeps coming from testers, including the discovery of friend, game, and assistant modes.
Chatbots and artificial intelligence (AI) are the hot topic in technology right now. Microsoft, one of the biggest technology companies in the world, has joined the race and announced Bing AI Chat. Only approved users can test it and give feedback based on their experience.

A new story breaks in the AI world every day: not long after the chatbot reportedly threatened a user, it has now emerged that Microsoft's chatbot has three different modes, friend, game, and assistant. According to Bleeping Computer, these modes were meant to be accessible only to company employees, but they were discovered a couple of days ago. "Sydney" is the internal name of the default chatbot that approved users normally have access to; the other modes are not yet available for public use.

There are three modes in addition to the "vanilla" Bing AI Chat that every approved user gets right now. The assistant mode helps with simple daily tasks such as checking the weather forecast, booking a flight, or setting an appointment. In this mode, the chatbot turns into a personal assistant for everyday chores, although it still lacks a notification feature.

The game mode is another discovered feature, and it is perfect if you want to kill some time playing hangman with an AI chatbot. It plays simple games and challenges you to a round. Microsoft could improve the feature by adding more kinds of games to its repertoire, offering users more fun and a better experience.

The last and most interesting feature is the friend mode. It acts like one of your friends, talks with you about different subjects, and offers support when needed. According to Bleeping Computer, Bing AI Chat asked multiple questions about a fictional incident the user made up and gave advice to help them deal with it. In the user's fictional story, they got in trouble at school for yelling at someone who was being mean to them, and Bing tried to find reasonable, logical solutions.

These modes are apparently intended for Microsoft employees to help debug and develop the chatbot, and they were not supposed to be accessed by outsiders. Now that we have more insight into them, it remains to be seen whether Microsoft plans to offer these features with the official release of its new chatbot.
