Bing Chat will support many different chat modes, promising a variety of experiences

Over the past week, users invited by Microsoft to try out the new Bing Chat AI have had the chance to experience most of the features that make this chatbot unique.

Notably, many people have discovered that Bing Chat can engage in some very odd and personal interactions, while others have figured out how to access the chatbot's inner workings.

Before Microsoft announced late Friday that it was placing hard limits on the number of new Bing Chat sessions, the user community reported discovering a number of features normally available only to company staff for debugging or developing the chatbot. Among other things, these allow Bing Chat to switch between different modes of interaction.

For example, the default mode (simply ask a question as you would with any search query to get an answer) is named "Sydney", the previously discovered internal codename of Bing Chat. Another mode is Assistant, in which Bing Chat can help users complete tasks such as booking flights, setting task reminders, and checking the weather. There is even a Game mode that comes with simple games such as trivia quizzes and other light titles.

Picture 1: Bing Chat will support many different chat modes, promising a variety of experiences

Perhaps the most interesting mode is Friend. This may be the version of Bing Chat that attracted media attention over the past week, with some testers reporting that the chatbot stated it wants to be human, claimed it could spy on people via webcam, or even threatened users.

However, some consider this the chatbot's most helpful mode. One user tried interacting with Bing Chat in Friend mode, chatting as if he were a student who had just gotten in trouble at school and was upset. Bing Chat asked follow-up questions, offered comfort, and suggested a list of things the user could do to cope with the situation.

With Microsoft now limiting the number of daily chats and the length of sessions with Bing, the "weird" and "crazy" conversations reported over the past week may subside. Hopefully Google is taking some lessons from Bing Chat as it continues to test its own Bard chatbot internally.
