Bing Chat AI can now interpret the content of images
Bing Chat, ChatGPT, and other general-purpose AI tools on the market today focus mainly on understanding text written in natural human language and providing answers. However, understanding visual content is also extremely useful, and it is one of the areas Microsoft is now building out for its Bing Chat AI model.
In a recent blog post, Microsoft announced a very interesting new feature for Bing Chat called Visual Search. With this feature, people can upload an image or select one available on the web, and Bing will attempt to analyze and interpret its content and use what it finds in its response. Microsoft's demo video shows a person uploading a hand-drawn mockup of a web form and asking Bing to generate the HTML and CSS code to make it work.
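For illustration only, the output of such a demo might resemble the short sketch below. The field names, class names, and styling here are hypothetical and are not taken from Microsoft's video; they simply show the kind of form markup a hand-drawn mockup could be turned into.

```html
<!-- Hypothetical example of the kind of markup Bing Chat might generate
     from a hand-drawn web form mockup; all names are invented. -->
<form class="contact-form">
  <label for="name">Name</label>
  <input id="name" type="text" />

  <label for="email">Email</label>
  <input id="email" type="email" />

  <button type="submit">Send</button>
</form>

<style>
  /* Simple single-column layout matching a rough sketch */
  .contact-form {
    display: grid;
    gap: 0.5rem;
    max-width: 320px;
    font-family: sans-serif;
  }
  .contact-form button {
    padding: 0.5rem;
  }
</style>
```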
Describing Visual Search, Microsoft said:
"Whether you're traveling to a new city and asking about the architecture of a particular building, or are at home trying to come up with lunch ideas based on what's in your fridge, simply upload an image to Bing Chat, and use it to tap into the rich knowledge of the AI algorithm to get the right answer."
Image recognition itself is not new: Google Lens has been able to identify people, animals, plants, landmarks, and other objects in images since 2017, and Google Goggles did something similar as far back as 2010. To gain an edge over these rivals, Microsoft is drawing on GPT-4's image-understanding capabilities. This is essentially the same language model that powers the premium version of ChatGPT and is known for its high accuracy.
Early real-world testing shows that asking Bing to describe an image often yields a much more detailed response than Google Lens provides. For example, when a user uploads a photo of a dog, Bing Chat responds: "This is a photo of a black and tan dog sitting on a brown fur rug. The dog has a red collar with a silver tag. The dog is looking up at the camera with his ears raised. The background is a white couch with blue-and-white pillows. The photo was taken from a high angle." In addition, the tool also correctly interpreted text that appeared in italics in the uploaded image. The level of detail the AI can extract from an image is clearly impressive.