A search query handled by Google's or Microsoft's chatbots can cost 10 times more than a regular search query
Earlier this month, Google announced plans to launch Bard, an AI chatbot designed to help users find information and converse in natural language. A few days earlier, Microsoft announced a new Bing search experience with an integrated AI chatbot, touted as more effective in every respect than the traditional search engine.
However, one aspect of this chatbot technology that the host companies prefer not to talk about is the operating cost. In a recent interview with Reuters, Alphabet Chairman John Hennessy revealed a fact that may surprise many people: a search query handled by a chatbot based on a large language model can cost 10 times more than a regular search query today.
Back in December 2022, shortly after the launch of ChatGPT, OpenAI CEO Sam Altman stated on Twitter that the running costs of AI chatbots are very high. This is part of the reason OpenAI launched a paid subscription plan, ChatGPT Plus, in early February, giving users more features and faster performance for $20 a month.
According to estimates by investment bank Morgan Stanley, each of the roughly 3.3 trillion searches Google handled in 2022 cost the company about a fifth of a US cent in operating expenses. If an AI chatbot like Bard were used to answer those queries instead, the cost per search could increase up to 10 times.
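For a sense of scale, here is a back-of-the-envelope calculation in Python using only the figures quoted above. All three inputs (the per-query cost, the query volume, and the 10x multiplier) are rough public estimates, not measured values:

```python
# Rough annual cost estimate from the Morgan Stanley figures quoted above.
# All numbers are public estimates, not measured costs.

cost_per_query_usd = 0.002    # a fifth of a US cent per regular search
queries_per_year = 3.3e12     # ~3.3 trillion searches in 2022
chatbot_multiplier = 10       # an LLM-backed query ~10x a regular one

regular_total = cost_per_query_usd * queries_per_year
chatbot_total = regular_total * chatbot_multiplier

print(f"Regular search, annual cost:  ${regular_total / 1e9:.1f}B")  # ~$6.6B
print(f"Chatbot search, annual cost:  ${chatbot_total / 1e9:.1f}B")  # ~$66.0B
```

At those estimates, serving regular search already costs Google several billion dollars a year, and a 10x multiplier pushes the figure toward tens of billions, which explains the optimization efforts described below.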
SemiAnalysis, a research and consulting firm focused on chip technology, estimated that integrating AI chatbots into the search engine could cost Alphabet an extra $3 billion, assuming it uses Google's in-house Tensor chips along with many other optimizations.
What makes a chatbot search query more expensive than a regular search is the computing power involved. Generating natural-sounding answers with an AI chatbot simply takes more processing per query, which in turn means higher electricity costs to keep the servers running.
So what can be done to cut costs? OpenAI's computer scientists are currently looking for ways to reduce the cost of inference by writing code that makes the chips run more efficiently, thereby lowering operating costs. Another solution is for Google and Microsoft to place ad links in chatbot replies, generating revenue that offsets the cost of running the servers.
A third option, one OpenAI has already taken with ChatGPT, is a subscription service where paying users get faster and better chatbot answers. That option may be a last resort for companies like Google and Microsoft. As more and more users adopt these new AI-driven search services, the companies will have to figure out whether costs can be reduced enough to turn a profit.