A search query with Google's or Microsoft's chatbots can cost 10 times more than a regular search query

Earlier this month, Google announced plans to launch Bard, an AI chatbot designed to help users find information and converse in natural language. A few days earlier, Microsoft announced a new Bing search experience with an integrated AI chatbot, billed as more capable in every respect than the traditional search engine.

However, one aspect of this chatbot technology that the companies behind it prefer not to discuss is the operating cost. In a recent interview with Reuters, Alphabet chairman John Hennessy revealed a fact that may surprise many people: a search query handled by a chatbot based on a large language model can cost 10 times more than a regular search query does today.

Back in December 2022, shortly after the launch of ChatGPT, OpenAI CEO Sam Altman said on Twitter that the compute costs of running the AI chatbot were very high. This is part of the reason OpenAI went on to launch a paid subscription plan, ChatGPT Plus, in early February, giving users more features and faster performance for $20 a month.

According to estimates by investment bank Morgan Stanley, each of the roughly 3.3 trillion searches Google handled in 2022 cost the company about a fifth of a US cent to process. If an AI chatbot like Bard were used instead, the cost of such a search query could increase up to 10 times.
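At Google's scale, that multiplier adds up quickly. Here is a quick back-of-envelope check using only the figures above (these are Morgan Stanley's estimates, not official Google data):

```python
# Back-of-envelope check of the figures quoted above.
# All constants are the article's estimates, not official Google numbers.
COST_PER_QUERY_USD = 0.002   # "a fifth of a US cent" per regular search
QUERIES_PER_YEAR = 3.3e12    # ~3.3 trillion searches in 2022
CHATBOT_MULTIPLIER = 10      # an LLM-backed query at ~10x the cost

baseline = COST_PER_QUERY_USD * QUERIES_PER_YEAR
chatbot = baseline * CHATBOT_MULTIPLIER

print(f"Regular search, annual cost: ${baseline / 1e9:.1f}B")  # ~$6.6B
print(f"Chatbot search, annual cost: ${chatbot / 1e9:.1f}B")   # ~$66.0B
```

On these assumptions, moving every query to a chatbot would turn an estimated $6.6 billion annual serving cost into something on the order of $66 billion.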

SemiAnalysis, a research and consulting firm focused on chip technology, estimated that integrating an AI chatbot into its search engine could cost Alphabet an extra $3 billion, even assuming it runs on Google's in-house Tensor chips along with many other optimizations.

What makes an AI chatbot query more expensive than a regular search is the computing power involved. Generating a fluent, natural-sounding answer simply takes far more processing than looking up results in an index, which in turn means higher electricity costs to keep the servers running.
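As a rough illustration of how answer generation translates into electricity cost, consider the sketch below; the power draw, electricity price, and generation time are all illustrative assumptions, not reported figures:

```python
# Illustrative only: every constant here is an assumption made for the
# sake of the example, not a measured or reported figure.
GPU_POWER_KW = 0.4               # assumed draw of one accelerator under load (400 W)
ELECTRICITY_USD_PER_KWH = 0.10   # assumed data-center electricity price
SECONDS_PER_ANSWER = 2.0         # assumed time to generate one chatbot reply

kwh_per_answer = GPU_POWER_KW * SECONDS_PER_ANSWER / 3600
energy_cost = kwh_per_answer * ELECTRICITY_USD_PER_KWH
print(f"Energy per answer: {kwh_per_answer * 1000:.3f} Wh "
      f"(~${energy_cost:.6f} in electricity alone)")
```

Electricity is only one component; hardware depreciation, networking and data-center overhead make up the rest of the gap behind the "10 times" figure.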

So what can be done to cut costs? OpenAI's computer scientists are currently looking for ways to optimize the cost of inference through complex code that makes chips run more efficiently, thereby reducing operating costs (a generic example of this kind of optimization is sketched below). Another option is for Google and Microsoft to place ad links in chatbot replies, generating revenue that offsets the cost of running the servers.
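To make the idea of cheaper inference concrete, here is a minimal, generic sketch of one widely used technique, 8-bit weight quantization, which shrinks a model's memory and compute footprint. This is an illustration of the general approach, not OpenAI's actual method:

```python
import numpy as np

# A minimal sketch of 8-bit weight quantization: store model weights as
# one-byte integers plus a scale factor instead of four-byte floats.
weights = np.random.randn(1024).astype(np.float32)  # fp32 weights: 4 bytes each

scale = np.abs(weights).max() / 127                 # map the value range onto int8
q = np.round(weights / scale).astype(np.int8)       # int8 weights: 1 byte each
dequant = q.astype(np.float32) * scale              # approximate the originals

print(f"Memory: {weights.nbytes} B -> {q.nbytes} B (4x smaller)")
print(f"Max reconstruction error: {np.abs(weights - dequant).max():.5f}")
```

Storing weights in one byte instead of four cuts memory traffic roughly fourfold, at the price of a small approximation error, which is one reason techniques like this lower the cost of serving each answer.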

Another option, which OpenAI has already implemented with ChatGPT Plus, is a paid subscription that gives users faster and better chatbot answers. That may be a last resort for companies like Google and Microsoft: as more and more users flock to these new AI-driven search services, they will have to work out whether costs can be cut enough to turn a profit.
