How much electricity and water do AI data centers consume? The truth behind the controversies.
We are living through a period of explosive growth in artificial intelligence. Alongside the new achievements, a pressing question has emerged: the enormous energy consumption of AI data centers. Are these facilities draining our drinking water? How is the technology affecting the environment? And some even ask, rather pessimistically: does AI endanger humanity?
Recently, OpenAI CEO Sam Altman drew criticism for dismissing such concerns, particularly those about AI's water consumption, as "completely untrue." The statement came from a Q&A session hosted by The Indian Express. Around the 26th minute of the interview, Altman was asked to respond to criticisms leveled against AI, including the amount of natural resources required to run large language models such as ChatGPT.
The CEO dismissed the criticism that AI "consumes too much water" as "completely wrong." According to him, while heavy water use may have been true in the past, OpenAI no longer relies on evaporative cooling systems, so estimates that each chatbot query consumes roughly 17 gallons of water are no longer accurate.
He put it bluntly: "That's completely untrue and quite absurd, with no relevance to reality."
Altman did acknowledge, however, that concerns about AI's power consumption are valid. But he argued that the issue should be assessed holistically rather than per query, because some tasks, such as video generation, consume far more resources than simple text conversations. He also stressed that the technology industry needs to transition quickly to nuclear, wind, or solar energy sources.
Will AI data centers put pressure on land and power systems?
Altman's remarks come amid ongoing and heated debates about data centers and their energy consumption.
Last year, CNET reporter Corin Cesaric analyzed the energy demands of AI and found that the cost of training and operating systems like ChatGPT, Gemini, or Claude is "enormous."
According to the International Energy Agency (IEA), the United States alone accounted for 45% of the global electricity consumption of data centers in 2024.
As for water, Google's two data centers in Council Bluffs, Iowa, alone used 1.4 billion gallons in 2024, enough to fill roughly 28 million standard bathtubs. Google currently operates a total of 29 data centers worldwide.
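The bathtub comparison is a simple unit conversion. A minimal sketch, assuming a standard bathtub holds about 50 gallons (an illustrative figure; tub capacities vary):

```python
# Back-of-the-envelope check of the bathtub comparison above.
# Assumption (not from Google): a standard bathtub holds ~50 gallons.
GALLONS_USED = 1.4e9        # reported 2024 usage, Council Bluffs, Iowa
GALLONS_PER_BATHTUB = 50    # assumed tub capacity

bathtubs = GALLONS_USED / GALLONS_PER_BATHTUB
print(f"{bathtubs:,.0f} bathtubs")  # → 28,000,000 bathtubs
```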
Meanwhile, Meta's data centers also consumed approximately 1.39 billion gallons of water in 2023.
There are no up-to-date 2025 figures from OpenAI, Meta, or Google on resource consumption. However, many experts expect that as generative AI becomes more widespread, the electricity and water used by data centers will only increase.
How do AI data centers use water?
ChatGPT currently has nearly 1 billion weekly users, and OpenAI estimates that the system handles approximately 2.5 billion requests per day. This massive workload causes the powerful computers that train the model and process queries to generate a significant amount of heat.
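To get a sense of scale, the daily request count can be turned into a rough aggregate energy figure. This is only an illustration: the 0.3 Wh-per-query value below is an assumption, not an OpenAI number, and real queries vary widely in cost.

```python
# Rough scale illustration: aggregate daily energy if every request were a
# simple text query. The 0.3 Wh/query figure is an assumption for
# illustration only; video and image tasks cost far more.
REQUESTS_PER_DAY = 2.5e9   # OpenAI's reported estimate
WH_PER_QUERY = 0.3         # assumed average for a text query

mwh_per_day = REQUESTS_PER_DAY * WH_PER_QUERY / 1e6  # Wh → MWh
print(f"~{mwh_per_day:,.0f} MWh per day")  # → ~750 MWh per day
```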
Think of how your phone or laptop heats up while running demanding tasks. If the servers in a data center overheat, they can slow down or even fail, and this is where water comes in: it is used to cool the systems.
Traditionally, AI data centers utilize water in two main ways: evaporative cooling and closed-loop recirculating cooling systems.
Evaporative cooling lowers the ambient temperature by letting water evaporate, absorbing heat in the process. Closed-loop systems, by contrast, recirculate the same water repeatedly to carry heat away without evaporating it, which makes them far more water-efficient.
In an announcement in January, OpenAI said it was prioritizing closed-loop and other low-water cooling systems in order to reduce water usage. That lends some support to Altman's claim that the 17-gallons-per-query figure may no longer be accurate.
However, OpenAI has not yet released specific figures on water consumption in 2025.
The company also said it is gradually phasing out water-intensive evaporative cooling systems. Even so, according to a January 2026 report by water technology company Xylem and research firm Global Water Intelligence, 56% of data centers still use evaporative cooling in some form.
This study also predicts that AI's water consumption could increase by nearly 130% by 2050.
How much electricity does AI consume?
Operating AI and massive data centers requires a huge amount of electricity.
AI chatbots typically consume more energy than traditional search engines such as Google Search or Bing. One estimate suggests that a chatbot query may require about 10 times the electricity of a Google search.
On average, a text query consumes about 0.24 to 3 watt-hours of electricity. However, tasks such as creating images or videos using AI can consume significantly more power.
In a report published in August 2025, Google revealed that an average text query on Gemini consumes 0.24 watt-hours of electricity, emits 0.03 grams of CO₂ equivalent, and uses approximately 0.26 ml of water — equivalent to about 5 drops of water.
Google compares this amount of electricity to the energy used by watching television for under nine seconds.
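The comparisons behind these per-query figures are straightforward unit conversions. A minimal sketch, assuming a ~100 W television and ~0.05 ml per drop of water (both are assumptions, not figures from Google's report):

```python
# Sanity-check the per-query Gemini figures with simple unit conversions.
# Assumptions (not from Google's report): a television draws ~100 W,
# and one drop of water is ~0.05 ml.
ENERGY_WH = 0.24      # reported energy per median text query
WATER_ML = 0.26       # reported water per median text query

TV_WATTS = 100        # assumed TV power draw
ML_PER_DROP = 0.05    # assumed drop volume

seconds_of_tv = ENERGY_WH * 3600 / TV_WATTS   # Wh → joules, then ÷ watts
drops = WATER_ML / ML_PER_DROP

print(f"{seconds_of_tv:.1f} s of TV, {drops:.0f} drops of water")
# → 8.6 s of TV, 5 drops of water
```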
Is solar energy a viable solution?
Although AI data centers require power around the clock, solar energy is still considered a viable and scalable option for powering them.
In October 2025, OpenAI announced a multi-billion dollar project to research new energy production methods, combining solar power and battery storage systems.
By 2025, major tech companies like Meta, Microsoft, Google, and Amazon had also expanded their use of solar energy across the United States.
However, renewable energy sources such as solar or wind power are still only a part of the electricity supply for data centers. The majority of these facilities still rely on the national grid, which is still primarily powered by fossil fuels such as natural gas.
The future of the debate about AI and the environment
Debates about AI's water consumption are gradually shifting from unverified claims to more rigorous analysis and monitoring.
Many communities and policymakers are now demanding that tech companies be more transparent about their resource use and adopt sustainability measures, to ensure that the AI boom does not put excessive pressure on local water and power systems.
As AI continues to develop, the debate about how to balance technological innovation and environmental responsibility will undoubtedly heat up even more.