Sam Altman Defends AI’s Resource Use, Contrasts with Human Energy Demands
OpenAI CEO Sam Altman has pushed back against growing concerns about artificial intelligence’s heavy consumption of resources, particularly water and electricity. Speaking at the India AI Impact Summit, Altman dismissed claims about ChatGPT’s water usage as “completely untrue” and “totally insane.” He explained that data centres powering advanced AI models like ChatGPT have largely moved away from water-intensive evaporative cooling, opting for more efficient technologies to manage heat.
When the discussion shifted to the substantial electricity demands of AI, Altman acknowledged the validity of raising the issue. He stated that a rapid move towards renewable energy sources, including nuclear, wind, and solar power, is essential. However, he cautioned against direct comparisons between the energy needs of AI and humans, arguing that such comparisons fail to account for the full lifecycle of human development.
The Long Game of Human Evolution and Learning
Altman elaborated on this point by highlighting the immense energy investment required to “train” a human. He humorously noted that it takes approximately 20 years of life, including all the food consumed during that period, before an individual becomes fully functional and intelligent.
He further emphasised the long-term evolutionary context, pointing out that modern humans are the product of hundreds of thousands of years of ancestral development. This evolutionary journey involved the collective learning and adaptation of billions of individuals who survived predators and developed complex pursuits such as science. Altman argued that this extensive historical and biological “training” must be considered when evaluating the energy efficiency of AI.
He proposed a more equitable comparison: pitting the energy a human expends to answer a query against the energy an AI uses after its initial training period. By this metric, Altman suggested that AI has likely already achieved comparable or superior energy efficiency.
Quantifying AI’s Energy Footprint
In a blog post from June 2025, Altman provided an estimate that each ChatGPT query consumes approximately 0.34 watt-hours of electricity. This is roughly equivalent to the energy an oven uses in about one second. It’s important to note that this figure predates the release of OpenAI’s more advanced GPT-5 model and subsequent updates. The energy consumption can also vary significantly depending on the complexity of the task, with image generation, for example, requiring more power than simple text-based queries.
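The oven comparison is easy to sanity-check. The sketch below is not from Altman’s post: it assumes a typical electric oven draws roughly 1.2 kW, a figure chosen for illustration.

```python
# Sanity-check: is 0.34 Wh per ChatGPT query really about one second of oven use?
# Assumption (not from the article): a typical electric oven draws ~1.2 kW.
OVEN_POWER_KW = 1.2

query_wh = 0.34  # Altman's June 2025 per-query estimate
oven_wh_per_second = OVEN_POWER_KW * 1000 / 3600  # 1.2 kW -> Wh consumed each second
seconds_of_oven_use = query_wh / oven_wh_per_second

print(f"One query ≈ {seconds_of_oven_use:.2f} s of oven use")  # → One query ≈ 1.02 s of oven use
```

At that assumed wattage, one query works out to almost exactly one second of oven time, consistent with the comparison in the blog post.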
Expert Projections on AI’s Growing Resource Demands
Despite Altman’s reassurances, experts have issued stark warnings about the projected increase in AI’s cumulative power and water consumption over the next two decades. A January report by water technology company Xylem and market research firm Global Water Intelligence forecasts a substantial rise in AI’s overall water usage.
- Projected Water Usage Growth:
- AI’s cumulative water consumption is expected to grow by approximately 130% by 2050.
- This translates to an increase of roughly 30 trillion litres (7.9 trillion gallons) of water by 2050.
The report also highlights the indirect impact of AI on water resources through electricity generation for data centres:
- Water for Power Generation:
- Rising electricity demands from AI are predicted to increase the water used for data centre power generation by about 18% over the same period.
- This would reach approximately 22.3 trillion litres (5.8 trillion gallons) annually by 2050.
Furthermore, the increasing sophistication of chips used in data centres will also drive up water requirements during the manufacturing process:
- Water for Chip Manufacturing:
- The demand for water to manufacture these increasingly complex chips is expected to surge by roughly 600%.
- This would mean an annual requirement of 29.3 trillion litres (7.7 trillion gallons), up from the current approximately 4.1 trillion litres (1.1 trillion gallons).
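The litre and gallon figures above can be cross-checked with a simple conversion. The snippet below is a sketch that assumes the report uses US gallons (about 3.785 litres per gallon); the dictionary labels are mine, not the report’s.

```python
# Cross-check the litres-to-gallons conversions quoted from the Xylem /
# Global Water Intelligence report, assuming US gallons.
LITRES_PER_US_GALLON = 3.785  # assumption: US, not imperial, gallons

figures_trillion_litres = {
    "cumulative AI water-use growth by 2050": 30.0,
    "annual water for data-centre power generation": 22.3,
    "annual water for chip manufacturing": 29.3,
}

for label, litres in figures_trillion_litres.items():
    gallons = litres / LITRES_PER_US_GALLON
    print(f"{label}: {litres} trillion L ≈ {gallons:.1f} trillion US gal")
```

The round-trips land within rounding of the gallon figures quoted above, which suggests the report is indeed working in US gallons.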
Data Centre Cooling Technologies
While OpenAI has moved away from traditional evaporative cooling, a significant portion of global data centres still rely on this method. The Xylem and Global Water Intelligence report indicates that 56% of all data centres worldwide employ some form of evaporative cooling.
In contrast, OpenAI’s planned 800-acre data centre complex in Abilene, Texas, is reported to be utilising a more efficient, closed-loop water system designed for continuous recirculation. This facility is expected to draw an initial 8 million gallons of water from the city of Abilene to fill its cooling infrastructure.