OpenAI Chief Executive Officer Sam Altman has forcefully rejected widespread concerns about the water consumption of artificial intelligence systems, dismissing such claims as "completely untrue" and as having "no connection to reality". The comments were made during a public interview at a major AI summit in India, hosted by The Indian Express.
Addressing environmental criticisms frequently levelled at the AI industry, Altman dismissed fears over AI's water usage as "totally fake," conceding only that water consumption had been a genuine issue when data centres relied on evaporative cooling. "Now that we don’t do that, you see these things on the internet where, ‘Don’t use ChatGPT, it’s 17 gallons of water for each query’ or whatever," he said. "This is completely untrue, totally insane."
Shifting the Debate to Total Energy Consumption
While dismissing water usage concerns, Altman conceded it is "fair" to examine the sector's overall energy consumption. "Not per query, but in total, because the world is now using so much AI," he explained. His proposed solution is a rapid global transition towards nuclear, wind, and solar energy to power this growing demand.
With no legal mandate requiring the tech industry to disclose its energy and water usage, scientists have had to conduct independent studies. Data centres have already been linked to rising electricity prices in some regions. When questioned on a specific claim that a single ChatGPT query uses energy equivalent to 1.5 iPhone battery charges, Altman was unequivocal: "There’s no way it’s anything close to that much."
A Novel Comparison: AI vs. Human Training
Altman offered a controversial counter-argument to critiques of the energy cost of training large AI models, contending that such comparisons are "unfair" if they do not account for the immense resources required to educate a human.
"It takes like 20 years of life and all of the food you eat during that time before you get smart," Altman stated. "And not only that, it took the very widespread evolution of the 100 billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever, to produce you."
The fair comparison, he argued, is the energy a trained AI model requires to answer a single query versus the energy a human would expend on the same task. "And probably, AI has already caught up on an energy efficiency basis, measured that way," he concluded.
Context and Industry Scrutiny
The interview, available online with the relevant segment beginning at 26:35, occurs amid intensifying scrutiny of the environmental footprint of major technology firms. The discussion highlights a central tension between rapid AI advancement and sustainable development, a debate with significant implications for global climate policy and corporate responsibility.
As AI integration accelerates worldwide, the demand for transparency regarding its resource use and the search for clean energy solutions are likely to become increasingly prominent in both public discourse and regulatory frameworks.