OpenAI CEO Sam Altman Dismisses Claims About ChatGPT’s Water and Energy Use
OpenAI’s Leader Addresses Environmental Criticism
Sam Altman, chief executive of OpenAI, recently spoke about growing public concern over the environmental impact of artificial‑intelligence systems. In a public interview, Altman directly challenged widely circulated figures that suggest a single ChatGPT query uses a large amount of water or electricity.
Altman described the claim that a single query consumes "17 gallons of water" as "completely untrue" and "totally insane," stressing that the figure bears no connection to reality. He also rejected the notion that a single request draws the equivalent of "1.5 iPhone battery charges," stating there is "no way it's anything close to that much."
While dismissing these specific metrics, Altman acknowledged that AI’s overall energy consumption is a legitimate concern. He noted that data‑center cooling historically relied on evaporative methods, which required significant water use, but modern facilities have largely moved away from that approach. He argued that the real issue lies in the total energy demand of AI technologies worldwide, not the per‑query usage.
Altman called for a rapid transition to low‑carbon power sources, saying the world must “move towards nuclear or wind and solar very quickly.” He highlighted that there is currently no legal requirement for technology companies to disclose the exact amounts of energy and water they consume, leaving independent scientists to estimate the impact on their own.
He also referenced broader industry trends, noting that data‑center operations have been linked to rising electricity prices. In his view, comparisons that weigh the energy required to train an AI model against the energy a human uses to answer a single question are "unfair," since they set a one‑time training cost against a single act of inference.
Altman drew a parallel between AI and human learning, pointing out that humans spend “like 20 years of life and all of the food you eat during that time” to become knowledgeable. He suggested that when measured against this benchmark, AI may already have achieved comparable energy efficiency for inference tasks.
Overall, Altman’s remarks aim to shift the conversation from sensational per‑query statistics to a more nuanced understanding of AI’s total energy footprint and the importance of clean energy adoption across the sector.