Sam Altman argues AI training is energy-efficient compared to the 20-year human learning cycle
Claims that ChatGPT consumes gallons of water per query are "completely untrue" and "insane"
Modern data centers have shifted away from evaporative cooling, significantly reducing direct water waste
OpenAI CEO Sam Altman said that training a human being also takes a great deal of energy compared to an AI model, noting that it takes 20 years of life and all the food a person consumes in that time to become smart enough to answer a query.
His remarks came during a rapid-fire round in an interview with the Indian Express, where the OpenAI chief was asked about the energy consumed by every ChatGPT query.
Altman replied that comparing the total energy required to train an AI model against the energy a human spends on a single inference query is an unfair comparison.
He said, “...it also takes a lot of energy to train a human. It takes like 20 years of life and all of the food you eat during that time before you get smart. And not only that, it took like the very widespread evolution of the hundred billion people that have ever lived and learned not to get eaten by predators and learned how to figure out science and whatever to produce you and then you took whatever you know you took.”
Altman added that the fair question would be how much energy an AI model consumes, relative to a human, to answer a query it has already been trained for. On that basis, "AI has already caught up on an energy efficiency basis," he said.
Altman on AI’s Environmental Concerns
Altman also addressed concerns about AI's environmental impact during the interview. He said claims about AI's water usage are untrue, though he acknowledged that water consumption was a genuine issue in the past, when data centers relied on evaporative cooling.
“Now that we don’t do that, you see these things on the internet saying, ‘Don’t use ChatGPT, it’s 17 gallons of water for each query,’ or whatever … is completely untrue, totally insane, and has no connection to reality.”
However, he added that it is fair to be concerned about AI's energy consumption, not per query but in total, given the rapid rise in global AI usage. According to him, this makes it necessary for the world to "move towards nuclear or wind and solar very quickly."