Processing and answering prompts eats up electricity, as does the supporting infrastructure, like the fans and air conditioning that cool the whirring servers. In addition to big utility bills, the result is a lot of climate-warming carbon emissions. Electricity generation and server cooling also suck up tons of water, which is used both in fossil fuel and nuclear energy production and in evaporative or liquid heat dissipation systems.
Advocates argue this latest revolution in AI is a societal good, even a necessity, that’s bringing us closer than ever to artificial general intelligence, hypercapable computer systems that some argue could be a paradigm-shifting technology on par with the printing press or the internet.
Generative AI “is an accelerator for anything you want to do,” says Rick Stevens, an associate lab director at Argonne National Laboratory and a computer scientist at the University of Chicago. In his view, the tech has already enabled major productivity gains for businesses and researchers.
How much energy does AI consume?
ChatGPT and other generative tools are power hungry, says Alex de Vries, founder of the research and consulting agency Digiconomist and a Ph.D. candidate at Vrije Universiteit Amsterdam. “The larger you make these models — the more parameters, the more data — the better they perform. But of course, bigger also requires more computational resources to train and run them, requiring more power,” says de Vries, who studies the environmental impact of technologies like cryptocurrency and AI. “Bigger is better works for generative AI, but it doesn’t work for the environment.”
A more sustainable path for AI
The decision need not be between shutting down generative AI development entirely and allowing it to continue unrestrained. Instead, most experts note, there's a more responsible way to approach the technology: mitigating the risks while maximizing the rewards.
Policies requiring companies to disclose where and how they’re using generative AI, as well as the corresponding energy consumption, would be a step in the right direction, says Lynn Kaack, a computer science and public policy expert at the Hertie School in Berlin. Regulating uses of the technology and access to it may prove difficult, but Kaack says that’s key to minimizing environmental and social harm.
Meanwhile, data centers and AI developers could take steps to lessen their carbon emissions and resource use, Chien says. Simple changes, like training models only when there's ample carbon-free power on the grid (say, on sunny days when solar panels produce an excess of energy) or subtly reducing system performance at times of peak energy demand, might make a measurable difference. Replacing water-intensive evaporative cooling with liquid-immersion cooling or other closed-loop strategies that allow for water recycling would also minimize demand.
Each of these choices involves trade-offs. More carbon-efficient systems generally use more water, Ren says. There is no one-size-fits-all solution. The alternative to exploring and incentivizing these options — even if they make it marginally harder for companies to develop ever-bigger AI models — is risking part of our collective environmental fate, he says.
“There’s no reason to believe that technology is going to save us,” Chien says — so why not hedge our bets?
By Lauren Leffer / Science News