As companies race to build AI into their products, there are concerns about the technology’s potential energy use. A new analysis suggests AI could match the energy budgets of entire countries, but the estimates come with some notable caveats.
Both training and serving AI models require huge data centers running many thousands of cutting-edge chips. This consumes considerable amounts of energy, both to power the calculations themselves and to run the massive cooling infrastructure required to keep the chips from melting.
With excitement around generative AI at fever pitch and companies aiming to build the technology into all kinds of products, some are sounding the alarm about what this could mean for future energy consumption. Now, energy researcher Alex de Vries, who made headlines for his estimates of Bitcoin’s energy use, has turned his attention to AI.
In a paper published in Joule, he estimates that in the worst-case scenario Google’s AI use alone could match the total energy consumption of Ireland. And by 2027, he says global AI usage could account for 85 to 134 terawatt-hours annually, comparable to the annual electricity consumption of countries like the Netherlands, Argentina, and Sweden.
“Looking at the growing demand for AI service, it’s very likely that energy consumption related to AI will significantly increase in the coming years,” de Vries, who is now a PhD candidate at Vrije Universiteit Amsterdam, said in a press release.
“The potential growth highlights that we need to be very mindful about what we use AI for. It’s energy intensive, so we don’t want to put it in all kinds of things where we don’t actually need it.”
There are some significant caveats to de Vries’ headline numbers. The Google prediction is based on suggestions by the company’s executives that they could build AI into their search engine, combined with some fairly rough power consumption estimates from the research firm SemiAnalysis.
The analysts at SemiAnalysis suggest that applying AI similar to ChatGPT to each of Google’s nine billion daily searches would take roughly 500,000 of Nvidia’s specialized A100 HGX servers. Each of these servers requires 6.5 kilowatts to run, which together works out to a daily electricity consumption of roughly 80 gigawatt-hours, or 29.2 terawatt-hours a year, according to the paper.
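That worst-case figure is a straightforward back-of-the-envelope calculation. A minimal sketch in Python, using only the numbers quoted above (the rounding conventions are our own), reproduces it:

```python
# Back-of-the-envelope check of the SemiAnalysis / de Vries worst-case figures.
# Server count and per-server power draw are the numbers quoted in the article.

servers = 500_000          # Nvidia A100 HGX servers
power_per_server_kw = 6.5  # kilowatts per server

total_power_gw = servers * power_per_server_kw / 1e6  # ~3.25 GW of continuous draw
daily_gwh = total_power_gw * 24                       # ~78 GWh per day (quoted as ~80)
annual_twh = daily_gwh * 365 / 1000                   # ~28.5 TWh per year (quoted as 29.2)

print(f"{total_power_gw:.2f} GW, {daily_gwh:.0f} GWh/day, {annual_twh:.1f} TWh/year")
```

The small gap between the roughly 78 gigawatt-hours this yields and the quoted 80 gigawatt-hours a day (and between ~28.5 and 29.2 terawatt-hours a year) looks like the result of rounding the daily figure up before annualizing; the order of magnitude is the same either way.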
Google is unlikely to reach these levels, de Vries admits, because such rapid adoption is improbable, the enormous costs would eat into profits, and Nvidia doesn’t have the capacity to ship that many AI servers. So he did another calculation based on Nvidia’s total projected server production by 2027, when a new chip plant will be up and running, allowing it to produce 1.5 million of its servers annually. Given a similar energy consumption profile, these could consume 85 to 134 terawatt-hours a year, he estimates.
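The 2027 range can be sanity-checked the same way. In the rough sketch below, the 6.5-kilowatt figure comes from the article, while the roughly 10-kilowatt figure for the upper bound is our own assumption (a heavier, newer-generation server), not a number given in the article:

```python
# Rough check of the projected 85-134 TWh/year range for 2027.
# 6.5 kW is the per-server figure quoted in the article; 10.2 kW is an assumed
# draw for a heavier next-generation server and is not stated in the article.

servers_2027 = 1_500_000
hours_per_year = 24 * 365

def annual_twh(power_kw_per_server: float) -> float:
    """Annual electricity use in terawatt-hours, assuming full utilization."""
    return servers_2027 * power_kw_per_server * hours_per_year / 1e9

print(f"Low end:  {annual_twh(6.5):.0f} TWh/year")   # ~85 TWh
print(f"High end: {annual_twh(10.2):.0f} TWh/year")  # ~134 TWh
```

Both ends of the range assume the servers run flat out all year, which, as noted below, is almost certainly optimistic.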
It’s important to remember, though, that all these calculations assume 100 percent usage of the chips, which de Vries admits is probably not realistic. They also ignore any potential energy efficiency improvements in either AI models or the hardware used to run them.
And this kind of simplistic analysis can be misleading. Jonathan Koomey, an energy economist who has previously criticized de Vries’ approach to estimating Bitcoin’s energy use, told Wired in 2020, when the energy use of AI was also in the headlines, that “eye-popping” numbers about AI’s energy use extrapolated from isolated anecdotes are likely to be overestimates.
Nonetheless, while the numbers might be over the top, the research highlights an issue people should be conscious of. In his paper, de Vries points to Jevons’ paradox, which holds that making a technology more efficient often ends up increasing overall demand for it. So even if AI becomes more efficient, its overall power consumption could still rise considerably.
While it’s unlikely that AI will be burning through as much power as entire countries anytime soon, its contribution to energy usage and consequent carbon emissions could be significant.
Image Credit: AshrafChemban / Pixabay