News: 1747856650

  Give a man a fire and he's warm for a day, but set fire to him and he's warm for the rest of his life (Terry Pratchett, Jingo)

Estimating AI energy usage is fiendishly hard – but this report took a shot

(2025/05/21)


A single person with a serious AI habit may chew through enough electricity each day to keep a microwave running for more than three hours. And the actual toll may even be worse, as so many companies keep details about their AI models secret.

After speaking with two dozen researchers tracking AI energy consumption and conducting experiments of its own, MIT Technology Review [1]concluded that the total energy and climate toll is incredibly difficult to get a handle on. But that didn't stop it from trying.

Working with researchers from Hugging Face, the authors of the report determined that a single query to the open-source Llama 3.1 8B model used around 57 joules of energy to generate a response. (The 8B means the model has 8 billion parameters.) When accounting for cooling and other energy demands, the report said that number should be doubled, bringing a single query on that model to around 114 joules – equivalent to running a microwave for around a tenth of a second. A larger model, like Llama 3.1 405B, needs around 6,706 joules per response – eight seconds of microwave usage.
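The microwave comparisons are simple unit arithmetic. A minimal sketch, assuming a roughly 800 W microwave draw (the wattage isn't stated in the report, but ~800 W is what its figures imply):

```python
# Convert per-query energy (joules) into equivalent microwave running time.
# MICROWAVE_WATTS is an assumption, not a figure from the report.
MICROWAVE_WATTS = 800  # watts = joules per second of microwave operation

def microwave_seconds(joules: float) -> float:
    """Seconds of microwave use equivalent to the given energy."""
    return joules / MICROWAVE_WATTS

print(microwave_seconds(114))    # Llama 3.1 8B query: ~0.14 s
print(microwave_seconds(6_706))  # Llama 3.1 405B query: ~8.4 s
```

At 800 W the two figures line up with the report's "a tenth of a second" and "eight seconds" equivalents.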

In other words, the size of a particular model plays a huge role in how much energy it uses. Although its true size is a mystery, OpenAI's GPT-4 is estimated to have well over a trillion parameters, meaning its per-query energy footprint is likely far higher than the Llama queries tested.

It's also worth pointing out that those figures are for text-based responses. AI-generated images actually use considerably less energy than text responses, thanks to smaller model sizes and the fact that diffusion-based image generation is more energy efficient than token-by-token text generation, MIT TR noted.

AI video generation, on the other hand, is an energy sinkhole.

To generate a five-second video at 16 frames per second, the CogVideoX AI video generation model consumes a whopping 3.4 million joules of energy – equivalent to running a microwave for about an hour or riding 38 miles on an e-bike, Hugging Face researchers told the Tech Review.
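Both equivalents check out with the same back-of-envelope arithmetic; note that the 800 W microwave draw and the ~25 Wh-per-mile e-bike efficiency below are my assumptions, not figures from the report:

```python
VIDEO_JOULES = 3_400_000   # CogVideoX, one five-second video
MICROWAVE_WATTS = 800      # assumed microwave power draw
EBIKE_WH_PER_MILE = 25     # assumed e-bike efficiency

microwave_minutes = VIDEO_JOULES / MICROWAVE_WATTS / 60
ebike_miles = VIDEO_JOULES / 3600 / EBIKE_WH_PER_MILE  # joules -> Wh -> miles

print(round(microwave_minutes))  # ~71 minutes, i.e. about an hour
print(round(ebike_miles))        # ~38 miles
```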

"It's fair to say that the leading AI video generators, creating dazzling and hyperrealistic videos up to 30 seconds long, will use significantly more energy," the report noted.

Using that data, the authors compiled an estimate of the daily AI energy consumption of someone with a habit of leaning on generative models for various tasks. Fifteen questions, ten attempts at generating an image, and three tries at making an Instagram-ready five-second video would eat up an estimated 2.9 kWh of electricity – three and a half hours of microwave usage.
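The 2.9 kWh figure can be roughly reconstructed from the per-task numbers above. The per-image energy below is a placeholder assumption (the article gives no exact figure, only that images cost less than text); the video term dominates regardless:

```python
JOULES_PER_KWH = 3_600_000

text_j  = 15 * 6_706      # 15 questions to a 405B-class model
image_j = 10 * 1_000      # assumption: ~1 kJ per image attempt
video_j = 3 * 3_400_000   # three five-second CogVideoX videos

total_kwh = (text_j + image_j + video_j) / JOULES_PER_KWH
print(round(total_kwh, 1))  # ~2.9 kWh, almost all of it from the videos
```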

Hundreds of millions of people around the world use ChatGPT each week, OpenAI [7]estimates.

The researchers focused on open-source LLMs that we know a lot about. But companies like OpenAI and Google keep the size and reach of their models hidden from the public, which seriously hampers accurate energy usage estimates.

DIY AI energy estimates

Hugging Face engineer Julien Delavande published a tool last month called [8]ChatUI-Energy that can be plugged into any open-source AI model to estimate its energy usage.

The tool estimates energy usage in both watt-hours and joules, and provides comparisons to real-world equivalents like smartphone battery usage, microwave time, and the like.

ChatUI-Energy's online demo uses Alibaba's Qwen2.5-VL-7B-instruct model with a few others available to try out. ChatUI-Energy's source code is also [9]available on GitHub for use with other open-source models.

It's not perfect, and like any other AI energy modeling tool it makes a lot of assumptions to arrive at its calculations, which in our experiments weren't always consistent. Nonetheless, Hugging Face's AI and climate lead Sasha Luccioni told us that the project is meant to showcase the need for more transparency in AI's energy usage.

"What we really need is less guesswork, more transparency, and tools like this that give AI users the information they need about the planetary cost of using AI," Luccioni said in an email. "Just like ingredients have to be shown on products you buy."

When it comes to measuring CO2 emissions, the AI picture gets even more complicated, the Tech Review article notes. The mixture of renewable and non-renewable energy sources varies widely by location and time of day (solar isn't used at night, for instance).
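The basic relationship is emissions = energy used × the grid's carbon intensity at the time and place of use. A sketch using the report's 2.9 kWh daily estimate; the intensity values are illustrative round numbers I've chosen, not measurements:

```python
# Illustrative grid carbon intensities in grams of CO2 per kWh (assumed values).
GRID_G_CO2_PER_KWH = {
    "hydro-heavy grid": 30,
    "mixed grid": 400,
    "coal-heavy grid": 800,
}

daily_kwh = 2.9  # the report's heavy-AI-user daily estimate

for grid, intensity in GRID_G_CO2_PER_KWH.items():
    print(f"{grid}: {daily_kwh * intensity:.0f} g CO2/day")
```

The spread shows why location and time of day matter so much: the same usage can differ by an order of magnitude in emissions.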

The report also didn't touch on [10]prompt caching, a technique commonly used by generative AI services to reuse previously computed work for repeated or similar prompts, which can reduce energy consumption for AI models.
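The energy-saving idea behind caching can be sketched in a few lines. Note this is a simplified response-level cache for illustration – OpenAI's prompt caching actually reuses computation for repeated prompt prefixes rather than whole responses – and `expensive_generate` is a hypothetical stand-in for a real model call:

```python
from functools import lru_cache

calls = 0  # counts how many times the "model" actually runs

@lru_cache(maxsize=1024)
def expensive_generate(prompt: str) -> str:
    """Stand-in for a costly model inference call."""
    global calls
    calls += 1
    return prompt.upper()  # placeholder for real generation

expensive_generate("what is prompt caching?")
expensive_generate("what is prompt caching?")  # identical prompt: served from cache
print(calls)  # 1 - the repeat cost no extra compute, hence no extra energy
```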

Dirty deeds, not dirt cheap

Regardless of those caveats, one thing is for sure: a lot of energy is being consumed to power the world's growing AI habit. Not only that, but a good portion of it is spewing carbon into the atmosphere for what is arguably [11]questionable [12]usefulness .

[13]Real datacenter emissions are a dirty secret

[14]Energy companies told to recharge for AI datacenter surge

[15]Dot com era crash on the cards for AI datacenter spending? It's a 'risk'

[16]OK great, UK is building loads of AI datacenters. How are we going to power that?

As the Tech Review report pointed out, the current spike in datacenter energy usage follows years of relatively flat consumption thanks to steady workloads and ever-increasing efficiency. Datacenters ate up [17]more than a fifth of the electricity in Ireland in 2023. The global energy consumption of datacenters is predicted to [18]more than double from its current rates by 2030, surpassing the energy consumption of the [19]entire nation of Japan by the start of the next decade. AI, naturally, is the largest driver of that increase.

Tech companies have paid a lot of lip service to going green over the years, long assuring the public that their bit barns aren't an environmental threat. But now that AI's in the picture, the professed net-zero goals of tech giants like Microsoft and Google are [20]receding into the distance.

We've covered this a lot at The Register of late, and our reporting largely aligns with what the MIT Technology Review report concluded: The energy behind AI [21]is [22]far [23]dirtier [24]than tech companies would [25]like us [26]to [27]believe.

Overall, datacenters are predicted to emit 2.5 billion tons of greenhouse gases by the end of the decade. That's [28]three times more than they would have if generative AI hadn't become the latest craze.

To add insult to apocalypse, those numbers rest on a shaky data foundation, as the Tech Review report noted.

"This leaves even those whose job it is to predict energy demands forced to assemble a puzzle with countless missing pieces, making it nearly impossible to plan for AI's future impact on energy grids and emissions," they said. ®




[1] https://www.technologyreview.com/2025/05/20/1116327/ai-energy-usage-climate-footprint-big-tech/


[7] https://www.forbes.com/sites/martineparis/2025/04/12/chatgpt-hits-1-billion-users-openai-ceo-says-doubled-in-weeks/

[8] https://huggingface.co/spaces/jdelavande/chat-ui-energy

[9] https://huggingface.co/spaces/jdelavande/chatui-energy/tree/main

[10] https://platform.openai.com/docs/guides/prompt-caching

[11] https://www.theregister.com/2024/11/28/linkedin_ai_posts/

[12] https://www.theregister.com/2024/04/13/google_ai_spam/

[13] https://www.theregister.com/2025/01/22/datacenter_emissions_not_accurate/

[14] https://www.theregister.com/2024/10/11/energy_companies_ai_dcs_consultant_report/

[15] https://www.theregister.com/2025/04/14/datacenter_spending_ai/

[16] https://www.theregister.com/2025/04/10/uk_ai_energy_council_meets/

[17] https://www.theregister.com/2024/07/25/ireland_datacenter_power_consumption/

[18] https://www.theregister.com/2025/02/07/datacenter_energy_goldman_sachs/

[19] https://www.theregister.com/2025/04/12/ai_double_datacenter_energy/

[20] https://www.theregister.com/2025/01/16/ai_datacenters_putting_zero_emissions/

[21] https://www.theregister.com/2025/05/08/xai_turbines_colossus/

[22] https://www.theregister.com/2025/03/13/microsoft_natural_gas_ai/

[23] https://www.theregister.com/2024/09/06/datacenters_set_to_emit_3x/

[24] https://www.theregister.com/2024/12/05/meta_largestever_datacenter/

[25] https://www.theregister.com/2025/05/07/google_signs_another_nuclear_deal/

[26] https://www.theregister.com/2025/04/12/ai_hyperscalers_sustainability/

[27] https://www.theregister.com/2025/05/16/amazon_nuclear_power_britain/

[28] https://www.theregister.com/2024/09/06/datacenters_set_to_emit_3x/




Microwave oven consumption

David M

The microwave oven equivalent figures fail to take into account the fact that the magnetron in a microwave oven is only about 65% efficient, for example my 800W microwave has a rated input power of 1270W. So taking one of the examples from the article, 3.4MJ would correspond to 3.4e6/1270 = around 45 minutes, not an hour as stated. The point of the article still stands, of course—that AI uses a shocking amount of electricity to deliver very little value—but we might as well get the engineering right.

Five second video equals 38 miles on an ebike?

DS999

There's your solution - if people want to have AI create video for them, make them generate the power themselves on a Peloton or similar bike. That solves two problems - one being AI's excessive usage for the complete waste of time that is generating deepfake videos, and two being that there would be a lot fewer deepfake videos, since a lot of the people producing them probably couldn't manage 38 seconds on a bike, let alone 38 miles!
