
AI’s energy addiction threatens climate targets - and our grid

New research estimates that AI accounts for 20% of global data center electricity use; it could rise to nearly 50% by the end of this year.

Published on May 25, 2025

Watt Matters in AI

Bart, co-founder of Media52 and Professor of Journalism, oversees IO+, events, and Laio. A journalist at heart, he keeps writing as many stories as possible.

Artificial intelligence (AI) is revolutionizing our world, powering everything from virtual assistants to sophisticated data analysis. However, beneath its sleek algorithms lies a growing concern: the substantial energy consumption required to train and operate these systems. AI is smart, but is it sustainable when it eats power like a country? Even with the brightest minds fueling the system, blackouts may be closer than we think.


This article is part of our Watt Matters in AI series. Watt Matters in AI is also a conference that aims to explore the potential of AI with significantly improved energy efficiency. In the run-up to the conference, IO+ publishes articles describing the current situation and potential solutions. Tickets to the conference can be found at wattmattersinai.eu.


Recent research by Alex de Vries-Gao, founder of Digiconomist, highlights the escalating energy demands of AI. His study, published in Joule, estimates that AI currently accounts for up to 20% of global data center electricity use, which could rise to nearly 50% by the end of this year. This surge is primarily driven by the widespread adoption of large language models like ChatGPT, which require immense computational power.

To put this into perspective, training a single large AI model can consume as much electricity as the annual usage of several hundred homes, De Vries-Gao says. His analysis suggests that AI could consume up to 82 terawatt-hours in 2025, equivalent to Switzerland's annual power usage. Yet training the models is only a small part of the problem. The bigger story now, say MIT Technology Review journalists James O'Donnell and Casey Crownhart, is what happens after training: inference - the real-time process of responding to user queries. "Inference is where the bulk of energy use is happening now," O'Donnell said. "Researchers told us that 80 to 90% of the electricity consumed by AI models today goes into inference, not training."
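To get a feel for what 82 terawatt-hours means, a minimal back-of-envelope sketch helps. Note the household figure below is an assumption for illustration (roughly an average European household's annual electricity use), not a number from the study:

```python
# Back-of-envelope scale check on the projected 82 TWh for AI in 2025.
# The household consumption figure is an assumed illustrative value,
# not taken from De Vries-Gao's analysis.

AI_TWH_2025 = 82                      # projected AI electricity use (from the article)
KWH_PER_TWH = 1_000_000_000           # 1 TWh = 1 billion kWh
HOUSEHOLD_KWH_PER_YEAR = 3_500        # assumed average annual household usage

households = AI_TWH_2025 * KWH_PER_TWH / HOUSEHOLD_KWH_PER_YEAR
print(f"Roughly {households:,.0f} households' worth of electricity per year")
```

Under these assumptions the projection works out to the annual electricity use of tens of millions of homes, which is why the comparison to an entire country such as Switzerland is apt.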


Implications for sustainability goals

The rapid growth in AI's energy consumption poses significant challenges for sustainability. Tech giants like Google have reported sharp increases in greenhouse gas emissions, with a 48% rise since 2019, complicating their net-zero targets. The energy-intensive nature of AI not only strains power grids but also increases reliance on fossil fuels, undermining efforts to combat climate change.

One of the critical issues highlighted by De Vries-Gao is the lack of transparency in reporting AI's energy usage. Without precise data, assessing the environmental impact accurately or implementing effective mitigation strategies is challenging. Experts like Sasha Luccioni from Hugging Face emphasize the importance of corporate disclosures to understand and manage AI's energy footprint.


Watt Matters in AI is an initiative of Mission 10-X, in collaboration with the University of Groningen, University of Twente, Eindhoven University of Technology, Radboud University, and the Convention Bureau Brainport Eindhoven. IO+ is responsible for marketing, communication, and conference organization.
More information on the Conference website.

Towards a sustainable AI future

Addressing AI's energy consumption requires a multifaceted approach, De Vries-Gao concludes. Improving the energy efficiency of AI models, investing in renewable energy sources for data centers, and implementing regulatory frameworks for transparency are essential steps. Moreover, considering the environmental impact during the design and deployment of AI systems can help align technological advancement with sustainability goals.

As AI continues to evolve, balancing its benefits with environmental responsibility is crucial. Recognizing and addressing AI's hidden energy costs can help ensure that innovation does not come at the expense of our planet's health.

Many of the articles in our Watt Matters in AI series discuss potential solutions, for example in hardware, software, behavior, or regulations.
