AI’s energy hunger brings urgency and common ground among experts
At the launch of TU/e’s Cazimir Institute, speakers warned of AI’s rising electricity footprint: "Square ambition with sustainability."
Published on October 1, 2025

Bart Smolders is rooting for energy-efficient, sustainable solutions. © TU/e / Bart van Overbeeke
Bart, co-founder of Media52 and Professor of Journalism, oversees IO+, events, and Laio. A journalist at heart, he keeps writing as many stories as possible.
At the launch of the new Cazimir Institute at Eindhoven University of Technology, one theme surfaced in nearly every talk: the staggering energy consumption of artificial intelligence. From chip designers to policymakers, all agreed: AI’s growth curve will collide with the planet’s energy limits unless solutions are found.
“Even if Moore’s Law continues, it will by far not be enough to cope with the exponential growth of AI’s energy consumption,” warned Bart Smolders, full professor and Chair of the Electromagnetics (EM) group of the Department of Electrical Engineering and one of the initiators of the institute. “We have to come up with new ways of computing, new concepts, and out-of-the-box thinking.”
Richard Kemkers, head of research at ASML, took the point further. “If we extrapolate today’s AI demand over the next ten years, we run into a power wall. An electricity supply problem is unavoidable unless we act,” he said. “We cannot let this happen. The industry will need to adjust, and innovation is the only way forward.”
From imec’s Jo de Boeck came a similar note of caution. He highlighted that every layer of the AI stack, from algorithms to chip design, adds to the energy challenge. “The complexity will only grow. Reducing cost and making sure we use the right amount of energy, while dealing with heat and system complexity, is a central challenge,” he argued.
Speakers not only highlighted the problem but also identified pathways forward. NXP’s CTO Lars Reger pointed to edge computing as one way to reduce the burden of data centers: “Energy efficiency is the next thing, it’s not always performance. Smarter devices at the edge can cut the need for constant cloud access.”
For policymakers, too, the issue is pressing. Pierre Chastanet, Head of the Unit Microelectronics and Photonics at the European Commission, placed AI compute alongside autonomous driving and industrial robotics as the biggest future demand drivers for semiconductors. The implication is that Europe’s Chips Act cannot focus solely on supply resilience but must also address efficiency.
Aida Todri-Sanial, full professor in the Integrated Circuits group of TU/e's Electrical Engineering department and one of the event's hosts, reminded the audience that the institute’s vision was rooted in responsibility. “The future is about energy efficiency, sustainability, and responsibility,” she said.
The closing panel captured the day’s spirit: Europe’s technological future depends not only on leading in chips and AI, but also on doing so responsibly. “Collaboration is key,” noted Chastanet. “We must align research, industry, and policy to invent solutions that do not yet exist.”
Watt Matters in AI
That shared sense of urgency and possibility at the Cazimir launch quietly set the stage for the Watt Matters in AI conference on November 26 in Eindhoven. If 'Cazimir' showed anything, it is that across companies, universities, and governments, there is a growing recognition that AI’s energy hunger is one of the defining technological challenges of our time.
Concern about AI’s energy footprint is no niche worry. Industry leaders, chip foundries, research institutes, regional governments, and EU officials all flagged the same problem. That convergence is a rare moment of consensus in technology discourse: almost everyone accepts that AI’s energy use is a real constraint, not a speculative excess.
Moreover, the breadth of the concerns (cooling, grid load, energy generation, chip efficiency, edge vs. cloud, algorithmic pruning) means the solutions must be systemic. A conference that brings energy systems engineers, AI designers, policy experts, and grid architects together can create productive friction. Solutions can emerge not from one discipline alone, but from collision and collaboration across them.
Finally, the urgency is real. The Cazimir speakers suggested that if we postpone serious action, we risk a future where AI consumption is a runaway driver of global electricity demand. It will no longer be possible for AI to stay behind the scenes; it could become the foreground of energy competition. We cannot treat compute as costless.

Watt Matters in AI
Watt Matters in AI is a conference that aims to explore the potential of AI with significantly improved energy efficiency. In the run-up to the conference, IO+ publishes a series of articles that describe the current situation and potential solutions. Tickets to the conference can be found at wattmattersinai.eu.
Watt Matters in AI offers precisely the forum in which that message can crystallize into pathways. The event can:
- Surface measurable metrics and benchmarks (watts per inference, energy per parameter, real-time emissions).
- Showcase novel chip and architecture research oriented toward energy minimalism.
- Bridge AI algorithm and hardware communities so pruning, quantization, sparse models, and new arithmetic schemes are co-designed.
- Engage grid and energy system planners, to ensure computing deployment is co-optimized with generation, storage, and flexibility.
- Provide a venue for policy proposals and energy incentives that embed energy cost into compute decisions.
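To make the first of these bullets concrete, here is a minimal sketch of how a watts-per-inference benchmark could be computed from an average power reading, a measurement window, and an inference count. All names and numbers are hypothetical illustrations, not measurements from any vendor or the conference itself.

```python
from dataclasses import dataclass

@dataclass
class InferenceRun:
    avg_power_watts: float   # average device power during the measurement window
    duration_seconds: float  # length of the measurement window
    num_inferences: int      # inferences completed in that window

    def joules_per_inference(self) -> float:
        """Energy per inference: average power times time, divided by count."""
        return self.avg_power_watts * self.duration_seconds / self.num_inferences

    def inferences_per_kwh(self) -> float:
        """Throughput per unit of energy; 1 kWh equals 3.6 million joules."""
        return 3.6e6 / self.joules_per_inference()

# Hypothetical comparison: a data-center accelerator vs. an edge device.
cloud = InferenceRun(avg_power_watts=300.0, duration_seconds=60.0,
                     num_inferences=120_000)
edge = InferenceRun(avg_power_watts=5.0, duration_seconds=60.0,
                    num_inferences=600)

print(f"cloud: {cloud.joules_per_inference():.2f} J/inference")  # 0.15
print(f"edge:  {edge.joules_per_inference():.2f} J/inference")   # 0.50
```

The point of such a metric is comparability: once energy per inference is on the table next to latency and accuracy, trade-offs like the edge-versus-cloud question raised above become quantifiable rather than rhetorical.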
In short, Watt Matters in AI can become the meeting place where shared concerns coalesce into a shared strategy.
If the Cazimir launch was a warning shot, Watt Matters is the call to arms. On November 26, Eindhoven offers more than a conference: it offers a chance to align momentum in AI not just with ambition, but with restraint, accountability, and sustainable design.
Want to know more about the Watt Matters in AI event? Information and tickets are available at wattmattersinai.eu.