Chips, AI, the energy squeeze: We need brain-inspired computing

As AI drives an explosion in chip demand and data center power use, researchers point to neuromorphic computing as part of the solution.

Published on October 17, 2025

Prof. Hans Hilgenkamp

Bart, co-founder of Media52 and Professor of Journalism, oversees IO+, events, and Laio. A journalist at heart, he keeps writing as many stories as possible.

Artificial intelligence is rewriting the rules of technology, but it is also rewriting the world’s energy balance sheet. Every query to a large language model, every automated image recognition system, every connected device, rests on the shoulders of billions of tiny transistors switching on and off at mind-bending speeds.

Professor Hans Hilgenkamp, scientific director at the MESA+ Institute of the University of Twente, describes it with a striking contrast: “One transistor switch consumes just 10⁻¹⁷ joules. That’s unimaginably small. But multiply it by the roughly 10³⁶ switches humanity performs every year, and you end up with an energy bill that already accounts for several percent of global electricity use.”
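The orders of magnitude can be checked with a quick back-of-the-envelope calculation. The Python sketch below multiplies the two figures Hilgenkamp quotes and compares the result with an assumed annual global electricity consumption of roughly 27,000 terawatt-hours; that consumption figure is our assumption, not from the article, and with such rough inputs the result should only be read as landing in the same ballpark as the several-percent share he describes.

```python
# Back-of-the-envelope check of the figures quoted above.
# Assumption (not from the article): global electricity use of
# roughly 27,000 TWh per year, a commonly cited order of magnitude.

ENERGY_PER_SWITCH_J = 1e-17      # energy per transistor switch (from the article)
SWITCHES_PER_YEAR = 1e36         # transistor switches per year (from the article)

JOULES_PER_TWH = 3.6e15          # 1 TWh = 3.6e15 J
GLOBAL_ELECTRICITY_TWH = 27_000  # assumed annual global electricity use

switching_energy_j = ENERGY_PER_SWITCH_J * SWITCHES_PER_YEAR    # ~1e19 J
global_electricity_j = GLOBAL_ELECTRICITY_TWH * JOULES_PER_TWH  # ~1e20 J

share = switching_energy_j / global_electricity_j
print(f"Switching alone: {switching_energy_j:.1e} J/year, "
      f"roughly {share:.0%} of the assumed global electricity use")
```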

And the demand is only accelerating. The rise of AI, the Internet of Things, and cloud services is driving the number of transistor operations, and with it data traffic and storage needs, through the roof. “It’s not just the switching,” Hilgenkamp explained in a recent talk. “Moving data, whether on a chip or to a data center, consumes similar amounts of energy. Data centers themselves require cooling, which results in even higher electricity and water consumption. The system as a whole is becoming a major driver of energy consumption.”

AI’s nuclear appetite

The implications are not abstract. Across Europe and beyond, new data centers are straining local grids, forcing governments to rethink energy infrastructure. In some cases, dormant nuclear plants are being reconsidered to meet demand. But is this the solution? “You can restart one plant today,” Hilgenkamp noted, “but in two years you’ll need two more. The growth is exponential.”

Analysts estimate that information and communication technologies (ICT) already consume around 5 percent of global electricity, a share that could double within a decade if AI adoption continues unchecked. The semiconductor industry is also approaching fundamental physical limits: pushing the energy per transistor switch much below a projected threshold of around 10⁻¹⁸ joules runs into thermal noise and astronomical equipment costs.

That leaves a sobering reality: unless computing itself changes, AI risks becoming an energy crisis of its own.

Series

Watt Matters in AI

Watt Matters in AI is a conference that aims to explore the potential of AI with significantly improved energy efficiency. In the run-up to the conference, IO+ publishes a series of articles that describe the current situation and potential solutions. Tickets to the conference can be found at wattmattersinai.eu.

Looking to the brain for answers

This is where neuromorphic computing comes into play. Instead of relying on the decades-old “von Neumann architecture”, where memory and processing are kept separate, shuffling data back and forth at high energy cost, neuromorphic approaches take inspiration from the brain.

“Our brains are the most energy-efficient processors we know,” Hilgenkamp said. “They run for a day on the equivalent of a peanut butter sandwich, about 20 watts, yet can outperform supercomputers in pattern recognition and learning.”

The key difference is that in the brain, memory and computation are intertwined. Neurons and synapses both store and process information, allowing massively parallel, low-power operations. By mimicking this structure, neuromorphic chips could cut the energy cost of AI tasks by orders of magnitude.

From synapses to silicon

Some of this is already visible in today’s AI hardware. Neural networks, used in applications ranging from ChatGPT to medical imaging, are themselves a simplified representation of how biological neurons connect and learn. Chipmakers like Nvidia have thrived because their graphics processors excel at the core operation these networks rely on: vector-matrix multiplication.
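For readers who want to see what that workhorse operation looks like, here is a minimal Python sketch of a single network layer reduced to one vector-matrix multiplication followed by a nonlinearity. The sizes and values are purely illustrative.

```python
import numpy as np

# A single neural-network layer boils down to one vector-matrix product
# plus a nonlinearity. Sizes and values here are purely illustrative.
rng = np.random.default_rng(0)

inputs = rng.standard_normal(512)          # activations coming into the layer
weights = rng.standard_normal((256, 512))  # learned connection strengths ("synapses")

# The operation GPUs (and neuromorphic designs) aim to accelerate:
pre_activation = weights @ inputs          # 256 x 512 multiply-accumulates
outputs = np.maximum(pre_activation, 0.0)  # ReLU nonlinearity

print(outputs.shape)  # (256,)
```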

But Hilgenkamp stresses that the next leap requires new materials and designs. One promising route involves memristors, resistive elements that can “remember” their state and change conductivity in response to voltage pulses, much like synapses adjust their strength when we learn. Researchers at IBM Zurich and in the Netherlands are experimenting with phase-change materials and disordered semiconductors to build these adaptive components. Early prototypes have already demonstrated impressive gains in tasks such as speech recognition.
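As a rough illustration of the idea, and not a model of any specific device or of the prototypes mentioned above, a memristive synapse can be caricatured as a conductance that is nudged up or down by voltage pulses and then read out as a weight:

```python
# Toy model of a memristive synapse: a conductance ("weight") that is nudged
# by voltage pulses and retained between operations. This is an illustrative
# caricature, not a physical device model.

class ToyMemristor:
    def __init__(self, conductance=0.5, g_min=0.0, g_max=1.0, step=0.05):
        self.g = conductance
        self.g_min, self.g_max, self.step = g_min, g_max, step

    def pulse(self, polarity):
        """Apply a +1 (potentiating) or -1 (depressing) voltage pulse."""
        self.g = min(self.g_max, max(self.g_min, self.g + polarity * self.step))

    def read(self, voltage):
        """Ohm's-law read-out: current = conductance * voltage."""
        return self.g * voltage

syn = ToyMemristor()
for _ in range(5):
    syn.pulse(+1)                  # repeated pulses strengthen the connection
print(round(syn.read(0.2), 3))     # current produced by a small read voltage
```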

Other concepts, such as spiking neural networks, mimic the brain’s method of transmitting information through voltage spikes rather than steady digital signals. Intel, IBM, and several academic groups have demonstrated chips that can run such networks far more efficiently than conventional processors.
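The basic unit of such a network can be sketched as a leaky integrate-and-fire neuron: it accumulates incoming current, slowly leaks charge, and emits a spike only when a threshold is crossed, staying silent, and therefore cheap, the rest of the time. The toy Python version below uses illustrative parameters that are not taken from any particular chip.

```python
# Minimal leaky integrate-and-fire neuron, the basic unit of a spiking
# neural network. Parameters are illustrative only.

def lif_neuron(input_current, threshold=1.0, leak=0.9):
    v = 0.0                      # membrane potential
    spikes = []
    for i_t in input_current:
        v = leak * v + i_t       # leaky integration of the input
        if v >= threshold:
            spikes.append(1)     # fire a spike...
            v = 0.0              # ...and reset the membrane potential
        else:
            spikes.append(0)
    return spikes

# A constant weak input produces sparse, event-driven output spikes.
print(lif_neuron([0.3] * 20))
```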

Urgency meets opportunity

The stakes are clear: without breakthroughs in architectures like neuromorphic computing, AI’s energy demand may outpace what grids and societies can support. With them, we may gain not only efficiency, but also new forms of computing better suited to the messy, adaptive challenges of real-world intelligence.

Hilgenkamp points to growing momentum in this field. Dutch researchers, startups, and global industry leaders are all exploring paths “beyond von Neumann.” National and European programs are beginning to channel funding into neuromorphic R&D. On November 26, Eindhoven will host the “Watt Matters in AI” conference, dedicated to exactly these questions: how big is the problem, what technologies might help, and what policies are needed to keep AI’s footprint in check. Hilgenkamp chairs the program committee for the conference.

“AI is not just a technological revolution,” Hilgenkamp concluded. “It is also an energy revolution. To make it sustainable, we need to look beyond the chips we know, and learn from the most efficient computer we already have: the human brain.”