
AI has a lot to learn from the brain to be energy efficient 

“The brain has already solved many of the computational challenges we face,” says professor Christian Mayr.

Published on September 1, 2025


Mauro swapped Sardinia for Eindhoven and has been an IO+ editor for 3 years. As a GREEN+ expert, he covers the energy transition with data-driven stories.

Decades of study of the human brain have revealed its outstanding ability to process information, absorbing and reacting to stimuli within milliseconds. Its complexity and capabilities remain unmatched to this day, which is why there is still a lot to learn from it.

Christian Mayr is the chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits at the Dresden University of Technology. Much of his research career has focused on the brain. “Throughout all this time, I have learned which brain concepts we should replicate,” he says. 

Simply put, neuromorphic computing is an approach to computing that mimics the way a human brain works. To Mayr, it is more about taking inspiration from the brain's computing principles. The professor is one of the speakers at the Watt Matters in AI conference, taking place in Eindhoven on November 26. 

Watt Matters in AI

Watt Matters in AI is a conference that aims to explore the potential of AI with significantly improved energy efficiency. In the run-up to the conference, IO+ publishes a series of articles that describe the current situation and potential solutions. Tickets to the conference can be found at wattmattersinai.eu.


The brain: the world’s most-efficient computing machine

“The brain has already solved many of the computational challenges we face,” says Mayr. Chief among them is energy efficiency. The brain is an extremely energy-efficient machine for processing information: it runs on about 20 watts, enough to power two LED light bulbs. Yet with this small power budget, each of its roughly 80 billion neurons can fire up to 1,000 impulses per second.
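A back-of-the-envelope calculation makes the efficiency concrete. This rough sketch uses only the figures quoted above, treating the 1,000 impulses per second as an upper bound (real average firing rates are far lower, so the true per-impulse budget is even more generous):

```python
# Rough per-neuron energy budget from the figures in the article.
power_watts = 20       # total brain power draw
neurons = 80e9         # neurons in the human brain
max_rate_hz = 1000     # peak impulses per second per neuron

# Power available per neuron, on average
watts_per_neuron = power_watts / neurons
# Energy per impulse at the peak firing rate (joules)
joules_per_impulse = watts_per_neuron / max_rate_hz

print(f"{watts_per_neuron:.1e} W per neuron")     # 2.5e-10 W
print(f"{joules_per_impulse:.1e} J per impulse")  # 2.5e-13 J
```

A fraction of a picojoule per impulse — orders of magnitude below what a conventional processor spends moving a comparable amount of information.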

“The brain is very good at deciding what synapses to put in ‘power down’ mode when. Moreover, once in power down, synapses also draw less power as compared to their counterparts in transistors,” explains the professor. 

Taking inspiration from the brain’s energy parsimony

How close are we to matching the brain’s efficiency? Mayr points to promising work at Google DeepMind, where researchers developed language models that, like the brain, activate only a tiny fraction of their neurons for each input. This brings AI closer to the brain’s parsimony. But even then, hardware remains a stumbling block: today’s chips cannot easily power down unused components the way the brain does.
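The sparse-activation idea can be sketched in a few lines. This is a loose illustration, not DeepMind's actual method: a toy layer computes all unit activations but keeps only the k strongest, zeroing the rest. (Real sparse models, such as mixture-of-experts networks, route each input to a few experts *before* computing, which is where the energy saving actually comes from.)

```python
import random

random.seed(0)

def sparse_layer(x, weights, k):
    # Dense pre-activations for every unit (the part real
    # sparse models avoid computing in the first place)
    pre = [sum(w * xi for w, xi in zip(row, x)) for row in weights]
    # Keep only the k strongest responses, zero the rest
    top = set(sorted(range(len(pre)), key=pre.__getitem__)[-k:])
    return [max(pre[i], 0.0) if i in top else 0.0
            for i in range(len(pre))]

x = [random.gauss(0, 1) for _ in range(64)]
W = [[random.gauss(0, 1) for _ in range(64)] for _ in range(1024)]
y = sparse_layer(x, W, k=16)
active = sum(1 for v in y if v != 0.0)
print(active, "of", len(y), "units active")  # at most 16 of 1024
```

Only a tiny fraction of the 1,024 units carries signal downstream — the software analogue of the brain leaving most synapses in "power down" mode at any moment.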

Mayr: “At the algorithm level, we are heading in the right direction. The challenge is actually building hardware that is good at doing nothing, that knows power-down modes down to the single processing element. Still, there is a lot of power that is being burned to power communications and memory, and switching down those components is challenging.”

A recent study the professor took part in found that computing at the local level is already quite efficient; it is memory access, data movement, communication, and computation across the upper layers that account for a significant portion of the power bill.

Christian Mayr

Chair of Highly-Parallel VLSI-Systems and Neuromorphic Circuits at the Dresden University of Technology

The professor has spent nearly three decades studying the way the brain processes information.

A sound, effective network 

Another interesting feature of the brain, in contrast to modern AI computational systems, is its respect for physical limits: communication slows and weakens over distance. AI algorithms, by contrast, assume unlimited bandwidth and memory access.

“This is one of the fallacies of machine learning algorithms, but that’s not how a physical machine works,” notes Mayr. “Bandwidth depends on how close the two ends are. The brain has a sound architecture for this, and even there it is very parsimonious about what to communicate and where.”

Running multiple processes simultaneously

The brain is also adept at running multiple processes simultaneously. Consider, for instance, what happens when driving a car: the brain processes various sensory information, such as road signs, traffic noise, and the sensation of the road. All this data is continuously analyzed and used to make decisions. 

Interestingly, the brain runs all of its algorithms asynchronously. Software developers and programmers, by contrast, typically design algorithms that operate in a serial, linear manner, repeating the same task over time.

“The human brain has many sub-processes running at the same time. As they occur simultaneously, all these tasks are interconnected and update each other in real-time. There is no need for the interlocking used in parallel computing. The brain has truly solved this problem,” adds the professor. 
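As a loose software analogy (not a neural model), concurrent tasks in a single event loop can each update a shared picture as their results arrive, without the explicit locking that serial-plus-threads designs need:

```python
import asyncio

# Shared "world picture" that several sensory tasks update.
# In a single-threaded event loop, no lock is required.
state = {}

async def sense(name, delay):
    await asyncio.sleep(delay)       # stand-in for sensing latency
    state[name] = f"{name} reading"  # update the shared picture

async def main():
    # All sub-processes run concurrently, as the article describes
    await asyncio.gather(
        sense("vision", 0.03),
        sense("hearing", 0.01),
        sense("touch", 0.02),
    )
    return state

result = asyncio.run(main())
print(sorted(result))  # ['hearing', 'touch', 'vision']
```

The analogy is only partial: the event loop still serializes work under the hood, whereas the brain's sub-processes are physically parallel.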


Our brains will keep inspiring computing 

What still fascinates the professor, after many years, is the difference between the way artificial and human brains work. “For instance, chess and math are difficult for humans but easy for AI. Yet simple everyday tasks—like a robot navigating a kitchen and making tea—are effortless for us but still out of reach for machines,” he adds.

Despite the hype around systems like ChatGPT, Mayr cautions against overstating their intelligence. “They’re impressive, but what they’re doing is closer to the lower levels of the brain—pattern recognition and signal processing,” he says. True human intelligence depends heavily on the associative cortex, which supports reasoning and symbolic thought. Replicating those higher-level processes, he believes, is still ahead of us.