Curbing AI energy use? 3D optical computing offers a solution
By cutting power usage while boosting AI chip performance, Lumai's 3D optical computing technology promises to curb AI's energy use.
Published on August 21, 2025

“In order to make a difference in the market of AI computing, you have to deliver something that will be much better than the competition. To do so, you have to take a very different technical approach,” says Phillip Burr, head of product at Lumai, a British startup that has taken a different path to AI chip development, putting energy efficiency at the core of its solution.
Lumai is working on an AI accelerator, a processor that promises to improve the energy efficiency of AI computations. It leverages 3D optical computing, using light to process information. According to the company, a spin-out of the University of Oxford, its technology can cut the energy consumption of AI chips by 90%. At the same time, the company's roadmap targets a fiftyfold performance increase over current technology.
As AI development and usage surge, so does the associated energy demand. According to a recent study, AI could consume up to 82 terawatt-hours of electricity in 2025, equivalent to Switzerland's yearly consumption. “AI's energy demand won't go down, but we can work on slowing its increase,” adds Burr. He will be one of the speakers at the upcoming Watt Matters in AI conference, taking place in Eindhoven on November 26.

Watt Matters in AI
Watt Matters in AI is a conference that aims to explore the potential of AI with significantly improved energy efficiency. In the run-up to the conference, IO+ publishes a series of articles that describe the current situation and potential solutions. Tickets to the conference can be found at wattmattersinai.eu.
3D optical computing: minimizing AI energy usage
Graphics processing units (GPUs) are the hardware cornerstone of AI development, enabling complex computations. The American company NVIDIA is currently the GPU market leader, as well as the world's most valuable company. At the heart of large language model (LLM) processing lies a mathematical operation that is computed over and over: matrix-vector multiplication (MVM), in which a vector of input data is multiplied by a matrix of model weights.
In 3D optical computing, this data is encoded in a beam of light that passes through a lens, which effectively copies the data across the matrix. “As a result, you can perform many operations in the light domain, where an insignificant amount of energy is used,” explains Lumai's head of product.
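Conceptually, the optical scheme maps onto the same arithmetic a GPU performs. The sketch below is a toy illustration, not Lumai's actual design: it assumes the lens acts as a fan-out (broadcast), a modulator as elementwise attenuation (multiply), and detectors as per-row summation.

```python
# Toy sketch of the MVM at the heart of LLM inference, written the way an
# optical system performs it. This is an illustrative analogy, not Lumai's
# implementation.
import numpy as np

rng = np.random.default_rng(0)
weights = rng.standard_normal((4, 3))   # model weights (the "matrix")
x = rng.standard_normal(3)              # input activations (the "vector")

fanned_out = np.broadcast_to(x, weights.shape)  # lens: copy x across all rows
modulated = weights * fanned_out                # modulator: per-element attenuation
y_optical = modulated.sum(axis=1)               # detectors: sum each row

# Same result as a conventional electronic matrix-vector multiply.
assert np.allclose(y_optical, weights @ x)
print(y_optical)
```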
Companies working with silicon technology keep adding more silicon to get faster, more powerful AI solutions, at the cost of an ever-higher power draw. For example, a single NVIDIA H200 GPU can consume up to 700 watts. OpenAI's first European datacenter, Stargate Norway, will operate 100,000 GPUs.
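Those two figures alone give a sense of scale. A back-of-the-envelope calculation, using only the numbers quoted above (real facility power also depends on utilization, cooling, and networking overhead), puts the GPUs' peak draw in the tens of megawatts:

```python
# Rough peak-power estimate from the figures quoted in the article.
# Assumes every GPU runs at its maximum draw; all overhead is ignored.
GPU_POWER_W = 700        # peak draw of one NVIDIA H200 (per the article)
GPU_COUNT = 100_000      # GPUs planned for Stargate Norway (per the article)

total_mw = GPU_POWER_W * GPU_COUNT / 1e6
print(f"GPUs alone: {total_mw:.0f} MW at peak")  # -> 70 MW
```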
Burr: “Therefore, the costs of building a datacenter ramp up: the more power you use, the bigger the power supplies and the cooling equipment you need to operate it. Furthermore, performance can't simply be increased more and more, because at some point you hit thermal issues. Getting heat out of chips is challenging, and designing chips with good thermal management can get expensive.”

Phillip Burr
Head of product at Lumai
After a long career in the semiconductor industry, he joined Lumai, fascinated by its innovative approach to AI computing.
Targeting AI inference
The British startup, which employs 20 people, targets AI inference in datacenters. When developing a large language model, there is an initial training phase, in which the model learns from the data it is fed. Then comes the inference phase, in which the trained model is put to use, making predictions and decisions on new, unseen data.
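The sketch below illustrates the distinction with a toy linear model standing in for an LLM (an assumption made purely for brevity): training fits the weights once, while inference reuses them for every new input.

```python
# Toy illustration of the two phases described above, with a linear model
# in place of an LLM; the distinction, not the scale, is what matters here.
import numpy as np

rng = np.random.default_rng(1)

# Training phase: the model is fed data and its weights are fitted (done once).
X_train = rng.standard_normal((100, 3))
y_train = X_train @ np.array([2.0, -1.0, 0.5]) + 0.01 * rng.standard_normal(100)
weights, *_ = np.linalg.lstsq(X_train, y_train, rcond=None)

# Inference phase: the trained weights are reused on new, unseen inputs.
# This runs for every query, which is why it dominates lifetime energy use.
x_new = rng.standard_normal(3)
prediction = x_new @ weights
print(prediction)
```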
According to MIT Technology Review journalists James O’Donnell and Casey Crownhart, the bulk of AI's energy consumption happens during inference, rather than during initial training. Researchers estimate that up to 90% of the electricity consumed by an AI model powers this second phase of its life cycle.
In addition to the energy savings it promises, Lumai's solution would come at a fraction of the total cost of ownership of conventional technology. Earlier this year, the company raised over $10 million in funding.
The rise of optical tech
The use of optical technology in datacenters is not new. Google has deployed a related technology in its optical circuit switch (OCS) product, which uses optical signals instead of electronic circuitry to handle communication across a network. The American giant uses OCS to switch traffic among different computing clusters, for instance.
“What they have recently found is that the reliability of their datacenter improves when using OCS,” adds Burr. Although Google uses it for a different purpose, its adoption proved the value of the technology, which reduces network costs while increasing fiber transmission capacity. Lumai's head of product sees a general increase in the use of optical technologies in datacenters.
AI energy efficiency is an industry duty
In his talk at Watt Matters in AI, Burr will make the case for 3D optical computing, presenting Lumai and the advantages the technology can bring. At the same time, he is keen to discuss its implications with fellow conference visitors.
“As an industry, we have the duty to make AI as energy efficient as possible. No company or individual can do this alone. I’m looking forward to exploring and discussing potential collaboration opportunities with fellow attendees,” he concludes.