“Smart glasses are the next AI interface”
Applied Materials maps the road from AR lab demos to mass-market eyewear.
Published on November 8, 2025
Rutger Thijssen, Applied Materials
Bart, co-founder of Media52 and Professor of Journalism, oversees IO+, events, and Laio. A journalist at heart, he keeps writing as many stories as possible.
At PIC Summit Europe, Rutger Thijssen (Applied Materials) explained how the company is taking its photonics know-how beyond the fab: building waveguides, metrology, and a Singapore production line with GlobalFoundries to scale true eyewear-grade AR for millions of users.
The industry has built stunning headsets - Apple Vision Pro, Magic Leap 2, and others - but they’re not glasses. “They’re packed full of technology with amazing capabilities,” said Rutger Thijssen of Applied Materials on stage in Eindhoven, “but they’re not something you’ll wear all day.” In Applied’s view, smart glasses aren’t just another wearable. “They’re the next interface for AI: the way AI talks to you, and the way you talk to AI. But first and foremost, they must be eyewear.”
That framing drives a very different product definition: devices that look and feel like ordinary spectacles, yet can see what you see, understand context, and overlay information without fatigue or social friction. To get there, the display system must all but disappear. “We’re working on the waveguide, the part you look through,” Thijssen said. “If we do our job really well, it becomes invisible.”
From making tools to making what people see
Applied’s starting thesis was simple: the company already makes equipment to build nanostructures that manipulate electrons. Why not use the same prowess to build nanostructures that manipulate photons? That led to a strategic shift: Applied won’t sell glasses and won’t just sell tools; it wants to supply key AR components, notably surface-relief grating (SRG) waveguides and the bespoke metrology that guarantees their quality at scale.
That change also flips a long-held assumption. In semiconductors, cosmetic defects rarely matter to end users. In eyewear, they’re fatal. “All of a sudden, cosmetic defects become critical, and they show up in weird ways,” Thijssen noted. Hence, a new layer of inspection: in addition to CD-SEMs and defect tools, Applied is building dedicated optical metrology to measure contrast, sharpness, color uniformity, eye glow, and field performance on every die on every wafer.
How the image gets from the engine to the eye
Thijssen walked the audience through a modern SRG combiner: three gratings - input, propagation/replication, and output - create a stable eyebox while minimizing artifacts. The physics is unforgiving: the same features that diffract red at one angle will also diffract green and blue at others, and any residual structure can shimmer in ambient light. “The better the nanoscale engineering, the better the device,” he said.
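To get a feel for the wavelength dependence Thijssen described, here is a minimal sketch of the first-order grating equation at an SRG in-coupler. The pitch, refractive index, and normal-incidence assumption are placeholder values chosen for illustration, not figures from the talk.

```python
import math

# Sketch of the first-order grating equation at an SRG in-coupler, assuming
# normal incidence. Pitch and index are illustrative, not values from the talk.
n_guide = 1.9        # assumed high-index waveguide substrate
pitch_nm = 380.0     # assumed grating pitch
tir_limit = math.degrees(math.asin(1.0 / n_guide))  # minimum angle for total internal reflection

for color, wavelength_nm in [("red", 635.0), ("green", 532.0), ("blue", 460.0)]:
    s = wavelength_nm / (n_guide * pitch_nm)  # n * sin(theta) = lambda / pitch
    if s >= 1.0:
        print(f"{color}: not coupled into the guide")
        continue
    theta = math.degrees(math.asin(s))
    status = "guided" if theta > tir_limit else "leaks out"
    print(f"{color}: propagates at {theta:.1f} deg ({status}; TIR limit {tir_limit:.1f} deg)")
```

With these assumed numbers, the same grating sends red, green, and blue down the guide at roughly 62, 47, and 40 degrees respectively, which is exactly why a single structure struggles to treat all colors equally.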
The core KPI is efficiency. More light through the waveguide enables brighter images or longer battery life. But efficiency trades off against other user-critical metrics: color uniformity, eye glow, and distortion of the outside world. Materials matter too. Applied’s simulations and builds, based on an internal PDK, point to high-index films deposited and patterned with leading-edge semiconductor tools as the way to hit the required performance envelope.
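A rough back-of-envelope shows why efficiency is the core KPI: for a given brightness at the eye, the waveguide’s efficiency directly sets how hard the light engine must be driven. The target brightness and engine efficacy below are assumptions for illustration, not Applied Materials figures.

```python
# Back-of-envelope brightness/power trade-off; all values are illustrative assumptions.
target_nits_at_eye = 1000.0   # assumed outdoor-readable overlay brightness
engine_nits_per_mw = 2.0      # assumed light-engine output per mW of drive power

for waveguide_efficiency in (0.01, 0.02, 0.05):  # fraction of engine light reaching the eye
    required_engine_nits = target_nits_at_eye / waveguide_efficiency
    drive_power_mw = required_engine_nits / engine_nits_per_mw
    print(f"efficiency {waveguide_efficiency:.0%}: engine must emit "
          f"{required_engine_nits:,.0f} nits -> ~{drive_power_mw:,.0f} mW")
```

Doubling waveguide efficiency halves the display power for the same perceived brightness, which is the battery-life lever Thijssen alluded to.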
Reference designs - and the lenses you don’t see
Applied has assembled reference designs with partners: a high-performance etched waveguide co-optimized with a compact light engine, sensors, and integrated lenses. Why extra lenses? Waveguides often project at “optical infinity,” which your brain perceives as wrong for near-field overlays. “You put a small lens in front to pull the image closer,” Thijssen said. “That distorts the real world, so you add another lens to correct it.” The same lens stack also enables prescription correction without adding bulk. All of this fits in a sub-50-gram package.
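To make that push/pull lens pair concrete, here is a minimal thin-lens sketch in diopters. The specific powers (a -0.5 D eye-side lens, a +0.5 D world-side lens, a -2.0 D prescription) are assumptions for illustration; the talk did not quote values.

```python
# Thin-lens vergence sketch of the push/pull lens pair described above.
# Lens powers and the prescription are illustrative assumptions.
def virtual_image_distance_m(lens_power_d: float) -> float:
    """Distance of the virtual image for collimated (optical-infinity) input."""
    return float("inf") if lens_power_d == 0 else abs(1.0 / lens_power_d)

eye_side_d = -0.5    # negative lens between waveguide and eye pulls the overlay closer
world_side_d = +0.5  # matching positive lens cancels the distortion of the real world

print(f"overlay now appears at ~{virtual_image_distance_m(eye_side_d):.1f} m")  # ~2.0 m
print(f"net power the real world sees: {eye_side_d + world_side_d:+.1f} D")     # 0.0 D

# A wearer's prescription can be folded into the same stack instead of a separate insert:
prescription_d = -2.0
print(f"eye-side element with prescription folded in: {eye_side_d + prescription_d:+.1f} D")
```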
Unlike phones with a few dozen SKUs, glasses are deeply personal: pupillary distance, face geometry, and prescriptions. Yet the industry must manufacture millions of parts with semiconductor-style repeatability. That’s why Applied is building a fab with GlobalFoundries in Singapore. “Santa Clara is great for R&D,” Thijssen quipped, “not for volume.” Tools are moving in; the team is “slowly lighting it up.”
Scaling is the hardest cultural shift. “R&D and pilot runs speak one language,” he said. “High-volume manufacturing speaks another.” Tacit know-how must become robust playbooks: prove a recipe in one chamber, port it to another, then replicate it across the world. “Navigating that is a huge list of very practical things you don’t always think about in the CTO’s office.”
The open problems - and a call to collaborate
Asked about customers and timelines, Thijssen stayed within NDAs, but confirmed demand is real and urgent. The biggest external dependency? Brighter light engines. “Our job is making an efficient waveguide matched to the engine,” he said. “At the end of the day, it’s how many photons you can get per watt.” He invited continued discussion with light-engine experts, one of several threads Applied is weaving into a broader development platform that spans optics, compute (on-device, phone, cloud), AI software, fashion and industrial design, audio, power, and consumer channels.
If smart glasses are the primary interface for ambient AI, they must vanish into daily life: comfortable, stylish, all-day wearable. That demands HVM-grade photonics with semiconductor discipline, optical metrology you can trust, and supply chains that accommodate personalization. It’s a different kind of scaling challenge than chips, but one that the chip industry’s playbook can help solve.
“We believe glasses will be how people experience AI,” Thijssen concluded. “To make that mainstream, we need waveguides you don’t notice, displays you forget are there, and factories that can make them by the million, yet fit each face. That’s the platform we’re building.”
