Over the last three days, Meta has published three in-depth explainers about how its upcoming Meta Orion glasses work, focused on its custom silicon chips, holographic waveguides, and wireless compute puck. It’s a refreshing change of pace to see Meta let its engineers be so geekily passionate about their work.
For years, Meta’s Reality Labs division was the source of a steady stream of leaks about future Quest headsets and AR glasses — until Meta started issuing anti-leak warnings to staff and firing accused leakers.

The tech world has always been hyper-fixated on secrecy while despising leakers, leaving its biggest fans dependent on unreliable leaks and fake rumors as they wait for new products. By the time a product launches, the waters are muddied by leaks with incomplete or exaggerated information, making a device look worse before the engineers are allowed to explain the context.
That’s what makes Meta’s current approach with Meta Orion so refreshing. Yes, this is a public proof-of-concept of Reality Labs’ work rather than a marketable product, but by doling out information about these glasses now instead of waiting until 2027, Meta is informing the public about the challenges of AR glasses engineering and why it’s spending billions on research to address them.
Whether you’re an old “Glasshole,” a Ray-Ban Meta smart glasses owner waiting for an upgrade, or a Quest fan curious about AI glasses in general, I highly recommend you read Meta’s three blog posts for yourself. But I’ll summarize the highlights below.
Making custom chips was like ‘building the ship as it sails out of the harbor’
(Image credit: Meta)
In its blog post titled How Our Custom Silicon & Chips Are Revolutionizing AR, Meta explains why its AR glasses are “completely dependent” upon custom silicon instead of mobile-style chipsets.
“The glasses form factor can only dissipate so much heat, and it can only accommodate so much battery capacity,” the post explains. So “if you hold constant the thermal and battery capacities, the only way to deliver a given experience is to optimize the silicon.”
Director of SoC Solutions Robert Shearer explains that they had to “reduce power consumption by a factor of 100” before Meta Orion glasses would work.
Some of that came down to the wireless puck, but even that required “custom compression protocols to reduce bandwidth and power consumption as data moved from the compute puck to the display,” says Director of Product Management Neeraj Choubey. He also described how they used machine learning to reduce the power needed for on-glasses eye and hand tracking.
The post proceeds to explain how this team also designed the custom silicon in the Orion microLED displays. Display Architecture Technical Director Mike Yee explains how they had to reduce the gap between pixel centers from “tens of microns” on phones to “the single digits” — a density only achievable on silicon.
Then, they needed to develop a “custom power management chip” that fit in the “tiny corner” of Orion’s displays to power the holographic content.
This only scratches the surface of the blog post, which is mostly a platform for this team to brag about how they’re “breaking free from traditional mental models” to make custom silicon. But considering the years of reports about Meta laying off this team or relying on MediaTek for its silicon, I’d say they deserve some time in the spotlight.
Silicon-carbide took holographic lenses from a ‘disco’ to a ‘symphony’
(Image credit: Meta)
The post on Our Silicon Carbide Waveguides & the Path to Orion’s Large FoV explains how Meta took advantage of a surplus of silicon-carbide material originally produced for EVs, making it a “viable option for AR glasses” because of its “high refractive index.”
Optical Scientist Pasqual Rivera says traditional glass lenses make AR glasses an unsightly rainbow “disco.” Silicon carbide was initially so “nitrogen-doped” that it was too dark to ever be used for glasses, but Meta’s team realized that it’s a “total game changer” with incredible clarity when it’s not optimized for cars.
AR Waveguides Tech Lead Giuseppe Calafiore explained how they previously had to stack multiple glass plates to hit a refractive index of 1.8, making glasses “prohibitively large and ugly.” So they tried different materials and eventually settled on silicon carbide, which can hit a 2.7 index with a single plate for better “etendue” — roughly 50% better light spread for a wider field of view (FoV).
The post explains that wider FoV on AR glasses can cause “ghost images and rainbows” but that silicon carbide has properties to reduce them, as well as better thermal conductivity.
That said, the problem is undoubtedly price. Meta’s post concludes with Director of Research Science Barry Silverstein saying, “We’re seeing signs that it’s possible to significantly reduce the cost,” but that implies that for now, AR glasses either have to settle for worse materials or cost an arm and a leg for a functional display.
Letting XR nerds behind the scenes is the right call
(Image credit: Meta)
The third blog post about Orion’s computing puck explains why Meta quickly abandoned the idea of using a phone to power its glasses because “demanding performance requirements would drain the phone battery and suck away compute capacity from phone use cases.”
It then goes behind the scenes on how they envisioned the puck as a fully functional controller for AR games with 6DoF sensors and haptics, but that they’ve turned off those functions for the final version to focus more on eye/hand tracking and the EMG band. Still, one could imagine enthusiasts and devs getting those functions working if they wanted to.
That’s partially why I think this blog post isn’t just about boasting. It explains to skeptics why Meta chose a puck instead of a phone, heads off false rumors about the puck being a controller in the final product, and explains the power demands of AR glasses that necessitate this extra weight in your pocket.
(Image credit: Android Central)
And that brings me back to my original point: I want more companies to let their engineers geek out and explain themselves before products reach the final stage. It primes journalists and hobbyists about the technical side of these products well before launch so they’re less reliant on leaks while also building up hype for products and features.
Leaked XR headsets like the Valve Deckard have had VR super-nerds poring over Valve’s code for years, looking for hints of what to expect. I’m not saying Valve should reveal its product early, but it’s been a long 5+ years since the Index launched with almost nothing VR-related aside from the occasional Steam Link update. Some behind-the-scenes info on what Valve has been working on would’ve been a nice nod to its patient fans.
I doubt Meta will let its engineers loose to talk about the Meta Quest 4 anytime soon, nor its leaked smart glasses. But since we all know these devices are coming, it’d be nice if Meta let loose enough to give us some hints and let engineers brag publicly about their work instead of assuming it can fire everyone until the leaks stop.