Apple Glass will get custom Apple Silicon tailored for low power & camera control
Apple is reportedly designing specialized chips tailored to the needs of Apple Glass.

Optimistic renders of what Apple Glass could look like - Image Credit: AppleInsider
Apple's internal silicon design team is behind the creation of chips used in the Apple Silicon line. After handling processing on iPhone, iPad, and Mac, the team is now working toward chips for other major product categories.
People familiar with the plans told Bloomberg on Thursday that the team is making chips not only for new Macs, but also for smart glasses.
Apple Glass processing
The challenge Apple faces with Apple Glass is that the hardware has to fit into a very lightweight design, one constrained by weight, physical size, and power consumption. A custom chip evidently fits in as part of the solution.
According to the report's sources, the processor for the smart glasses will be based on chips used in the Apple Watch. The versions used in the wearable require less energy than their counterparts in Apple's mobile devices, in part because they have been customized to leave elements out.
The smart glasses chip, meanwhile, will also have to process feeds from multiple outward-facing cameras on the frames.
Mass production of the chips, handled by chip partner TSMC, is expected to start by the summer of 2026, with a hardware debut by the end of 2026 or into 2027.
AR and non-AR
The Apple Glass hardware itself may not necessarily offer the highly touted augmented reality functionality, as a first smart glasses release could leave that element out entirely. Even after years of development, augmented reality is still impractical to offer in smart glasses.
Apple is reportedly working on two versions of smart glasses, supposedly under the codename N401. A non-AR version would follow in the footsteps of hardware like Meta's Ray-Ban partnership, which handles calls and photography and works with a digital assistant.
For its non-AR glasses, Apple is still working on the possibility of using cameras on the headwear to scan the environment. With cameras potentially coming to the Apple Watch and AirPods as well, this could help expand the external awareness of Apple's AI.
Chips being made for the AirPods to handle this functionality are allegedly named "Glennie," while a counterpart for the Apple Watch is apparently "Nevis."
A big chip departure
The design of custom chips for smart glasses makes sense, especially when you consider Apple's current chip landscape.
For the Apple Vision Pro, Apple uses a dual-chip design, with the headset chiefly relying on the M2 for application processing. The accompanying R1 is similarly high-powered, processing input from the plethora of onboard cameras, along with positioning data and motion tracking, at extremely high speed and with minimal latency.
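To make that division of labor concrete, here is a minimal Swift sketch of the same pattern expressed in software terms: a high-priority, low-latency sensor path kept separate from heavier application work. The SensorFrame type, queue labels, and helper functions are hypothetical illustrations, not anything from Apple's actual M2/R1 setup.

    import Foundation

    struct SensorFrame {
        let timestamp: TimeInterval
        let cameraID: Int
    }

    // High-priority queue stands in for the R1-style sensor path.
    let sensorQueue = DispatchQueue(label: "sensor.fusion", qos: .userInteractive)

    // Lower-priority queue stands in for M2-style application work.
    let appQueue = DispatchQueue(label: "app.processing", qos: .userInitiated)

    // Hypothetical helpers: a cheap pose estimate and a latency-tolerant render step.
    func estimatePose(from frame: SensorFrame) -> Double { frame.timestamp }
    func render(using pose: Double) { print("rendering with pose \(pose)") }

    func ingest(_ frame: SensorFrame) {
        sensorQueue.async {
            // Tracking math stays on the fast path with a tight deadline...
            let pose = estimatePose(from: frame)
            appQueue.async {
                // ...while heavier app processing runs separately, where
                // a little extra latency is acceptable.
                render(using: pose)
            }
        }
    }

    ingest(SensorFrame(timestamp: Date().timeIntervalSince1970, cameraID: 0))

The point of the split is the same as in the headset: the motion-tracking path can never wait on application work, or the user notices immediately.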
This is all great when you're dealing with a massive amount of processing in a not exactly small package. But it becomes a problem when you're talking about smart glasses.
Smart glasses are a product category that relies on a small, spectacles-like frame, with little space available for components and a need to minimize bulk. That means any chip has to be small enough to fit into a typical pair of glasses.
Then there's the problem of available resources, which will be considerably scarcer than in a VR headset, simply to maintain the aesthetics of a pair of spectacles. With less battery capacity and no opportunity for active cooling, a chip's potential is severely constrained.
One way around this is to offload processing to another device, such as an iPhone kept in the user's pocket. An iPhone has greater processing potential thanks to its much larger battery, and it can feasibly stay cooler for longer, leaving the on-glasses chips free for other local tasks, as the sketch below illustrates.
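Here is a minimal Swift sketch of that companion pattern, assuming a hypothetical CompanionLink transport; Apple has published no such API, and the names here are invented for illustration. The glasses do only the cheap local step, then hand the heavy work to the phone.

    import Foundation

    // CompanionLink and its single method are hypothetical stand-ins
    // for whatever radio link pairs the glasses with an iPhone.
    protocol CompanionLink {
        func send(_ payload: Data) async throws -> Data   // round trip to the phone
    }

    struct GlassesPipeline {
        let link: CompanionLink

        // The on-glasses chip only downscales/encodes the frame; the
        // paired iPhone does the expensive inference and returns the result.
        func process(frameData: Data) async throws -> Data {
            let compressed = compress(frameData)      // cheap, local
            return try await link.send(compressed)    // expensive, remote
        }

        private func compress(_ data: Data) -> Data {
            // Stand-in for real on-device downscaling or video encoding.
            data.prefix(max(1, data.count / 4))
        }
    }

The design trade-off mirrors the one in the paragraph above: keep the local step and the radio payload small, so the glasses' chip, battery, and thermals stay within spectacle-sized limits.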
Rumor Score: Likely
Comments
The headset was never going to be a huge deal. Glasses have the potential to be really big, especially if they look like a slick, high-end pair of glasses/sunglasses - even if they partner with Oakley, etc.
From what I've read, I think the basic mistake Apple is making is to try to come up with a standalone AR Glass product, rather than making it a companion product to the iPhone. By doing so, these glasses just have to do way too much given the space/weight limitations of glasses: more CPU power needed, more RAM, much more battery. As a companion product it would simply pass camera/sensor data on to the phone and display whatever the phone tells it to display.
At least Apple doesn't have to worry about Intel, Meta, Google, Nvidia, or Microsoft; it's far beyond their pay grade. However, Qualcomm may be the only one, if it can get Microsoft (OS) on board.
"Meta recently sent an email to Ray-Ban Meta users that said, in part, "Meta AI with camera use is always enabled on your glasses unless you turn off ‘Hey Meta,'” and “the option to disable voice recordings storage is no longer available.” Basically, Meta is vowing to look at what I'm looking at and store whatever I say, so you could argue there are some pretty big privacy concerns."
NOBODY I know wants cameras and microphones pointed at them by some idiot wearing what should properly be called "Incel Glasses." In terms of social interaction, they are anything but "smart," and the original "glasshole" name applied to wearers of Google Glass that debuted a dozen years ago still very much applies. I cannot imagine Apple getting into such a socially repulsive business.