Apple Glass will get custom Apple Silicon tailored for low power & camera control

Posted in Future Apple Hardware, edited May 8

Apple is reportedly tailoring specialized chips with the needs of Apple Glass in mind.

Optimistic renders of what Apple Glass could look like - Image Credit: AppleInsider



Apple's internal silicon design team is behind the creation of chips used in the Apple Silicon line. After handling processing on iPhone, iPad, and Mac, the team is now working toward chips for other major product categories.

People familiar with the plans told Bloomberg on Thursday that the team isn't just making chips for new Macs. It's also working on chips for smart glasses.

Apple Glass processing

The challenge that Apple faces for Apple Glass is that it has to fit into a very lightweight design, one with tight constraints on weight, physical size, and power consumption. A custom chip evidently fits in as part of the solution.

According to the report's sources, the processor for the smart glasses will be based on chips used in the Apple Watch. The versions used in the wearable require less energy than their counterparts in Apple's mobile devices, in part because they have been customized to leave elements out.

The smart glasses chip, meanwhile, will also have to process feeds from multiple cameras on the frames, pointing out into the world.

Mass production of the chips is expected to start by the summer of 2026 under chip partner TSMC, with a debut in hardware by the end of 2026 or into 2027.

AR and non-AR



The Apple Glass hardware may not necessarily offer the highly touted augmented reality functionality, as a first smart glasses release could leave that element out entirely. Even after years of development, augmented reality is still impractical to offer in smart glasses.

Apple is reportedly working on two versions of smart glasses, supposedly under the codename N401. A non-AR version will follow in the footsteps of hardware like Meta's Ray-Ban partnership, which can handle calls and photography, and works with a digital assistant.

For its non-AR glasses, Apple is still working on the possibility of using cameras on the headwear to scan the environment. With cameras potentially being added to the Apple Watch and to AirPods, this could help expand the external awareness of Apple's AI.

Chips being made for the AirPods to handle this functionality are allegedly named "Glennie," while a counterpart for the Apple Watch is apparently "Nevis."

A big chip departure



The design of custom chips for smart glasses makes sense, especially when you consider Apple's current chip landscape.

For the Apple Vision Pro, Apple uses a dual-chip design, with the headset chiefly using the M2 to handle application processing. The accompanying R1 is similarly high-powered, processing input from the plethora of onboard cameras, as well as positioning data and motion tracking, at extremely high speed and with minimal latency.

This is all great when you're dealing with a massive amount of processing in a not exactly small package. But it becomes a problem when you are talking about smart glasses.

Smart glasses are a product category that relies on a small, spectacles-like frame, with little space available for components and a need to minimize bulk. That means any chip design has to be small enough to fit into a typical pair of glasses.

Then there's the problem of available resources, which will be considerably lower than in a VR headset, simply to maintain the aesthetics of spectacles. With less battery to draw on and no opportunity for active cooling, a chip's potential is severely constrained.

One way around this is to offload the processing to another device, such as an iPhone kept in the user's pocket. An iPhone has greater processing potential thanks to its much larger battery, and could feasibly stay cooler for longer, leaving the on-glasses chips to handle other local tasks.
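As a rough illustration of that split, here is a minimal Swift sketch. Everything in it is hypothetical: SensorFrame, OnGlassesProcessor, and CompanionOffloadProcessor are invented names, and the wireless link is reduced to a plain closure, since there is no real Apple Glass API to reference. It only shows the shape of the architecture, with cheap work staying on the glasses and heavy frames shipped to a paired phone.

```swift
import Foundation

// Hypothetical sketch only -- none of these types are real Apple APIs.

/// A single camera frame captured on the glasses (hypothetical).
struct SensorFrame {
    let timestamp: TimeInterval
    let pixels: Data
}

/// Anything that can handle a frame.
protocol FrameProcessor {
    func process(_ frame: SensorFrame)
}

/// Lightweight, low-power work that stays on the glasses chip,
/// such as exposure control or wake-word gating.
struct OnGlassesProcessor: FrameProcessor {
    func process(_ frame: SensorFrame) {
        print("local pass on frame @ \(frame.timestamp)")
    }
}

/// Heavy work is serialized and shipped to the paired phone.
/// The wireless link is represented by a plain closure here.
struct CompanionOffloadProcessor: FrameProcessor {
    let send: (Data) -> Void

    func process(_ frame: SensorFrame) {
        send(frame.pixels)
    }
}

// Route each frame: cheap work locally, expensive work to the phone.
let local = OnGlassesProcessor()
let offload = CompanionOffloadProcessor { payload in
    print("sent \(payload.count) bytes to the paired iPhone")
}

let frame = SensorFrame(timestamp: Date().timeIntervalSince1970,
                        pixels: Data(count: 1_024))
local.process(frame)
offload.process(frame)
```

In practice, any real implementation would compress and rate-limit those frames before sending them, since the wireless link and its power draw are exactly the constraints the rumored low-power chip is meant to respect.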


Rumor Score: Likely

Read on AppleInsider

Comments

  • Reply 1 of 10
    9secondkox2 Posts: 3,432 member
    Glasses are the only way to go. 

    However, AR and non-AR versions don't make sense. 

    It needs to be what it is and not convolute the market. They can do that later on after the thing gains traction. 

    Kind of like the phone. It was first just the iPhone. It did everything an iPhone can do. Later, it became the base model and the pro, etc. 

    The headset was never going to be a huge deal. Glasses have the potential to be really big, especially if they look like a slick, high-end pair of glasses/sunglasses - even if they partner with Oakley, etc. 
    edited May 8
    twolf2919
     1Like 0Dislikes 0Informatives
  • Reply 2 of 10
    twolf2919 Posts: 173 member
    "Even after years of development, augmented reality is still impractical to offer in smart glasses." - that's absurd.  If Google was able to come up with useful (albeit mocked) Google Glass AR glasses a dozen years ago, surely Apple can do that and better, given there've been a dozen years of technological improvements and miniaturization since then!

    From what I've read, I think the basic mistake Apple is making is to try and come up with a standalone AR Glass product, rather than making it a companion product to the iPhone. By doing so, these glasses just have to do way too much given the space/weight limitations of glasses. More CPU power needed; more RAM; much more battery. As a companion product it would simply pass camera/sensor data on to the phone and display whatever the phone tells it to display.
    edited May 8
    williamlondonsurgefilterAlex1Ndanox
     2Likes 2Dislikes 0Informatives
  • Reply 3 of 10
    Basic common sense to let the ultra powerful iPhones do all the heavy-lifting and Bluetooth them to the glasses. That way the frames can be ultra slim, metal or like regular glasses instead of all of the current manufacturers going down the JOE 90 route.
    Alex1Ndanoxtiredskills
     1Like 2Dislikes 0Informatives
  • Reply 4 of 10
    9secondkox2 Posts: 3,432 member
    Basic common sense to let the ultra powerful iPhones do all the heavy-lifting and Bluetooth them to the glasses. That way the frames can be ultra slim, metal or like regular glasses instead of all of the current manufacturers going down the JOE 90 route.
    That solves a lot of issues while also not cannibalizing the iPhone. 
    Alex1Ndanox
     1Like 1Dislike 0Informatives
  • Reply 5 of 10
    beowulfschmidt Posts: 2,410 member
    Optimistic renders of what Apple Glass could look like - Image Credit: AppleInsider

    Read on AppleInsider

    I suspect that's very optimistic.


    danoxwilliamlondon
     1Like 1Dislike 0Informatives
  • Reply 6 of 10
    CarmB Posts: 114 member
    If Apple is working towards standalone smart glasses, that makes sense. Yet, it's clear that such a device is many years away. Current technology is simply incapable of delivering such a product in an appealing form. Smart glasses tethered to an iPhone, on the other hand, are much more viable using currently available tech. Surely the rational approach is to launch a tethered version while concurrently exploring a standalone version as a future upgrade. Considering the installed base of iPhones, surely having to own an iPhone to make use of the glasses is hardly an impediment. 
    Alex1Ndanox
     2Likes 0Dislikes 0Informatives
  • Reply 7 of 10
    danox Posts: 3,737 member
    CarmB said:
    If Apple is working towards standalone smart glasses, that makes sense. Yet, it's clear that such a device is many years away. Current technology is simply incapable of delivering such a product in an appealing form. Smart glasses tethered to an iPhone, on the other hand, are much more viable using currently available tech. Surely the rational approach is to launch a tethered version while concurrently exploring a standalone version as a future upgrade. Considering the installed base of iPhones, surely having to own an iPhone to make use of the glasses is hardly an impediment. 

    Apple took a small step forward with a new in-house C1 modem, and even that is not enough. Apple glasses are years away: everything has to be miniaturized while delivering even better functionality and greater computing power than the current Apple Vision. (Years away, like M15 and C10 away.)

    At least Apple doesn't have to worry about Intel, Meta, Google, Nvidia, or Microsoft; it's far beyond their pay grade. However, Qualcomm may be the only one, if they can get Microsoft (OS) on board.
    edited May 10
    neoncat
     0Likes 1Dislike 0Informatives
  • Reply 8 of 10
    charlesn Posts: 1,453 member
    Here's the latest of what Meta is up to with its RayBan glasses--and honestly, WHO could be surprised:

    "Meta recently sent an email to Ray-Ban Meta users that said, in part, "Meta AI with camera use is always enabled on your glasses unless you turn off ‘Hey Meta,'” and “the option to disable voice recordings storage is no longer available.” Basically, Meta is vowing to look at what I'm looking at and store whatever I say, so you could argue there are some pretty big privacy concerns."

    NOBODY I know wants cameras and microphones pointed at them by some idiot wearing what should properly be called "Incel Glasses." In terms of social interaction, they are anything but "smart," and the original "glasshole" name applied to wearers of Google Glass that debuted a dozen years ago still very much applies. I cannot imagine Apple getting into such a socially repulsive business. 
    williamlondon
     1Like 0Dislikes 0Informatives
  • Reply 9 of 10
    tiredskills Posts: 109 member
    Basic common sense to let the ultra powerful iPhones do all the heavy-lifting and Bluetooth them to the glasses. That way the frames can be ultra slim, metal or like regular glasses instead of all of the current manufacturers going down the JOE 90 route.
    Lol, "basic common sense" to use a protocol with bandwidth measured in Mbps for so much visual data.  Good one.
     0Likes 0Dislikes 0Informatives
  • Reply 10 of 10
    mattinoz Posts: 2,622 member
    charlesn said:
    Here's the latest of what Meta is up to with its RayBan glasses--and honestly, WHO could be surprised:

    "Meta recently sent an email to Ray-Ban Meta users that said, in part, "Meta AI with camera use is always enabled on your glasses unless you turn off ‘Hey Meta,'” and “the option to disable voice recordings storage is no longer available.” Basically, Meta is vowing to look at what I'm looking at and store whatever I say, so you could argue there are some pretty big privacy concerns."

    NOBODY I know wants cameras and microphones pointed at them by some idiot wearing what should properly be called "Incel Glasses." In terms of social interaction, they are anything but "smart," and the original "glasshole" name applied to wearers of Google Glass that debuted a dozen years ago still very much applies. I cannot imagine Apple getting into such a socially repulsive business. 
    Volcel, not incel. We don't want to deny them their agency in the lack of human connection they make.
    charlesn
     1Like 0Dislikes 0Informatives