Apple AR headset, new Mac Pro and more expected in 2022
Apple's 2022 product lineup may include the company's next big thing, with a new Mac Pro, Apple's AR and VR headset, and an iPad Pro with wireless charging all expected to debut.

As 2021 draws to a close, the rumor mill turns towards 2022, offering expectations of what Apple will reveal and show to the public in the year ahead. According to one report, 2022 is the year that Apple will finally introduce its AR headset.
In Sunday's "Power On" newsletter for Bloomberg, Mark Gurman proposes what he believes Apple will be bringing out in 2022, following what he describes as a "modest year for Apple product releases."
Among the new products brought up by Gurman is Apple's AR headset, an often-rumored piece of hardware. While Gurman doesn't really break new ground with the report, he does seemingly give credence to other claims about the device.
"Gaming should be a strong focus of the machine," he says, before mentioning its use of multiple processors, a cooling system, high-resolution displays, and its own App Store. "Look for Apple to position the device as a dream for game developers."
Apple is also anticipated to work with media partners to create VR media, and to include a "VR FaceTime-like experience" using the headset.
Gurman's other expected product launches seem quite pedestrian and guessable, such as updates to the iPad line, including a refreshed iPad Pro with wireless charging. In June, reports pointed to a "glass sandwich" design for the iPad Pro that would enable wireless charging.
There are also suggestions about an iPhone SE with 5G connectivity, the usual iPhone 14 refresh, three Apple Watch models including a new SE and a ruggedized version, and revised AirPods Pro. On the Mac side, a major MacBook Air revamp with M2 is expected, along with updates to the Mac mini, a larger iMac, and a Mac Pro with Apple Silicon.
Read on AppleInsider

Comments
Either way, Apple needs to get its 3D graphics incredibly power efficient or it's going nowhere.
Almost as if he doesn't know anything about what's going on.
For mainstream use, movies offer a big opportunity here. VR gaming is still quite a small market due to the hardware price but there is potential to expand it.
Offering people the ability to have a virtual cinema screen in their own home has huge appeal, and it's 3D. Disney movies are almost all CGI, so kids could watch a movie and have the characters hop out of the screen and walk around them. With LiDAR tracking on the front, the glasses can track the wearer's hands, so a movie like Peter Pan could have Tinkerbell fly out of the screen and land on a kid's hands.
Horror movies can have hands grabbing at the viewer from over their shoulder. Alien movies can fill the floor with fog or have the alien crawling over the ceiling.
Combined with spatial audio, it can offer a very immersive experience and with Apple TV+, Apple has the means to distribute movies that use its capabilities best.
For better clarity in bright light and a VR mode, they can use dynamic tinting with electrochromic glass, shown on the following page:
https://www.tuvie.com/dusk-electrochromic-smart-sunglasses-with-dynamic-lenses/
Holographic FaceTime is a possibility; this would need the glasses to scan the wearer's face, or to generate a face from the wearer's voice, which has some interesting possibilities.
You would effectively see a person in the same room. This could be used for remote education and would be a lot more effective than trying to run a remote classroom over flat video. Remote fitness classes would work better with a full-body scanner as an accessory, possibly an iPhone.
For some people it could replace an iPad or iPhone, like students taking notes in class. They could type on a hardware keyboard while using the view in the glasses instead of an iPad screen.
and games and Apple?
However, I want to know more about the 2022 iPad Pro. Sure, wireless charging and an all-glass design; everyone is predicting that. But is it getting an M2? Is it coming out in the spring or the fall?
Ready Player One-level VR ubiquity is still far in the future, long after old codgers like me are gone. AR is my bet and my hope.
2022 will probably (hopefully) be focused on getting a decent iMac and Mac mini out the door. I doubt we will see much of a decent Mac Pro yet; maybe a semi-pro model with connected M2s so it's somewhat affordable, and then the big one in 2023.
Your overall points about the technology are valid, and I don’t disagree with them, but let us pay proper respects to human eyesight.
Human eyesight is a fantastically incredible mechanism, with so much “processing” and so many “features” that we physicians can only begin to fathom the most basic parts of it. The eyeball is only a small part of human eyesight. It lets the light in and does some simple analog tricks to separate out frequency and different fields of your vision. It does this by refracting parts of the image onto parts of the retina, with different receptors (many lay people hear about rods and cones, but that's the tip of the iceberg for “processing” in the eyeball). The information then gets essentially converted to digital. There are several layers that the image overlays, and different nerve fibers carry this off to different sectors, criss-crossing behind the eyes like some complex freeway interchange and racing through so many ancillary processing systems on the way to the “optic” centers that it boggles the mind.

These images can activate thousands of different brain responses before the “seeing” processing in the back part of the brain even takes place, depending on what is coming in through the eye. While I refer to this as an image, the processing utilizes far more information taken in through the eye than what you may think of as an “image.” As this information comes in, it is analyzed for a fantastic amount of information before an image is produced, best we can tell. These centers can identify basic threats to safety, process your position in 3D space, and feed real-time movement information to your balance centers and, separately, to your multiple threat systems, depending on what the “pre-processing” has so far taken from the input. All this happens in the more basic centers long before you “see” the images. There are far too many functions here to even brush the surface; there are many, many thick tomes on the topic that cause sleepless nights for medical students, specialized neurologists, scientists, researchers, and the like.

Those are just the basic processing features. Then, when your brain sends a processed image to your higher brain, the spectrum of possibilities for further processing and responses blooms almost into the infinite. All this happens in the tiniest fraction of a second, before you even get the chance to consciously register the image. Eyesight also processes longitudinally over time, not just in “snapshots”; it is processing of fluid data streaming in continuously, not just “frames.” These things are really remarkable, and we barely understand enough to even know they are there to research. Human eyesight can process something you have NEVER seen before, stratify a threat level, and cross-categorize it using so many variables that you can react reasonably to it within a fraction of a second without having a damned clue what it is.
Computerized “vision” is mostly made up of really cool but really simple tricks that do things like shift wavelengths or apply simple magnification, allowing us to see things we would not normally register. This shifted light information is then dumped into the eye for the real processing. Identification of objects using “machine learning” is another really amazing trick, but it isn't one-millionth as incredible as human eyesight and processing. The computer's “sight” is extremely rudimentary: an algorithm essentially pattern-matches something in an image against a known database. These mechanisms are getting very sophisticated, and they seem like magic to some people.
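To put that “pattern-matching” point in concrete terms, here is a minimal sketch using Apple's Vision framework, whose built-in classifier plays the role of the “known database.” The classify() helper is purely illustrative; only the Vision calls themselves are real API:

```swift
import CoreGraphics
import Vision

// Minimal sketch of "pattern-match against a known database":
// VNClassifyImageRequest matches the image against Vision's fixed
// taxonomy of known labels. classify() is an illustrative helper.
func classify(_ image: CGImage) throws -> [(label: String, confidence: Float)] {
    let request = VNClassifyImageRequest()
    let handler = VNImageRequestHandler(cgImage: image, options: [:])
    try handler.perform([request])
    let observations = (request.results as? [VNClassificationObservation]) ?? []
    // Return the top five matches with their confidence scores.
    return observations.prefix(5).map { (label: $0.identifier, confidence: $0.confidence) }
}
```

The whole "trick" is a lookup against labels someone already trained in, which is exactly the gap with human eyesight described above: show it something genuinely outside its database and it has nothing to say.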
While I LOVE the things that AR is bringing our way, I would argue that computer vision has not exceeded the ability of human eyesight, and probably never will. What it will do is augment our vision by dropping some extra data into the eyeball that human eyesight will process. I think we will see all the things you discussed coming into play in the very near future, and I'll be one of the first to get on board, especially if it's an Apple product that I know will be very well supported.
What Nussbaum did was pay attention to how images look best on the transparent LCD, and then optimize the image capture side with a matching white background. The slight shadow behind the subject adds an extra illusion of depth, but make no mistake: it is not a 3D hologram by any stretch. The effect looks very good IRL, but these videos and photos of the experience make it look much better than it really is. He deserves props for his attention to detail, but it is not as groundbreaking as it seems.
https://forums.appleinsider.com/discussion/comment/3345881/#Comment_3345881
I think the closest product to what Apple would make is the Nreal AR glasses:
https://forums.appleinsider.com/discussion/comment/3348410/#Comment_3348410
That product is already on the market, and they discovered 78% of users were watching movies with the glasses, while others were developing software for them. They have designed lighter models without the motion tracking, just for watching movies:
https://www.theverge.com/2021/9/30/22700782/nreal-air-smart-ar-glasses-release-date-shipping-countries
LG U+ is bundling a model of these with a smartphone, as the glasses need a phone to work:
https://www.koreatechtoday.com/lg-uplus-nreal-to-launch-5g-powered-ar-glasses-u-real-glass/
This reminds me of the LG Prada full-screen touch phone that came out before the iPhone. They clearly had an idea of what to make but no idea of how to make it good. This one has a bulky cable connected to the phone and uses the phone for pointing at things in 3D. It uses plastic clip-ons, and the one for VR mode covers the tracking cameras. This is what Apple means when it talks about a thousand no's for every yes; these companies never learn. It's easy to imagine an Apple implementation of that kind of wearable and how much better it would be.
There's a video showing some views through the glasses. Ideally the image quality would be closer to what's shown at 0:28 than at 0:32, which looks like a projector screen; this can be achieved in VR mode by blocking incoming light from behind, though it would be best done at a more fine-grained level. It does a good job of showing what can be done with over-200-inch virtual displays placed anywhere. You could leave virtual displays all around the house and they'd stay in place: a cooking timer and recipe book next to the cooker, a calendar and clock on the wall, a TV in the lounge and bedroom, each appearing in view as you walk around the house. It makes the computer conform to the user's environment.
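That "stays in place" behavior is essentially what ARKit's world anchors already do on iPhone and iPad. A minimal sketch, assuming some renderer (RealityKit, SceneKit, or Metal) attaches screen geometry at each anchor; the VirtualDisplayPlacer type is hypothetical, but the ARKit calls are real:

```swift
import ARKit

// Hypothetical sketch: pin named virtual displays to real-world
// positions so they stay put as the wearer walks around.
final class VirtualDisplayPlacer: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        let config = ARWorldTrackingConfiguration()
        config.planeDetection = [.horizontal, .vertical] // find walls and counters
        session.delegate = self
        session.run(config)
    }

    // Drop an anchor where a virtual screen should live, e.g. a cooking
    // timer next to the cooker or a calendar on the wall.
    func placeDisplay(named name: String, at transform: simd_float4x4) {
        session.add(anchor: ARAnchor(name: name, transform: transform))
    }

    // ARKit reports anchors as they're added; a renderer would attach
    // the screen geometry at each anchor's transform here.
    func session(_ session: ARSession, didAdd anchors: [ARAnchor]) {
        for anchor in anchors where anchor.name != nil {
            print("placed display:", anchor.name!, "at", anchor.transform)
        }
    }
}
```

ARKit can also serialize the session's ARWorldMap, which is what would let those displays survive between sessions rather than only within one.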
Exactly, vertical integration is what Apple does best. They make the platform end-to-end and now have content distribution. Other manufacturers have to play around with third-party operating systems and components and don't have the scale to drive third-party software development. Apple can have its own arOS, and its hardware, like the LiDAR scanner, is hard for others to get in that form factor.
For Apple's use case, the full-body scanning and projection are more important; the glasses would be the projector instead of the giant vending machine. TED talks, lecturers, webcam performers, etc. could broadcast to a global audience in 3D with just an iPhone/iPad on one end and AR glasses on the other. Even real-time CGI faces are becoming very realistic, and these could either be used Memoji-style or to fill in areas that a projection scan misses.
https://www.theverge.com/22819963/snap-ar-spectacles-glasses-hands-on-pictures-design-features
That shows it's possible to put the processing and battery in a compact standalone model, but the battery life is 30 minutes, partly due to the 2,000-nit display needed to show up in bright light. If the glasses can operate in a low-power mode when standalone, say 0.5W average, they could push a standalone unit to about 4 hours of runtime. Then it would just need a battery tether, possibly with a MagSafe connection. The AirPods Pro case is 2Wh; that would need to be around 10Wh for AR. Someone could wear that like a necklace/holster/hip attachment, with a single wire at the back connecting to the glasses for longer battery life. People could buy multiple packs and swap them for all-day battery life.
If they could use a different kind of battery that is fast-charging, low-capacity, and high-endurance, like a solid-state battery, short battery life wouldn't matter so much. The standalone runtime would be best at around 2+ hours to work for movies, with a 2-5 minute charge time.
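As a sanity check on those numbers, runtime is just capacity divided by average draw. A quick sketch of the arithmetic, using the figures assumed in these comments rather than any measured specs:

```swift
// Back-of-envelope runtime math. All inputs are this thread's
// assumptions (0.5W average draw, 2Wh internal cell, 10Wh tether
// pack), not measured hardware specs.
func runtimeHours(capacityWh: Double, averageDrawW: Double) -> Double {
    capacityWh / averageDrawW
}

let internalCell = 2.0  // Wh, roughly an AirPods Pro case worth of battery
let tetherPack   = 10.0 // Wh, the suggested necklace/hip pack
let lowPowerDraw = 0.5  // W, assumed standalone average

print(runtimeHours(capacityWh: internalCell, averageDrawW: lowPowerDraw)) // 4.0 h standalone
print(runtimeHours(capacityWh: tetherPack, averageDrawW: lowPowerDraw))   // 20.0 h on the tether
```

At a heavier draw, say 2.5W with the display cranked up for bright light, the same 10Wh pack drops to about 4 hours, which is why the low-power standalone mode matters so much.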