Apple AR headset will use Face ID tech for hand gesture tracking, says Kuo
Analyst Ming-Chi Kuo now claims that the forthcoming Apple AR headset will track hand movements, which will allow the device to interpret gestures.

In his second Apple AR report in as many days, respected analyst Ming-Chi Kuo is reporting more details of the sensors expected to be in the first of Apple's forthcoming headsets.
"Gesture control and object detection are critical human-machine UI designs of Apple's AR/MR headset," writes Kuo in a note for investors. "Apple's AR/MR headset is equipped with more 3D sensing modules than iPhones."
"We predict that Apple's AR/MR headset will have four sets of 3D sensing (vs. one to two sets for iPhone/high-end smartphones)," he continues. "We predict that the structured light of the AR/MR headset can detect not only the position change of the user or other people's hand and object in front of the user's eyes but also the dynamic detail change of the hand."
Saying that this "dynamic detail change" is similar to how Face ID can detect changes of expression, Kuo claims that capturing "the details of hand movement can provide a more intuitive and vivid human-machine UI."
There is a difference between Face ID and this hand tracking, though.
"Although both adopt structured light, the distance between the hand (user or other people's) and the object detected by the headset device needs to be longer than that seen by iPhone's Face ID," continues Kuo, "so the structured light power consumption of the headset device is higher."
"We predict that the detection distance of Apple's AR/MR headset with structured light is 100-200% farther than the detection distance of the iPhone Face ID," he says. "To increase the field of view (FOV) for gesture detection, we predict that the Apple AR/MR headset will be equipped with three sets of ToFs [Time of Flight] to detect hand movement trajectories with low latency requirements."
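To make the percentage concrete: "N% farther" means (1 + N/100) times the original distance, so "100-200% farther" is two to three times Face ID's range. A minimal sketch of that arithmetic, using an illustrative Face ID working range of roughly 0.25-0.5 m (an assumption for the example, not a figure from Kuo's report):

```python
# Kuo's claim: the headset's structured-light detection distance is
# "100-200% farther" than iPhone Face ID's.
# Assumed, illustrative Face ID range in meters (not from the report):
faceid_range_m = (0.25, 0.50)

# "100% farther" = 2x the distance; "200% farther" = 3x the distance.
headset_at_100_pct = tuple(d * 2 for d in faceid_range_m)
headset_at_200_pct = tuple(d * 3 for d in faceid_range_m)

print(headset_at_100_pct)  # (0.5, 1.0)
print(headset_at_200_pct)  # (0.75, 1.5)
```

Under those assumed numbers, the headset would need to resolve hands somewhere in the half-meter to 1.5-meter range, which is consistent with Kuo's point that the longer throw drives up structured-light power consumption.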
Kuo's focus is on the impact these components and requirements will have on companies that his clients may choose to invest in. So he reports that the firms Win Semi and Lumentum are believed to be the main providers for the headset's VCSEL (vertical-cavity surface-emitting laser) sensors.
However, he also backs up his previous report that Apple ultimately intends the AR headset to replace the iPhone.
"The innovative human-machine UI for headset devices requires the integration of many technologies," he writes. "It will be critical for headsets to replace the existing consumer electronic products with displays in the next ten years."
Read on AppleInsider

Comments
Currently VR headsets are so big and heavy that they become uncomfortable to wear after an hour or two.
Even if an AR headset were smaller, it would still need to be powered by batteries. Where do they go? In your pocket with a cord dangling down your neck?
You can shrink the vision system, but what about the rest of the electronics? It all generates heat, so how do you dissipate it if the electronics are built into a pair of glasses?
AR is great but for the best virtual experiences, you need full VR. Will the AR glasses have a full VR mode or will we need another headset for that?
Many people are very sensitive to differences between motion they see with their eyes and what they feel with their sense of balance. How do you do VR without making people sick?
Finally, the cameras. You can't have AR without some cameras to help with the tracking and scene analysis. People threw a hissy fit over the original Google Glass because it had cameras; there was at least one fight over them. Is everyone ready to accept a camera on everyone's face, constantly recording and sending the video back to Apple's servers? What happens when Apple starts scanning those videos for anything illegal, to protect kids or something?
But AR/VR seems to me like it would struggle with group work or client interaction. I can see a good business for a WeWork-type company offering group VR spaces.
I have worn both for my job in the VR/AR industry. An issue both suffer from is a very narrow field of view for the projection (it's like a stamp in the center of your glasses). I can only assume Apple solved that problem.
AR glasses work differently from VR glasses, which to your point use conventional monitors stuck in front of your eyes. I can take them for 30 minutes, but then I'm completely done with them.
This is my objection to AR glasses: I got eye surgery because I hated wearing a frame on my nose all day, and my eyes couldn't handle contact lenses, nor did I want to fiddle with those things in my eyes. I just don't want to wear something on my head other than sunglasses, so why should I go back to the situation I invested to leave?
My second objection is that it competes with the phone which already does so much and has an amazing display.
Hardware without great software across a wide range of categories diminishes its net usefulness.