Kuo: Apple unlikely to integrate rear-facing 3D sensor in 2019 iPhone
Contrary to industry expectations, Apple analyst Ming-Chi Kuo believes the company will not integrate a rear-side time-of-flight (TOF) solution in its 2019 iPhone lineup, saying the technology is not yet ready for the augmented reality revolution.
In a note to investors seen by AppleInsider, Ming-Chi Kuo says industry analysts expect Apple to incorporate rear-side TOF as it looks to develop next-generation augmented reality experiences. For example, Apple is thought to be developing an AR version of Apple Maps, potentially for use with a rumored AR headset.
According to Kuo, the distance and depth information provided by existing rear-side TOF hardware is insufficient for creating the "revolutionary AR experience" that Apple is presumably working toward.
The analyst believes a comprehensive AR ecosystem is one that integrates 5G connectivity, AR glass (a wearable, head-mounted device) and a "more powerful Apple Maps database" that includes appropriate distance and depth information. It appears Kuo, like others, assumes Apple Maps will be marketed as a "killer app" for Apple's next-gen AR experience.
Additionally, TOF tech does not improve photo-taking functionality, a major consideration for a company that touts its handsets as the best portable cameras in the world.
As such, Kuo says Apple will likely forgo rear-side TOF in 2019, instead relying on the dual-camera system first introduced with iPhone 7 Plus in 2016.
"We believe that iPhone's dual-camera can simulate and offer enough distance/depth information necessary for photo-taking; it is therefore unnecessary for the 2H19 new iPhone models to be equipped with a rear-side ToF," Kuo says.
Rumors of a rear-facing TrueDepth-style camera date back to last July, when reports claimed Apple planned to debut a rear-facing VCSEL system for AR applications and faster camera autofocus. That solution was due to arrive in what would become iPhone X, but Apple's flagship smartphone uses a single VCSEL module in its front-facing TrueDepth camera array.
Unlike TrueDepth, which measures distortion in structured light, a TOF system calculates the time it takes pulses of light to travel to and from a target. Such systems allow for extremely accurate depth mapping and can therefore assist in AR applications.
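The arithmetic behind that is simple: depth is the speed of light multiplied by half the measured round-trip time. Here is a minimal sketch of the principle, with an assumed helper name tof_depth_m; it illustrates the physics, not any vendor's implementation.

```python
# Time-of-flight principle: a light pulse travels to the target and back,
# so the one-way distance is (speed of light * round-trip time) / 2.

C = 299_792_458.0  # speed of light, m/s

def tof_depth_m(round_trip_s: float) -> float:
    """Convert a measured round-trip pulse time (seconds) to depth (meters)."""
    return C * round_trip_s / 2.0

# A pulse returning after ~6.67 nanoseconds implies a target ~1 meter away.
print(tof_depth_m(6.67e-9))  # ~1.0
```

The nanosecond timescales involved are part of why TOF sensing requires specialized photodetector hardware rather than a conventional camera sensor.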
The July rumor was followed by a second report in November claiming much the same, while analysts jumped on the bandwagon in February.
Comments
IMHO none.
These solutions normally have one primary design consideration: distance. Front-facing Face ID was designed with short range in mind.
Rear-facing depth sensing would need to cover much longer distances.
Ming-Chi, is that you? Hahaha, lighten up a bit.
Also, getting info from sneaky inside sources isn’t exactly prognosticating.
Kuo has some sources in the supply chain, so he gets info on what is being made. But he’s often wrong on pricing, names, strategy, etc.: all the stuff a leaker doesn’t know.
2) One of the reasons I'm here is what I already stated in my comment: I can easily point to a dozen commenters on this forum from whom I'd rather get opinions about what Apple will likely do in the near future.