3D laser scanner possibly destined for Apple's 'iPhone 8' expected to add facial recognition
The 3D laser scanning module that may make an appearance on the "iPhone 8" raises a number of questions, and a new report attempts to shed some light on the subject, including what model of iPhone it might appear on this year.
According to Rod Hall of investment firm J.P. Morgan, the sensor expected to be used in this year's "iPhone 8" is aimed at a practical face-scanning utility, and not solely or specifically at augmented reality applications.
Apple's Touch ID sensor has an estimated error rate of 1 in 50,000. Even in its earliest implementation, Hall believes, a laser-based 3D sensor, as opposed to an infrared one, will be far more accurate, improving on the already very secure Touch ID.
Basics of the technology
A 3D sensor needs a few key components: a light emitter in the form of an LED or laser diode, a light filter for a noise-free scan, and a fast image sensor coupled with a powerful signal processor. Each aspect has implementation difficulties, which Hall believes have been resolved by Apple and partner PrimeSense in a relatively inexpensive package that should add not much more than 3 percent to the build cost of an iPhone.

At first glance, the technology described resembles a miniaturized version of a LIDAR mapper or rangefinder.
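To make the rangefinder comparison concrete, the core of any time-of-flight measurement is a single relationship: distance is the speed of light multiplied by half the round-trip time of the emitted pulse. The sketch below is purely illustrative; the function name and numbers are invented for this example and do not reflect any Apple or PrimeSense API.

```swift
import Foundation

// Illustrative only: a time-of-flight rangefinder infers distance from the
// round-trip travel time of an emitted light pulse. Names and numbers are
// hypothetical, not an Apple or PrimeSense API.
func rangeFromTimeOfFlight(roundTripSeconds: Double) -> Double {
    let speedOfLight = 299_792_458.0           // meters per second
    return speedOfLight * roundTripSeconds / 2 // the pulse travels out and back
}

// A face roughly 30 cm from the sensor returns the pulse in about 2 nanoseconds.
let range = rangeFromTimeOfFlight(roundTripSeconds: 2e-9)
print(String(format: "Estimated range: %.3f m", range)) // ~0.300 m
```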
At present, at the macro scale, LIDAR is used in many physical sciences and heavily used by the military in ordnance guidance suites. Resolution and accuracy depend greatly on a combination of factors, not the least of which is the integration of the laser and the sensor -- an integration Apple's acquisition of PrimeSense appears intended to address.
Adding weight to the LIDAR comparison, a December presentation by Apple's machine learning head Russ Salakhutdinov covered the company's artificial intelligence research into "volumetric detection of LIDAR," which could be utilized in an iPhone as well as in larger platforms.
Obviously, an iPhone won't have the power necessary to blanket a large geographical area with a laser for mapping, nor the processing power to deal with the data. However, a smaller volume such as that occupied by a user is easily covered by a very low power emitter.
Miniaturized hardware
Any 3D laser scanning apparatus in a smartphone would likely be introduced in a wafer-level "stacked" optical column, rather than a conventional barrel lens.

While more complex, the stack occupies less volume in the phone and is arguably more resistant to damage. However, optical diffraction can be a problem, as can lower production yields, potentially driving up the cost or limiting the supply of devices equipped with the stack at retail.
The technology has uses beyond facial recognition. An API release, unlikely early in the hardware's life, could open the sensor up to augmented reality and virtual reality headsets, clothing sizing, accurate distance measurement for home improvement, scanning for 3D printing, appliance and HomeKit integration, and other applications needing precise volumetric scans or range-finding.
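As a rough illustration of how one of those uses, distance measurement for home improvement or clothing sizing, could fall out of a depth reading, the sketch below applies the standard pinhole-camera relationship between pixel extent, depth, and real-world size. The helper and its parameters are hypothetical, not part of any announced Apple API.

```swift
import Foundation

// Pinhole-camera model: an object spanning `pixels` on the sensor, seen at
// `depthMeters`, has a real-world extent of roughly pixels * depth / focal length
// (with the focal length expressed in pixels). Hypothetical helper, for illustration.
func realWorldSize(pixels: Double, depthMeters: Double, focalLengthPixels: Double) -> Double {
    return pixels * depthMeters / focalLengthPixels
}

// Example: a doorway spanning 1,400 pixels at a measured depth of 2.5 m, with an
// assumed focal length of 3,500 pixels, works out to about one meter across.
let doorwayWidth = realWorldSize(pixels: 1_400, depthMeters: 2.5, focalLengthPixels: 3_500)
print(String(format: "Measured width: %.2f m", doorwayWidth)) // ~1.00 m
```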
Outside Cupertino
Given Apple's ownership of PrimeSense intellectual property, the company stands to gain from other industries' implementations of the technology as well.

"Creating small, inexpensive, 3D scanning modules has interesting application far beyond smartphones. Eventually we believe these sensors are likely to appear on auto-driving platforms as well as in many other use cases," writes Hall. "Maybe a fridge that can tell what types of food you have stored in it isn't as far off as you thought."
While the "iPhone 8" is expected to be laden with all of Apple's newest technologies -- including an OLED wrap-around display, fingerprint sensor embedded in the screen glass, and others -- the 3D sensing technology may not be limited to just the high end. Hall believes that there is "more unit volume" for production of the sensor sandwich, opening up the possibilities that it will appear on the "iPhone 7s" series, in addition to the expected $1,000+ high-end "iPhone 8."
Apple purchased Israel-based PrimeSense in late 2013 for around $360 million. PrimeSense created sensors, specialized silicon, and middleware for motion sensing and 3D scanning applications, and was heavily involved in the development of Microsoft's Kinect sensor technology before the deal.
Comments
AR with an API, either internal to Apple or later for developers, could be integrated into many existing applications, and these rumors of a LIDAR module would play into that. Photography and video would both benefit from LIDAR.
As an aside, Apple should seriously look at acquiring Nikon for its optical and lithographic equipment expertise, hopefully saving the professional camera brand from its current R&D and marketing woes.
Yes. Depending on how fast a "scene" can be rendered, you may not need a second camera to determine depth of field. Ideally, every pixel within an image could contain not only color information but also distance.
Currently, using stereo optics, depth is determined by the amount of "shift" in the image. The more shift, the closer that area of the image is to the camera. If you know the viewing angle of the cameras and the distance between them, you can determine the distance to an object by how much it shifts between the two images.
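That relationship can be written down directly: depth is the focal length times the camera baseline divided by the disparity (the pixel shift between the two views). A minimal sketch with made-up numbers, not tied to any particular camera hardware:

```swift
import Foundation

// Stereo depth from disparity: depth = focal length (in pixels) * baseline / disparity.
// The larger the shift between the two images, the closer the object.
func depthFromDisparity(disparityPixels: Double, baselineMeters: Double, focalLengthPixels: Double) -> Double? {
    guard disparityPixels > 0 else { return nil } // no shift: the point is effectively at infinity
    return focalLengthPixels * baselineMeters / disparityPixels
}

// Example: two lenses 1 cm apart with a 3,000-pixel focal length; a feature that
// shifts 30 pixels between the two images is about one meter away.
if let depth = depthFromDisparity(disparityPixels: 30, baselineMeters: 0.01, focalLengthPixels: 3_000) {
    print(String(format: "Estimated depth: %.2f m", depth)) // ~1.00 m
}
```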
https://en.m.wikipedia.org/wiki/Mosquito_laser
iPhone X makes sense, though I see the model as a special edition without a follow-up, so I wouldn't be surprised if it were called the new iPhone SE. Apple may just make SEs from time to time, with their bread-and-butter iPhone 7s carrying the lineage and nomenclature forward, not their special edition.
I really hope we'll see substantially improved camera performance on the next iPhone. I recently spent some time playing with a friend's Galaxy S8 Plus and, frankly, its camera put my iPhone 6S to shame. Autofocus was dramatically faster, image quality was noticeably superior, particularly in low light, and the camera featured excellent optical image stabilization... all without the need for a huge and unsightly camera bump.
Meanwhile, its AMOLED display was much brighter and more vibrant than the iPhone's, while managing to display normal-looking skin tones without messing with the color settings.