3D laser scanner possibly destined for Apple's 'iPhone 8' expected to add facial recognition

The 3D laser scanning module that may make an appearance on the "iPhone 8" raises a number of questions, and a new report attempts to shed some light on the subject, including what model of iPhone it might appear on this year.

According to Rod Hall of investment firm J.P. Morgan, the sensor expected to be used in this year's "iPhone 8" is aimed at a practical face-scanning utility, and not solely or specifically at augmented reality applications.

Apple's Touch ID sensor has an error rate of roughly 1 in 50,000. Hall believes that even in its earliest implementation, a laser-based 3D sensor, as opposed to an infrared one, will be far more accurate, surpassing the already very secure Touch ID.

Basics of the technology

A 3D sensor needs a few key features -- a light emitter in the form of an LED or laser diode, a light filter for a noise-free scan, and a fast image sensor coupled with a powerful signal processor. Each aspect has implementation difficulties, which Hall believes Apple and partner PrimeSense have resolved in a relatively inexpensive package that should add no more than about 3 percent to the build cost of an iPhone.

At first glance, the technology described resembles a miniaturized version of a LIDAR mapper or rangefinder.

At present, at the macro scale, LIDAR is used across the physical sciences and heavily by the military in ordnance guidance suites. Resolution and accuracy depend greatly on a combination of factors, not least the integration of the laser and the sensor -- which Apple's acquisition of PrimeSense appears intended to address.
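The principle behind any LIDAR rangefinder is time-of-flight: a laser pulse travels to the target and back, and the round-trip time gives the distance. A minimal sketch, with purely illustrative numbers (not Apple specifications):

```python
# Time-of-flight ranging: distance = (speed of light * round-trip time) / 2.

C = 299_792_458.0  # speed of light, m/s

def tof_distance_m(round_trip_s: float) -> float:
    """Distance to a target given the laser pulse's round-trip time."""
    return C * round_trip_s / 2.0

# A face about 30 cm from the phone returns the pulse in roughly 2 ns.
print(tof_distance_m(2e-9))  # ≈ 0.3 m
```

The tiny times involved are why the article stresses a "speedy image sensor coupled with a powerful signal processor": at face-scanning distances, useful depth resolution requires timing the return on the order of picoseconds.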

Adding weight to the LIDAR comparison, a presentation in December by machine learning head Russ Salakhutdinov covered Apple's artificial intelligence research in "volumetric detection of LIDAR," which could be utilized in an iPhone, as well as in a larger platform.

Obviously, an iPhone won't have the power necessary to blanket a large geographical area with a laser for mapping, nor the processing power to deal with the data. However, a smaller volume such as that occupied by a user is easily covered by a very low power emitter.

Miniaturized hardware

Any 3D laser scanning apparatus in a smartphone would likely be introduced in a wafer-level "stacked" optical column, rather than a conventional barrel lens.

While more complex, the stack occupies less volume in the phone and is arguably more resistant to damage. However, optical diffraction can be a problem, as can lower production yields, which could drive up cost or limit the supply of devices equipped with the stack at retail.

The technology has uses beyond facial recognition. An API release, though probably unlikely early in the hardware's life, could open up other applications: augmented reality and virtual reality headsets, clothing sizing, accurate distance measurements for home improvement, scanning for 3D printing, appliance and HomeKit integration, and anything else needing precise volumetric scans or range-finding.

Outside Cupertino

Given Apple's ownership of PrimeSense intellectual property, the company stands to gain from other industries' implementations of the technology as well.

"Creating small, inexpensive, 3D scanning modules has interesting application far beyond smartphones. Eventually we believe these sensors are likely to appear on auto-driving platforms as well as in many other use cases," writes Hall. "Maybe a fridge that can tell what types of food you have stored in it isn't as far off as you thought."

While the "iPhone 8" is expected to be laden with all of Apple's newest technologies -- including an OLED wrap-around display, a fingerprint sensor embedded in the screen glass, and others -- the 3D sensing technology may not be limited to the high end. Hall believes there is "more unit volume" for production of the sensor sandwich, opening up the possibility that it will appear on the "iPhone 7s" series in addition to the expected $1,000+ high-end "iPhone 8."

Apple purchased Israel-based PrimeSense in late 2013 for around $360 million. PrimeSense created sensors, specialized silicon, and middleware for motion sensing and 3D scanning applications, and was heavily involved in the development of Microsoft's Kinect sensor technology before the deal.

Comments

  • Reply 1 of 14
    Might this also have enhanced camera applications?
  • Reply 2 of 14
tmay Posts: 6,309 member
    Might this also have enhanced camera applications?
    Apple is making the correct choice by focussing on AR first over VR. My own opinion is that VR is still, and will continue to be, niche in the performance/form factor that is available today, and the target audience is essentially gaming and research for the most part, albeit there is quite a bit of marketing in the first low cost releases to demonstrate potential. 

    AR with an API, either internal to Apple or later for developers, could be integrated into many existing applications, and these  rumors of a LIDAR module would play into that. Photography and Video would both benefit from LIDAR.

    As an aside, Apple should seriously look at acquiring Nikon for its optical and lithographic equipment expertise, hopefully saving the professional camera brand from its current R&D and marketing woes.
  • Reply 3 of 14
mjtomlin Posts: 2,673 member
    Might this also have enhanced camera applications?

    Yes. Depending on how fast a "scene" can be rendered, you may not need a second camera to determine depth of field. Ideally, every pixel within an image could not only contain color information, but also distance. 

    Currently, using stereo optics, depth of field is determined by the amount of "shift" in the image. The more shift, the closer that area of the image is to the camera. If you know the viewing angle of the cameras and the distance between the cameras, you can determine the distance to the object by how much it shifts in the image.
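The stereo relationship described in that comment is the standard pinhole-camera disparity formula: depth = focal length × baseline / pixel shift. A minimal sketch with made-up numbers (not actual iPhone camera parameters):

```python
# Depth from stereo disparity: the bigger the pixel "shift" between the
# two cameras, the closer the object.

def stereo_depth_m(focal_px: float, baseline_m: float, disparity_px: float) -> float:
    """Distance to a point from the pixel shift between two cameras."""
    return focal_px * baseline_m / disparity_px

# A 1 cm baseline, 2800 px focal length, 40 px shift -> object 0.7 m away.
print(stereo_depth_m(2800.0, 0.01, 40.0))  # 0.7
```

Note the inverse relationship: depth precision degrades quickly for distant objects as the disparity shrinks toward a fraction of a pixel, which is one argument for an active laser sensor over a second camera.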
  • Reply 4 of 14
entropys Posts: 4,152 member
    Many years ago someone created a meme called "the Apple product cycle," which was really about the Apple rumours cycle. It went through the early rumour stage fuelled by obscure publications in Asia, the grand product release, the line-ups on launch day, shortages, delays in shipping dates, and snarky commentary from media about how real soon the x killer will be produced by a competitor and thus Apple is doomed, etc. We are clearly in that early stage where every theoretical and wondrous feature gets added to the yet-to-be-revealed product. The point of the story was that not every rumour gets added to the wondrous new product, and the rumour addicts get disappointed. Too much wailing and gnashing of teeth.
  • Reply 5 of 14
    This is cool, but what I really want in my next iPhone is this:

    https://en.m.wikipedia.org/wiki/Mosquito_laser
  • Reply 6 of 14
    tmay said:
    Might this also have enhanced camera applications?
    Apple is making the correct choice by focussing on AR first over VR. My own opinion is that VR is still, and will continue to be, niche in the performance/form factor that is available today, and the target audience is essentially gaming and research for the most part, albeit there is quite a bit of marketing in the first low cost releases to demonstrate potential. 

    AR with an API, either internal to Apple or later for developers, could be integrated into many existing applications, and these  rumors of a LIDAR module would play into that. Photography and Video would both benefit from LIDAR.

    As an aside, Apple should seriously look at acquiring Nikon for its optical and lithographic equipment expertise, hopefully saving the Professional camera brand from it's current r&D and marketing woes.
    Not sure if Apple should acquire Nikon, since Sony makes all the sensors for Nikon cameras. On the flip side, an acquisition by Apple could benefit Nikon, since they are a small company and don't have the resources that Sony and Canon do for camera R&D. Nikon does make the best lenses, though, so that expertise would benefit Apple.
    edited February 2017
  • Reply 7 of 14
    With AR becoming so advanced, I wonder if the iPhone 9 will be able to detect cancer in your eyeball or identify a rash on your skin?
  • Reply 8 of 14
    sog35 said:
    Why are they calling this the iPhone8? 

    No way they call it that and release an iPhone 7s.

    Its going to be called the iPhoneX (not ten but 'exe')

    next year iPhone X1
    Then iPhone X2
    I agree it makes little sense calling it an iPhone 8, as if it were the successor to the iP7s, which may launch in tandem.

    iPhone X makes sense, though I see the model as a special edition without a follow-up, thus I wouldn't be surprised if it were called the new iPhone SE. Apple may just make SEs from time to time, with their bread-and-butter iPhone 7s carrying the lineage and nomenclature forward, not their special edition.
  • Reply 9 of 14
    This is cool, but what I really want in my next iPhone is this:

    https://en.m.wikipedia.org/wiki/Mosquito_laser
    A "mosquito laser" would be cool... Mosquito drones with frickin' lasers on their heads would be even cooler.
  • Reply 10 of 14
    ...and nary a peep about the potential privacy implications of this...? Will anyone be able to 'opt out' of being scanned in public? Is this, like cloning, and perhaps AI, defining a new category of 'human' rights...?
  • Reply 11 of 14
tmay Posts: 6,309 member
    This is cool, but what I really want in my next iPhone is this:

    https://en.m.wikipedia.org/wiki/Mosquito_laser
    A "mosquito laser" would be cool... Mosquito drones with frickin' lasers on their heads would be even cooler.
    An AntMan suit, that's what I want, with claws and lasers, to kill my own mosquitos.
  • Reply 12 of 14
kevin kee Posts: 1,289 member
    This is cool, but what I really want in my next iPhone is this:

    https://en.m.wikipedia.org/wiki/Mosquito_laser
    A "mosquito laser" would be cool... Mosquito drones with frickin' lasers on their heads would be even cooler.
    I thought you were saying a laser to kill all of those pesky android phones. Never mind - I thought it was such a super cool idea too.
  • Reply 13 of 14
freediverx Posts: 1,423 member
    mjtomlin said:
    Might this also have enhanced camera applications?

    Yes. Depending on how fast a "scene" can be rendered, you may not need a second camera to determine depth of field. Ideally, every pixel within an image could not only contain color information, but also distance. 

    Currently, using stereo optics, depth of field is determined by the amount of "shift" in the image. The more shift, the closer that area of the image is to the camera. If you know the viewing angle of the cameras and the distance between the cameras, you can determine the distance to the object by how much it shifts in the image.

    I really hope we'll see substantially improved camera performance on the next iPhone. I recently spent some time playing with a friend's Galaxy S8 Plus and, frankly, its camera put my iPhone 6S to shame. Autofocus was dramatically faster, image quality was noticeably superior, particularly in low light, and the camera featured excellent optical image stabilization... all without the need for a huge and unsightly camera bump.

    Meanwhile, its AMOLED display was much brighter and more vibrant than the iPhone's, while managing to display normal-looking skin tones without messing with the color settings.

    edited June 2017