Apple's 'iPhone 8' to feature rear-facing 3D laser for AR and faster autofocus, report says

Posted in iPhone · edited July 2017
Apple is reportedly working to implement a specialized rear-facing laser system in its upcoming "iPhone 8" that will facilitate augmented reality applications, like those produced with ARKit, as well as faster and more accurate autofocus capabilities.

'iPhone 8' concept rendering by Marek Weidlich.


Citing a source familiar with Apple's plans, Fast Company reports the company is developing a VCSEL (vertical-cavity surface-emitting laser) system for integration in a new iPhone model set for debut this fall. The rumored "iPhone 8," which is expected to become Apple's next flagship smartphone, is a likely candidate for deployment, the source said.

Building on past rumors regarding a front-facing 3D-sensing camera, today's report claims Apple will apply VCSEL technology to the rear-facing shooter. The system, which calculates distance to a target using light pulses and time of flight (TOF) measurements, would allow for extremely accurate depth mapping, a plus for AR applications.

Currently, Apple's ARKit relies on complex algorithms derived from optical information provided by iPhone's iSight camera.
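
For context, the code below sketches that purely optical path: a minimal ARKit world-tracking session against the iOS 11 APIs (an illustrative sketch, not Apple sample code) that detects horizontal planes from nothing more than camera imagery and motion data.

    import UIKit
    import ARKit

    // Minimal ARKit session: world tracking plus horizontal plane
    // detection, computed entirely from the camera feed and motion sensors.
    class ARViewController: UIViewController {
        let sceneView = ARSCNView()

        override func viewDidLoad() {
            super.viewDidLoad()
            sceneView.frame = view.bounds
            view.addSubview(sceneView)
        }

        override func viewWillAppear(_ animated: Bool) {
            super.viewWillAppear(animated)
            let configuration = ARWorldTrackingConfiguration()
            configuration.planeDetection = .horizontal
            sceneView.session.run(configuration)
        }

        override func viewWillDisappear(_ animated: Bool) {
            super.viewWillDisappear(animated)
            sceneView.session.pause()
        }
    }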

Analyst Ming-Chi Kuo revealed Apple's intent to embed a 3D scanning subsystem into iPhone's front-facing FaceTime camera array in February. That system also integrates an infrared VCSEL transmitter and specialized receiver alongside the traditional color RGB camera module. Judging by today's report, Apple wants to do the same for iSight.

Fast Company says the rear-facing laser will also enable faster, more accurate autofocus. Similar systems have been employed in digital SLRs and compact cameras for years, but have only recently made their way to small form factor devices like smartphones.

Apple has in the past relied on focus technology that comes with camera modules provided by third-party suppliers like Sony. Most recently, the company added phase-detection autofocus, dubbed "Focus Pixels" in Apple-speak, to the iPhone with the iPhone 6 series in 2014.

Phase detection systems achieve focus by comparing two or more sets of incoming incident light rays. Laser systems, on the other hand, directly measure scene depth by timing how long a light pulse takes to travel to and from a target object.
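
The time-of-flight arithmetic itself is simple; the sketch below (illustrative, not Apple code) converts a measured round-trip time into distance.

    import Foundation

    let speedOfLight = 299_792_458.0  // meters per second

    /// Distance implied by a laser pulse's round-trip time.
    /// The pulse travels out and back, so halve the total path.
    func distance(fromRoundTrip t: TimeInterval) -> Double {
        return speedOfLight * t / 2.0
    }

    // A return after about 6.67 nanoseconds puts the target ~1 meter away.
    print(distance(fromRoundTrip: 6.67e-9))  // ≈ 1.0

The engineering challenge is the sensor rather than the math: a target one meter away returns the pulse in under seven nanoseconds, which is why a dedicated receiver is needed alongside the VCSEL transmitter.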

As for suppliers, Apple has tapped Lumentum to provide most of the VCSEL lasers, with the remainder to be produced by Finisar and II-VI, the source said. The time of flight sensor is expected to come from STMicro, Infineon or AMS. As it does with other major operating components, Apple could purchase the part in module form from LG Innotek, STMicro, AMS or Foxconn, the report said.

While it would be a boon for ARKit, Apple's first official push into the AR space, the supposed 3D sensor might not be ready in time for a 2017 launch. Engineers are currently working to integrate the component, but inclusion is not guaranteed for "iPhone 8," the source implied.

Comments

  • Reply 1 of 25
    Rayz2016 · Posts: 6,957 · member
    So what you're saying is that Apple is working on a totally new 3D laser system, for a phone due in about ten weeks?
  • Reply 2 of 25
    I would say that this new phone is already done and finalized. I don't believe a word of rumors that say engineers are struggling as we speak to add the latest and greatest to the phone. There are probably a few million lying on shelves already.
  • Reply 3 of 25
    A) It's not a totally new 3D laser system; there are similar modules already in circulation. More likely, it's a version of the 3D sensor rumored for inclusion back in February.

    B) We have no idea when the iPhone 8 will launch; engineers could be working on new EVTs (engineering validation tests) ahead of a launch window in October (or November... or December).
    edited July 2017
  • Reply 4 of 25
    kevin kee · Posts: 1,289 · member
    Probably stale news. Replace 2017 with 2016 and this is more plausible. Apple is working on the iPhone 9 now.
  • Reply 5 of 25
    gmac · Posts: 79 · member
    This one I gotta see!


  • Reply 6 of 25
    LordeHawk · Posts: 168 · member
    Three primary options here.

    First, this is only a rumor, or something for next year's cyclical upgrade.

    Second, this is already on the new flagship phone, we're just hearing about it now.

    Finally, it's possible this is being implemented for a late arrival flagship phone.
    I work in a prototyping industry, and while it seems impossible for a September launch, a later launch is possible.  We design and fabricate around a build-to-order (BTO) model, like a computer manufacturer: we lock in design elements that can no longer change (i.e. battery placement and size, screen, etc.) and then finalize last-minute details or prefabricated internals from suppliers.  Apple does this with custom computer orders, and it wouldn't be difficult if you have everything lined up with additional vendors and fallback plans.

    If Apple knew 4 months ago this was a realistic option by launch, it would have designed the chassis and connectors to include cutouts and alerted vendors to new PCB layouts.  Consider also that the internal PCBs are being designed by Apple and its suppliers in this same manner, creating a dynamic and flexible supply chain.  If a new flagship arrives, remember it will be in severely constrained quantities anyway.  Three suppliers are more than enough to handle components for a limited new product that Apple has been planning for a while now.
    I would bet money that final assembly will be the real bottleneck.  New products have many unique parts and processes that are spread across multiple assembly lines.  One cannot simply take workers from iPhone 7 production lines and expect them to figure it out.  New equipment, processes, and training must be iterated on until you reach production requirements.  During new product launches, many problems will be discovered and rectified in real time, some of which may alter core processes.
    Tim Cook's true genius is his background developing this infinitely complex ballet under Steve Jobs' leadership.  Apple's modern-day magic is as much about producing an array of products to such tight production tolerances as it is about the actual products themselves.

    In my honest opinion, no other company has performed this well for so long on such a titanic endeavor.  I'm humbled by each tech company, no matter how big or small, these products are truly a labor of love.  We are living in amazing times!
  • Reply 7 of 25
    JinTech · Posts: 1,020 · member
    I would say that this new phone is already done and finalized. I don't believe a word of rumors that say engineers are struggling as we speak to add the latest and greatest to the phone. There are probably a few million lying on shelves already.
    I doubt they're on shelves. These will ship with iOS 11 no doubt. 
  • Reply 8 of 25
    radarthekat · Posts: 3,842 · moderator
    The story is less dramatic than it seems, I'm confident.  ARKit is designed to work with iPhones going back to the 6S, and so the main focus will be to ensure the majority of ARKit capabilities function using the existing means of determining distance, that being the current Focus Pixels tech/algorithms, which the iPhone 8 will surely also support in its hardware.  Any scrambling among Apple's engineering teams would be to add support in ARKit for the new laser pulse TOF hardware, allowing iPhone 8 AR apps to be even more responsive than when run on other iPhones.  Apple will have the luxury of time to accomplish this, just like they did with Portrait mode last year.  The hardware will be there, but the full support may simply roll out in an OS update a bit later.  
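    To make the fallback concrete, here is a hypothetical sketch.  DepthSource and everything in it are invented names, not ARKit API; only the pattern matters:

        // Hypothetical illustration only: these names are not part of
        // ARKit's public API. The idea is to prefer direct TOF measurement
        // when the hardware exists and degrade gracefully otherwise.
        enum DepthSource {
            case laserTimeOfFlight  // rumored rear VCSEL TOF module
            case focusPixels        // phase-detection estimate from the camera
        }

        func preferredDepthSource(hasTOFHardware: Bool) -> DepthSource {
            return hasTOFHardware ? .laserTimeOfFlight : .focusPixels
        }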
    edited July 2017
  • Reply 9 of 25
    cali · Posts: 3,494 · member
    Why does the comment section pretend to know what the next iPhone looks like and what features it has?

    It's not out yet.

    This is what's annoying about rumors: everyone knows the future, but when the future actually comes, everyone forgets and no one is called out.
  • Reply 10 of 25
    radarthekat said:
    The story is less dramatic than it seems, I'm confident.  ARKit is designed to work with iPhones going back to the 6S, and so the main focus will be to ensure the majority of ARKit capabilities function using the existing means of determining distance, that being the current Focus Pixels tech/algorithms, which the iPhone 8 will surely also support in its hardware.  Any scrambling among Apple's engineering teams would be to add support in ARKit for the new laser pulse TOF hardware, allowing iPhone 8 AR apps to be even more responsive than when run on other iPhones.  Apple will have the luxury of time to accomplish this, just like they did with Portrait mode last year.  The hardware will be there, but the full support may simply roll out in an OS update a bit later.  
    I agree.

    When Apple introduced the compass with the iPhone 3GS (and the gyroscope with the iPhone 4), it wove the new sensors into the existing iOS APIs for detecting device orientation. On older models, orientation relied on the accelerometers alone, but with the additional data from the compass and gyroscope it became far more accurate. Apple's core APIs effectively evened out the platform for developers.

    I have no doubt Apple is taking a similar strategy with ARKit. You can calculate scene depth using a single optical camera by focus evaluation, but you can also use stereoscopic scene analysis on dual-lens devices, and soon, VCSEL lasers, to improve depth sensing. It would work across a wide spectrum of iOS devices, but they might have a few end-user AR tricks held in reserve for the iPhone 8. The great thing about Apple's software features is that the Chinese supply chain has no access to them, so Apple can still surprise us at the keynote!
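    You can see that evening-out in Core Motion today. The snippet below uses the real device-motion API, which returns a fused attitude regardless of which sensors the particular model carries:

        import CoreMotion

        // Core Motion fuses whatever sensors the device has (accelerometer,
        // plus gyroscope and magnetometer where present) behind one API.
        let motionManager = CMMotionManager()

        if motionManager.isDeviceMotionAvailable {
            motionManager.deviceMotionUpdateInterval = 1.0 / 60.0
            motionManager.startDeviceMotionUpdates(to: .main) { motion, _ in
                guard let attitude = motion?.attitude else { return }
                print("roll \(attitude.roll) pitch \(attitude.pitch) yaw \(attitude.yaw)")
            }
        }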
  • Reply 11 of 25
    fastasleep · Posts: 6,408 · member
    JinTech said:
    I would say that this new phone is already done and finalized. I don't believe a word of rumors that say engineers are struggling as we speak to add the latest and greatest to the phone. There are probably a few million lying on shelves already.
    I doubt they're on shelves. These will ship with iOS 11 no doubt. 
    You know they don't, like, seal iOS inside the enclosure when they're assembled.
  • Reply 12 of 25
    dick applebaum · Posts: 12,527 · member
    radarthekat said:
    The story is less dramatic than it seems, I'm confident.  ARKit is designed to work with iPhones going back to the 6S, and so the main focus will be to ensure the majority of ARKit capabilities function using the existing means of determining distance, that being the current Focus Pixels tech/algorithms, which the iPhone 8 will surely also support in its hardware.  Any scrambling among Apple's engineering teams would be to add support in ARKit for the new laser pulse TOF hardware, allowing iPhone 8 AR apps to be even more responsive than when run on other iPhones.  Apple will have the luxury of time to accomplish this, just like they did with Portrait mode last year.  The hardware will be there, but the full support may simply roll out in an OS update a bit later.  
    I agree.

    When Apple introduced the compass with the iPhone 3GS (and the gyroscope with the iPhone 4), it wove the new sensors into the existing iOS APIs for detecting device orientation. On older models, orientation relied on the accelerometers alone, but with the additional data from the compass and gyroscope it became far more accurate. Apple's core APIs effectively evened out the platform for developers.

    I have no doubt Apple is taking a similar strategy with ARKit. You can calculate scene depth using a single optical camera by focus evaluation, but you can also use stereoscopic scene analysis on dual-lens devices, and soon, VCSEL lasers, to improve depth sensing. It would work across a wide spectrum of iOS devices, but they might have a few end-user AR tricks held in reserve for the iPhone 8. The great thing about Apple's software features is that the Chinese supply chain has no access to them, so Apple can still surprise us at the keynote!
    That’s exactly how Apple rolls!  

    ARKit works well on prior iPad Pros... Works better on newer iPad Pros by exploiting advanced hardware sensors... and will work even better on new iDevices to come.

    It’s Apple’s Secret Sauce:
    • implement in software
    • migrate to hardware
    • exploit in new software
    • support both
    • rinse and repeat



  • Reply 13 of 25
    palegolas · Posts: 1,361 · member
    LordeHawk said:
    Three primary options here.
    ....
    In my honest opinion, no other company has performed this well for so long on such a titanic endeavor.  I'm humbled by each tech company, no matter how big or small, these products are truly a labor of love.  We are living in amazing times!
    Amazing times indeed!
    Fourth option: it's for the AR shades, or the Apple car, and the rumours connected it to the wrong product.
  • Reply 14 of 25
    dick applebaum · Posts: 12,527 · member
    Analyst Ming-Chi Kuo revealed Apple's intent to embed a 3D scanning subsystem into iPhone's front-facing FaceTime camera array in February. That system also integrates an infrared VCSEL transmitter and specialized receiver alongside the traditional color RGB camera module. Judging by today's report, Apple wants to do the same for iSight.

    Mmm...

    If we have 3D cameras/sensors in both the front-facing and rear-facing cameras, developers could go crazy with ARKit apps by running both cameras at the same time...

    ...the ultimate Selfie – the user’s virtual image appears among friends’ real (and/or virtual) images placed in a virtual scene – say on a roller coaster...

    You, otta’ be in pictures... ♪♫♬


    edited July 2017
  • Reply 15 of 25
    foggyhill · Posts: 4,767 · member
    Analyst Ming-Chi Kuo revealed Apple's intent to embed a 3D scanning subsystem into iPhone's front-facing FaceTime camera array in February. That system also integrates an infrared VCSEL transmitter and specialized receiver alongside the traditional color RGB camera module. Judging by today's report, Apple wants to do the same for iSight.

    Mmm...

    If we have 3D cameras/sensors in both the front-facing and rear-facing cameras, developers could go crazy with ARKit apps by running both cameras at the same time...

    ...the ultimate Selfie – the user’s virtual image appears among friends’ real (and/or virtual) images placed in a virtual scene – say on a roller coaster...

    You, otta’ be in pictures... ♪♫♬


    Make it so... (says Picard).

    Yeah, AR is about to hit the big time and people are still saying "apple is falling behind" (sic). They make me laugh so much.

    And they'll have no answer, because 95% of phones won't have the capacity and software to run such things even if Android/Google decided to up its game.

    Apple only needs to add glasses to have a whole AR ecosystem (AirPods/Watch/iPhone), and even the HomePod (to create a soundscape) could work together with it.
  • Reply 16 of 25
    slprescott · Posts: 765 · member
    foggyhill said:
    Apple only needs to add glasses to have a whole AR ecosystem (AirPods/Watch/iPhone), and even the HomePod (to create a soundscape) could work together with it.

    Yes, this article makes me wonder about the rumored AR glasses that Apple is developing. Would they have a similar laser embedded for depth detection?
  • Reply 17 of 25
    brucemc · Posts: 1,541 · member
    foggyhill said:
    Analyst Ming-Chi Kuo revealed Apple's intent to embed a 3D scanning subsystem into iPhone's front-facing FaceTime camera array in February. That system also integrates an infrared VCSEL transmitter and specialized receiver alongside the traditional color RGB camera module. Judging by today's report, Apple wants to do the same for iSight.

    Mmm...
    If we have 3D cameras/sensors in both the front-facing and rear-facing cameras, developers could go crazy with ARKit apps by running both cameras at the same time...
    ...the ultimate Selfie – the user’s virtual image appears among friends’ real (and/or virtual) images placed in a virtual scene – say on a roller coaster...
    You, otta’ be in pictures... ♪♫♬
    Make it so... (says Picard).
    Yeah, AR is about to hit the big time and people are still saying "apple is falling behind" (sic). They make me laugh so much.
    And they'll have no answer, because 95% of phones won't have the capacity and software to run such things even if Android/Google decided to up its game.
    Apple only needs to add glasses to have a whole AR ecosystem (AirPods/Watch/iPhone), and even the HomePod (to create a soundscape) could work together with it.
    Yes, it would seem that Apple has more of the "pieces" than anyone else to pull together an AR (and VR) solution.  It seems to slip under the radar of most of the media, though.  There are plenty of articles about the ARKit demos being posted, and everyone thinks they are great, but no one seems to say "and now we see that Apple actually has a lead here...".  Just that the demos are awesome, and show what can be done with AR - often the only mention of Apple is "done with Apple's ARKit".
  • Reply 18 of 25
    I would say that this new phone is already done and finalized. I don't believe a word of rumors that say engineers are struggling as we speak to add the latest and greatest to the phone. There are probably a few million lying on shelves already.
    Articles like that are for the gullible, who might actually believe such tabloid garbage. The iPhone 8, whatever it consists of, was finalized and in the can months ago. At this stage they are just working on software bugs, and even those would be minor.


  • Reply 19 of 25
    radarthekat · Posts: 3,842 · moderator
    foggyhill said:
    Apple only needs to add glasses to have a whole AR ecosystem (AirPods/Watch/iPhone), and even the HomePod (to create a soundscape) could work together with it.

    Yes, this article makes me wonder about the rumored AR glasses that Apple is developing. Would they have a similar laser embedded for depth detection?
    Maybe, but glasses could use stereoscopic vision to provide adequate depth perception and distance to objects, by having cameras separated by the width of the frames.
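    The geometry is simple enough to sketch. Assuming a pinhole model (focal length in pixels, baseline in meters; the numbers below are made up for illustration):

        /// Depth from stereo disparity: Z = f * B / d, with f the focal
        /// length in pixels, B the camera baseline in meters, and d the
        /// disparity in pixels between the two views.
        func stereoDepth(focalLengthPixels f: Double,
                         baselineMeters b: Double,
                         disparityPixels d: Double) -> Double? {
            guard d > 0 else { return nil }  // zero disparity: effectively at infinity
            return f * b / d
        }

        // Cameras ~14 cm apart (roughly the width of a pair of frames),
        // f = 1000 px, 35 px of disparity: the object is about 4 m away.
        let z = stereoDepth(focalLengthPixels: 1000, baselineMeters: 0.14, disparityPixels: 35)
        print(z ?? Double.infinity)  // 4.0

    The catch with stereo is that depth error grows roughly with the square of distance at a fixed baseline, which is one reason a TOF laser could still earn a place even on glasses.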
  • Reply 20 of 25
    dick applebaum · Posts: 12,527 · member
    brucemc said:
    foggyhill said:
    Analyst Ming-Chi Kuo revealed Apple's intent to embed a 3D scanning subsystem into iPhone's front-facing FaceTime camera array in February. That system also integrates an infrared VCSEL transmitter and specialized receiver alongside the traditional color RGB camera module. Judging by today's report, Apple wants to do the same for iSight.

    Mmm...
    If we have 3D cameras/sensors in both the front-facing and rear-facing cameras, developers could go crazy with ARKit apps by running both cameras at the same time...
    ...the ultimate Selfie – the user’s virtual image appears among friends’ real (and/or virtual) images placed in a virtual scene – say on a roller coaster...
    You, otta’ be in pictures... ♪♫♬
    Make it so... (says Picard).
    Yeah, AR is about to hit the big time and people are still saying "apple is falling behind" (sic). They make me laugh so much.
    And they'll have no answer, because 95% of phones won't have the capacity and software to run such things even if Android/Google decided to up its game.
    Apple only needs to add glasses to have a whole AR ecosystem (AirPods/Watch/iPhone), and even the HomePod (to create a soundscape) could work together with it.
    Yes, it would seem that Apple has more of the "pieces" than anyone else to pull together an AR (and VR) solution.  It seems to slip under the radar of most of the media, though.  There are plenty of articles about the ARKit demos being posted, and everyone thinks they are great, but no one seems to say "and now we see that Apple actually has a lead here...".  Just that the demos are awesome, and show what can be done with AR - often the only mention of Apple is "done with Apple's ARKit".


    Just that the demos are awesome, and show what can be done with AR - often the only mention of Apple is "done with Apple's ARKit". It should say "can only be done with Apple hardware and Apple software."
