What you need to know about Apple's LiDAR Scanner in the iPad Pro

Posted in iPad, edited March 2020
Apple on Wednesday unveiled two new iPad Pro models that come equipped with a LiDAR Scanner, which will offer major improvements to ARKit and photography.

Apple's new LiDAR Scanner will offer major improvements to augmented reality.


The new 11- and 12.9-inch iPad Pro models are the first of Apple's devices to feature the 3D laser system, but they likely won't be the last. Here's what you need to know about LiDAR, how it improves current iPad Pro models, and what other future Apple devices could feature it.

What is LiDAR?

At the most basic level, LiDAR is a time-of-flight system that shoots low-power lasers at an environment. Using the reflections, it calculates the distance to objects and points in the environment, and creates an accurate 3D depth map or rendering based on the results.
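The arithmetic behind time-of-flight ranging is straightforward: distance is half the round-trip travel time of the pulse multiplied by the speed of light. A minimal sketch, with illustrative timings rather than Apple's actual sensor values:

```python
# Rough sketch of the time-of-flight principle behind LiDAR:
# distance = (speed of light x round-trip time) / 2.
# The pulse timing below is illustrative, not Apple's sensor spec.

SPEED_OF_LIGHT = 299_792_458.0  # meters per second

def distance_from_round_trip(seconds: float) -> float:
    """Convert a measured round-trip pulse time into a one-way distance."""
    return SPEED_OF_LIGHT * seconds / 2.0

# A pulse returning after ~33 nanoseconds corresponds to an object
# roughly 5 meters away -- the upper bound Apple quotes for the scanner.
print(round(distance_from_round_trip(33e-9), 2))  # ~4.95 meters
```

The nanosecond scale of those round trips is why Apple's marketing talks about "nano-second speeds": at 5 meters, the entire round trip takes about 33 billionths of a second.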

Apple's proprietary take on the technology, simply dubbed the LiDAR Scanner, likely has a few more tricks up its sleeve. Apple says it can measure the distance to surrounding objects up to 5 meters away and operates "at the photon level at nano-second speeds."

The iPad Pro LiDAR Scanner is used to create depth mapping points that, when combined with camera and motion sensor data, can create a "more detailed understanding of a scene" according to Apple.

What could a LiDAR Scanner be used for?

Both first- and third-party apps will be able to take advantage of more accurate depth mapping.


Among Apple's existing features, LiDAR will have the biggest impact on augmented reality (AR) and Apple's own ARKit framework. Apple says the new LiDAR Scanner will allow for instant object placement, indicating that users wouldn't need to "scan" their environment before an AR app loads.

Along with improvements to motion capture and people occlusion, the LiDAR Scanner will also make the Measure app much faster and more accurate. Measure is gaining a new Ruler View for more granular measurements, too.

While Apple didn't specifically mention it, LiDAR should improve photography as well. Take Portrait Mode, which the 2018 iPad Pro supported only with the front-facing camera. With an actual 3D depth map of a scene, rather than depth estimated from lens-based calculations, Apple could add rear-facing Portrait Mode to the iPad Pro and improve the feature's accuracy and speed.

Is LiDAR coming to other devices?

Apple's LiDAR Scanner has launched first on the new 11- and 12.9-inch iPad Pro, as was previously rumored. But the system is widely expected to arrive on some 2020 iPhones as well.

The latest information, pulled from code within an iOS 14 leak, suggests that a time-of-flight camera will arrive on both the "iPhone 12 Pro" and the "iPhone 12 Pro Max" this year.

On those devices, a LiDAR Scanner will also bring the same improvements to ARKit apps and photography. But combined with Ultra Wide Band technology, it may also be useful in applications such as indoor navigation and item tracking.

LiDAR for vehicular applications
LiDAR is a new addition to Apple's handheld devices, but the Cupertino company has actually been using the technology for years in other applications. Apple vehicles carrying LiDAR sensors have been spotted in California as far back as 2015. The technology is considered a crucial part of autonomous vehicle development, since it lets cars accurately analyze their surroundings.

Amid rumors of Project Titan and the "Apple Car," the company appears to be steadily investing in LiDAR and related research for vehicular applications, including a slew of patent applications related to the tech.

And in a rare public-facing example of its research, Apple also published a research paper in 2017 detailing LiDAR-based 3D object recognition systems for self-driving cars. Essentially, the system leverages the depth mapping of LiDAR and combines it with neural networks to vastly improve the ability of a self-driving car to "see" its environment and potential hazards.
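One common first step in such systems, including the approach described in Apple's paper, is grouping the raw point cloud into a coarse 3D grid before a network consumes it. A toy sketch of that voxelization step, with invented points and an assumed 0.5-meter grid size:

```python
# Toy sketch of the voxelization step common to LiDAR-based object
# detection pipelines: raw (x, y, z) points are binned into a coarse
# 3D grid of "voxels" before being fed to a neural network.
# The point cloud and grid size below are invented for illustration.

from collections import defaultdict

def voxelize(points, voxel_size=0.5):
    """Group 3D points into voxels keyed by their grid-cell indices."""
    grid = defaultdict(list)
    for x, y, z in points:
        key = (int(x // voxel_size), int(y // voxel_size), int(z // voxel_size))
        grid[key].append((x, y, z))
    return grid

# Four points; the first two land in the same 0.5 m cell.
cloud = [(0.1, 0.2, 0.0), (0.3, 0.4, 0.1), (1.2, 0.2, 0.0), (2.6, 1.1, 0.4)]
voxels = voxelize(cloud)
print(len(voxels))  # 3 occupied voxels
```

A real pipeline would then extract per-voxel features and run them through convolutional layers, but the binning above captures the basic idea of turning an unordered point cloud into a structure a network can process.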

Comments

  • Reply 1 of 14
    When it comes to the iPhone, it could be really useful to the vision impaired, will be interesting to see it in action.
  • Reply 2 of 14
knowitall Posts: 1,648 member
    I don’t seem to remember a remark Musk made about lidar ...
  • Reply 3 of 14
rob53 Posts: 3,251 member
    So nothing about 3D capture for modeling or printing. Hopefully this will come later. 
  • Reply 4 of 14
Mike Wuerthele Posts: 6,861 administrator
    rob53 said:
    So nothing about 3D capture for modeling or printing. Hopefully this will come later. 
    There's no reason why it can't come. The camera has the capability, and developers have full access to the API.
  • Reply 5 of 14
    knowitall said:
    I don’t seem to remember a remark Musk made about lidar ...
    Sure he did.  

    I’m pretty sure he didn’t have in mind anything but the spinning LiDAR sensors you find in cars, though. 

    His comment about Lidar was analogous to Jobs’ comments about 2007 smartphones requiring a stylus.  (If you see a stylus, it’s already a fail...something along those lines)
  • Reply 6 of 14
    Seems almost small enough to fit in a pair of glasses..
  • Reply 7 of 14
fastasleep Posts: 6,417 member
    rob53 said:
    So nothing about 3D capture for modeling or printing. Hopefully this will come later. 
    You've already been able to do this for years with the Face ID cameras on iPhone and iPad Pro:

    https://apps.apple.com/us/app/capture-3d-scan-anything/id1444183458

    There are several others as well, just go search for 3D scan in the App Store. These probably will need to be updated for the new rear cameras on this iPad and the upcoming iPhone, but that's probably trivial. This is going to be SO much easier as using the Face ID camera means facing the display towards whatever you're scanning, which is awkward to say the least. 
  • Reply 8 of 14
knowitall Posts: 1,648 member
    knowitall said:
    I don’t seem to remember a remark Musk made about lidar ...
    Sure he did.  

    I’m pretty sure he didn’t have in mind anything but the spinning LiDAR sensors you find in cars, though. 

    His comment about Lidar was analogous to Jobs’ comments about 2007 smartphones requiring a stylus.  (If you see a stylus, it’s already a fail...something along those lines)
    Aha, now I remember.
  • Reply 9 of 14
    Apple says it can measure the distance to surrounding objects up to 5 meters away and operates "at the photon level at nano-second speeds."
    Gods, I love marketing.  You mean like literally every other scanning system that uses electromagnetic radiation to scan, including your eyes? :lol:  

    And please don't take this as disparaging the system itself, this is probably the year I actually get my own iPad instead of my wife's hand me downs, I just had to laugh when I saw that bit.
  • Reply 10 of 14
EsquireCats Posts: 1,268 member
    Apple says it can measure the distance to surrounding objects up to 5 meters away and operates "at the photon level at nano-second speeds."
    Gods, I love marketing.  You mean like literally every other scanning system that uses electromagnetic radiation to scan, including your eyes? :lol:  

    And please don't take this as disparaging the system itself, this is probably the year I actually get my own iPad instead of my wife's hand me downs, I just had to laugh when I saw that bit.
    Hmm I wouldn't go that far, as scanning systems are quite different from one another (and yes of course all use EM) - the feat with the iPad is that it fits into a small device.
    However, I'm not really sure what Apple are saying when they use the words "at the photon level", is this to describe the 3D scanning resolution at the sensor, I would be very curious to learn about the scanning resolution possible. (The demo video showing the CAD app showed a reasonably useful level of scanning density.)
  • Reply 11 of 14
dewme Posts: 5,363 member
    Apple says it can measure the distance to surrounding objects up to 5 meters away and operates "at the photon level at nano-second speeds."
    Gods, I love marketing.  You mean like literally every other scanning system that uses electromagnetic radiation to scan, including your eyes? :lol:  

    And please don't take this as disparaging the system itself, this is probably the year I actually get my own iPad instead of my wife's hand me downs, I just had to laugh when I saw that bit.
    Agreed. Just like the new fangled use of the term "time-of-flight" rather than (echo) ranging. Incidentally the "R" in SONAR, RADAR, and LiDAR is "ranging." Hey, whatever it takes to keep the relationship fresh.
  • Reply 12 of 14
    Apple says it can measure the distance to surrounding objects up to 5 meters away and operates "at the photon level at nano-second speeds."
    Gods, I love marketing.  You mean like literally every other scanning system that uses electromagnetic radiation to scan, including your eyes? :lol:  

    And please don't take this as disparaging the system itself, this is probably the year I actually get my own iPad instead of my wife's hand me downs, I just had to laugh when I saw that bit.
    Hmm I wouldn't go that far, as scanning systems are quite different from one another (and yes of course all use EM) - the feat with the iPad is that it fits into a small device.
    However, I'm not really sure what Apple are saying when they use the words "at the photon level", is this to describe the 3D scanning resolution at the sensor, I would be very curious to learn about the scanning resolution possible. (The demo video showing the CAD app showed a reasonably useful level of scanning density.)
    If they're trying to imply that the "scanning resolution" of the sensor is "at the photon level", my comment stands.  Every single ranging or scanning system that uses EM radiation operates "at the photon level", because that's what EM radiation is, and "at nanosecond speeds", because that's how fast EM radiation travels.  LiDAR and your eyes both detect EM radiation, albeit at different wavelengths (the other difference being the speed of communication with the processor after the sensor detects the photons).

    Making statements like this is pure marketing, and makes me laugh.
  • Reply 13 of 14
fastasleep Posts: 6,417 member
    Apple says it can measure the distance to surrounding objects up to 5 meters away and operates "at the photon level at nano-second speeds."
    Gods, I love marketing.  You mean like literally every other scanning system that uses electromagnetic radiation to scan, including your eyes? :lol:  

    And please don't take this as disparaging the system itself, this is probably the year I actually get my own iPad instead of my wife's hand me downs, I just had to laugh when I saw that bit.
    Hmm I wouldn't go that far, as scanning systems are quite different from one another (and yes of course all use EM) - the feat with the iPad is that it fits into a small device.
    However, I'm not really sure what Apple are saying when they use the words "at the photon level", is this to describe the 3D scanning resolution at the sensor, I would be very curious to learn about the scanning resolution possible. (The demo video showing the CAD app showed a reasonably useful level of scanning density.)
    If they're trying to imply that the "scanning resolution" of the sensor is "at the photon level", my comment stands.  Every single ranging or scanning system that uses EM radiation operates "at the photon level", because that's what EM radiation is, and "at nanosecond speeds", because that's how fast EM radiation travels.  LiDAR and your eyes both detect EM radiation, albeit at different wavelengths (the other difference being the speed of communication with the processor after the sensor detects the photons).

    Making statements like this is pure marketing, and makes me laugh.
    Thanks for explaining to everyone how marketing has always worked.
  • Reply 14 of 14
    Thanks for explaining to everyone how marketing has always worked.
    You're more than welcome.  Glad to have helped, in whatever small way.