Second 2020 iPad teardown shows how LiDAR differs from Face ID

Posted in iPad, edited August 2020
A second teardown of the new 12.9-inch iPad Pro has shown the insides of the latest model to be quite similar to those of the previous incarnation, while a demonstration of the LiDAR addition reveals it isn't going to offer the same level of fine detail as the TrueDepth camera array.

The rear camera module removed from the 2020 iPad Pro (via iFixit)


The second within the space of a few days, iFixit's customary teardown of the 2020 iPad Pro is an unusual affair, as it was released primarily as a video. Recorded under a coronavirus lockdown, the teardown is brief, and follows the same general procedure as the repair outfit's earlier publications, albeit with some differences.

Separating the display from the rest of the iPad relied on guitar picks and a hairdryer, rather than a warming pad, to release the adhesive. As with the 2018 iPad Pro, cables and screwed-on shields have to be removed before the display can be fully freed.

The new camera module separates after a few screws are removed, and houses a 10-megapixel ultra-wide camera, a 12-megapixel wide camera, and the LiDAR scanner, which, as previously revealed, is made up of two lens-capped modules stacked on top of each other. The modules are speculated to consist of a VCSEL transmitter and a receiving sensor, with the former putting out an array of infrared dots that are picked up by the latter.

Using an infrared camera, the teardown found the LiDAR system emits a regular pattern of dots, considerably fewer than the TrueDepth camera uses. As it isn't meant for Face ID-style applications, this iteration appears intended for simpler depth mapping over a wider range, rather than the finer measurements required of a face.

Infrared dot projections for the LiDAR module (left), TrueDepth camera (right) (via iFixit)
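
That coarser grid is also how developers encounter the sensor: rather than exposing a face map, ARKit 3.5 uses the LiDAR data for room-scale scene reconstruction on the new iPad Pro. As a rough illustration rather than anything shown in the teardown, enabling it looks something like this:

```swift
import ARKit

// Minimal sketch (an illustration, not from the teardown): turning on the
// LiDAR-backed scene reconstruction ARKit 3.5 exposes on the 2020 iPad Pro.
func startSceneReconstruction(in session: ARSession) {
    // Only LiDAR-equipped devices report support for scene reconstruction.
    guard ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) else {
        print("Scene reconstruction is unavailable on this device")
        return
    }

    let configuration = ARWorldTrackingConfiguration()
    configuration.sceneReconstruction = .mesh          // coarse room-scale geometry
    configuration.planeDetection = [.horizontal, .vertical]
    session.run(configuration)
}

// The reconstructed geometry then arrives as ARMeshAnchor objects through the
// session delegate's session(_:didAdd:) and session(_:didUpdate:) callbacks.
```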


The front-facing cameras are removed as a single assembly, with hardware similar to that of the previous model. The USB-C port is still modular at the base, rather than a hard-wired component, making it a prime candidate for easy repairs.

The logic board is, as is typical for iPads, glued to the inside with wires running underneath it, and is flanked by the batteries. On the board is the A12Z Bionic chip with 6GB of RAM, up from 4GB in previous models.

The batteries are held in place with stretch-release adhesive, though regular adhesive is still used in some areas, which makes them tough to replace. The two cells have a total capacity of 36.59 watt-hours, the same as in the model it replaces.

Summing up the new model, iFixit deems it to have "pretty abysmal repair procedures" despite the incremental upgrades for users. The use of adhesives and the precarious prying required lead to a repairability score of just 3 out of 10.

Comments

  • Reply 1 of 16
Soli Posts: 10,035 member
    On the board are the A12Z Bionic chip with 6GB of RAM, up from 4GB in the previous models.

    Is that confirmed to be every new iPad Pro? I thought it was only the model with maximum storage.
  • Reply 2 of 16
Phobos7 Posts: 63 member
    True Depth was the better choice.
  • Reply 3 of 16
chasm Posts: 3,291 member
    Phobos7 said:
    True Depth was the better choice.
    Truedepth isn't really needed for things like measuring the depth of a room, so why add (probably considerable) cost to something that doesn't need a finely-detailed 3D map? Tables and walls and furniture just aren't as complex as faces. Take a look at the Apple Maps "look around" to see a demonstration of more complex LiDAR.
  • Reply 4 of 16
apple ][ Posts: 9,233 member
    Soli said:
    On the board are the A12Z Bionic chip with 6GB of RAM, up from 4GB in the previous models.

    Is that confirmed to be every new iPad Pro? I thought it was only the model with maximum storage.
    That was the case with the previous model.

    Supposedly, every model of this new iPad Pro has 6 GB RAM.
  • Reply 5 of 16
Beats Posts: 3,073 member
    chasm said:
    Phobos7 said:
    True Depth was the better choice.
    Truedepth isn't really needed for things like measuring the depth of a room, so why add (probably considerable) cost to something that doesn't need a finely-detailed 3D map? Tables and walls and furniture just aren't as complex as faces. Take a look at the Apple Maps "look around" to see a demonstration of more complex LiDAR.

So he's correct. If TrueDepth is indeed better, then we can be sure it will be added to later models, as it complements AR.
  • Reply 6 of 16
Soli Posts: 10,035 member
    Beats said:
    chasm said:
    Phobos7 said:
    True Depth was the better choice.
    Truedepth isn't really needed for things like measuring the depth of a room, so why add (probably considerable) cost to something that doesn't need a finely-detailed 3D map? Tables and walls and furniture just aren't as complex as faces. Take a look at the Apple Maps "look around" to see a demonstration of more complex LiDAR.
    So he's correct. If Truedepth is indeed better than we can be sure it will be added to later models as it compliments AR.
Maybe not. TrueDepth might be better for detail when looking at a known object that it's expecting to verify against a map, at a close, predetermined range (well within an arm's length), compared to a use of lidar* that is designed for mapping countless objects in a significantly larger area (a room). Kind of like having a macro lens for taking a picture of a mountain range and then arguing that the macro lens is better because it can get higher detail on a specific object.


    * I keep seeing LiDAR spelled this way. Just stick with lidar like we have for radar, laser and the uncommon maser (even though it predates laser and follows its pattern).
edited March 2020
  • Reply 7 of 16
Phobos7 Posts: 63 member
    chasm said:
    Phobos7 said:
    True Depth was the better choice.
    Truedepth isn't really needed for things like measuring the depth of a room, so why add (probably considerable) cost to something that doesn't need a finely-detailed 3D map? Tables and walls and furniture just aren't as complex as faces. Take a look at the Apple Maps "look around" to see a demonstration of more complex LiDAR.
Perhaps more complex, but I'm after accuracy. From what I gathered from the example provided, lidar doesn't seem to offer better recognition, just quicker. And the price point is not my first concern when purchasing an iPad or an iPhone. With that said, this is really the first major test of lidar on anything Apple. At this point everything is experimental anyway; we'll see which is selected for permanence, as it may not be either.
edited March 2020
  • Reply 8 of 16
StrangeDays Posts: 12,874 member
    Phobos7 said:
    True Depth was the better choice.
    What does that even mean? They have two different purposes. The lidar tech has much better range and is designed for spatial environments. TD has a shallow range and higher fidelity for facial details. If you slapped a TD sensor on the back it wouldn’t be able to scan a room nearly as well. 
edited March 2020
  • Reply 9 of 16
Beats Posts: 3,073 member
    Soli said:
    Beats said:
    chasm said:
    Phobos7 said:
    True Depth was the better choice.
    Truedepth isn't really needed for things like measuring the depth of a room, so why add (probably considerable) cost to something that doesn't need a finely-detailed 3D map? Tables and walls and furniture just aren't as complex as faces. Take a look at the Apple Maps "look around" to see a demonstration of more complex LiDAR.
    So he's correct. If Truedepth is indeed better than we can be sure it will be added to later models as it compliments AR.
    Maybe not. TrueDepth might be better for details when looking at a predisposed object that it's expecting to verify with a map and at a close, predetermined range expectation (well within an arm's length), compared to another use of lidar* that is designed for mapping countless objects in significantly larger area (a room). Kind of like having a macro lens for taking a picture of a mountain range and then arguing that that macro lens is better because it can get higher detail of a specific object.


    * I keep seeing LiDAR spelled this way. Just stick with lidar like we have for radar, laser and the uncommon maser (even though it predates laser and follows its pattern).

    Probably because the lower case "i" has been all the rage since the early 2000s. I still cringe when I see knockoffs use it though.

I'm still unsure what's stopping Apple from making their lidar system more accurate or giving TrueDepth a wider range.

    Phobos7 said:
    chasm said:
    Phobos7 said:
    True Depth was the better choice.
    Truedepth isn't really needed for things like measuring the depth of a room, so why add (probably considerable) cost to something that doesn't need a finely-detailed 3D map? Tables and walls and furniture just aren't as complex as faces. Take a look at the Apple Maps "look around" to see a demonstration of more complex LiDAR.
    Perhaps more complex but I’m after accuracy. From what I gathered from the example provided Lidar doesn’t seem to offer better recognition but rather quicker. And the price point is not my first concern when purchasing an iPad or an iPhone. With that said, this is really the first major test of Lidor on anything Apple. At this point, everything is experimental anyway, we’ll see which is selected for permanence as it may not be either.

    I'm thinking in the future there may be Apple cameras that combine the accuracy of Truedepth and the range of Lidar.
  • Reply 10 of 16
Soli Posts: 10,035 member
    Beats said:
    Soli said:
    Beats said:
    chasm said:
    Phobos7 said:
    True Depth was the better choice.
    Truedepth isn't really needed for things like measuring the depth of a room, so why add (probably considerable) cost to something that doesn't need a finely-detailed 3D map? Tables and walls and furniture just aren't as complex as faces. Take a look at the Apple Maps "look around" to see a demonstration of more complex LiDAR.
    So he's correct. If Truedepth is indeed better than we can be sure it will be added to later models as it compliments AR.
    Maybe not. TrueDepth might be better for details when looking at a predisposed object that it's expecting to verify with a map and at a close, predetermined range expectation (well within an arm's length), compared to another use of lidar* that is designed for mapping countless objects in significantly larger area (a room). Kind of like having a macro lens for taking a picture of a mountain range and then arguing that that macro lens is better because it can get higher detail of a specific object.


    * I keep seeing LiDAR spelled this way. Just stick with lidar like we have for radar, laser and the uncommon maser (even though it predates laser and follows its pattern).

    Probably because the lower case "i" has been all the rage since the early 2000s. I still cringe when I see knockoffs use it though.
It's not an Apple thing. It's because the 'i' isn't its own word, but part of the word light, just as you could once have found RaDAR written that way, since it uses the 'a' from the word Radio along with d(etection) a(nd) r(anging); both are acronyms.

    I'm still unsure what's stopping Apple from making their lidar system more accurate or Truedepth have wider range.

Why are you assuming anything is "stopping" Apple? Clearly their years of work with lidar are being applied to devices to make them better, faster, and more accurate with every iteration. Is it because they didn't call it TrueDepth and it doesn't have the exact same feature set as the lidar being used with other sensors for a very specific use case on the front of the device? You don't want the system on the back to work for Face ID, and you don't want Face ID to map a room. That should be clear.
  • Reply 11 of 16
GeorgeBMac Posts: 11,421 member
    I have no problem with a repairability score of 3 of 10 for an iPad -- because it is designed for the highest possible level of mobility consistent with a large screen.  

But hopefully Apple uses the iPad's newfound ability to use a cursor as a means of letting it take over some of the high-mobility requirements of MacBooks, and then not only starts making MacBooks more functional with additional ports and the like, but also makes them more upgradeable and repairable.

Let the two lines complement each other rather than compete with each other.
  • Reply 12 of 16
MplsP Posts: 3,921 member
    apple ][ said:
    Soli said:
    On the board are the A12Z Bionic chip with 6GB of RAM, up from 4GB in the previous models.

    Is that confirmed to be every new iPad Pro? I thought it was only the model with maximum storage.
    That was the case with the previous model.

    Supposedly, every model of this new iPad Pro has 6 GB RAM.
The RAM is integrated into the processor, right? Having different amounts of RAM would ultimately mean having different processors for different iPad models. It would also mean different performance. They certainly could do that, but they've never done so in the past.
  • Reply 13 of 16
MplsP Posts: 3,921 member
    I have no problem with a repairability score of 3 of 10 for an iPad -- because it is designed for the highest possible level of mobility consistent with a large screen.  

    Perhaps, but things like batteries and screens that need to be replaced with some frequency should be replaceable without too much difficulty.
  • Reply 14 of 16
Soli Posts: 10,035 member
    MplsP said:
    apple ][ said:
    Soli said:
    On the board are the A12Z Bionic chip with 6GB of RAM, up from 4GB in the previous models.

    Is that confirmed to be every new iPad Pro? I thought it was only the model with maximum storage.
    That was the case with the previous model.

    Supposedly, every model of this new iPad Pro has 6 GB RAM.
    The RAM is integrated into the processor, right? Having different amounts of RAM would ultimately having different processors for different iPad models. it would also mean different performance. They certainly could do that but they've never done so in the past.
    They have had different amounts of RAM for the same generation SoC. I assume, like with a core that is disabled, that they've disabled integrated RAM modules.
  • Reply 15 of 16
GeorgeBMac Posts: 11,421 member
    MplsP said:
    I have no problem with a repairability score of 3 of 10 for an iPad -- because it is designed for the highest possible level of mobility consistent with a large screen.  

    Perhaps, but things like batteries and screens that need to be replaced with some frequency should be replaceable without too much difficulty.

    They are.   Just take it to your nearest Apple Store.  
    Oh wait!   Never mind....    :(
  • Reply 16 of 16
Unfortunately, neither of these teardowns understands that while TrueDepth and LiDAR both use infrared laser spots, they are fundamentally different technologies.

TrueDepth throws a fine grid of laser points (a randomized pattern, for security), then uses parallax to work out where each dot appears to be. This works great at close range, but the accuracy falls dramatically with distance. It's perfect for mapping a face, but even at 1.5 meters (5 feet) the accuracy has really degraded, which makes it work poorly for AR.
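
To see why, here is a rough sketch of the triangulation arithmetic; the baseline, focal length, and noise figures are illustrative guesses, not Apple's numbers:

```swift
import Foundation

// Structured-light / parallax sketch with made-up numbers: depth from disparity
// is z = f * b / d, and for a fixed disparity uncertainty the depth error grows
// roughly with the square of the distance.
func depth(disparityPixels d: Double, focalLengthPixels f: Double, baselineMeters b: Double) -> Double {
    return f * b / d
}

func depthError(atDepth z: Double,
                focalLengthPixels f: Double,
                baselineMeters b: Double,
                disparityNoisePixels sigma: Double) -> Double {
    return (z * z) / (f * b) * sigma
}

// With a ~2.5 cm emitter-to-camera baseline, a ~1,500 px focal length, and half a
// pixel of disparity noise, the error is about a millimeter at 30 cm but roughly
// 3 cm at 1.5 m, which is why the accuracy "falls dramatically with distance."
print(depthError(atDepth: 0.3, focalLengthPixels: 1_500, baselineMeters: 0.025, disparityNoisePixels: 0.5)) // ~0.0012 m
print(depthError(atDepth: 1.5, focalLengthPixels: 1_500, baselineMeters: 0.025, disparityNoisePixels: 0.5)) // ~0.03 m
```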

LiDAR actually chops (modulates) the intensity of each laser beam and measures how long it takes for the light echo to return. Super cool, and very hard to do. This gives great distance accuracy (millimeters), but that difficulty is why you don't have nearly as many spots. Its accuracy does not meaningfully degrade with distance, though eventually the echo off dark surfaces becomes faint enough that it can't measure any more (around 5 meters, or 16 feet, for this sensor).
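
To put rough numbers on that, here's a minimal sketch of the time-of-flight arithmetic; the modulation details are illustrative, since Apple hasn't published the sensor's actual scheme:

```swift
import Foundation

// Time-of-flight sketch: distance comes from how long the light echo takes to return.
let c = 299_792_458.0   // speed of light, meters per second

/// Direct time of flight: half the round-trip time multiplied by the speed of light.
func distance(roundTripSeconds t: Double) -> Double {
    return c * t / 2.0
}

/// Indirect (phase-based) time of flight: the phase shift of an intensity-modulated
/// beam maps to distance, unambiguous only out to c / (2 * f) meters.
func distance(phaseShiftRadians phi: Double, modulationFrequencyHz f: Double) -> Double {
    return c * phi / (4.0 * Double.pi * f)
}

// A target 5 meters away returns its echo after roughly 33 nanoseconds.
print(distance(roundTripSeconds: 33.4e-9))   // ~5.0 m
```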

    So they are different sensors for different uses. One is appropriate for FaceID, while the other is appropriate for AR. 