Second 2020 iPad teardown shows how LiDAR differs from Face ID
A second teardown of the new 12.9-inch iPad Pro has shown the insides of the latest model are quite similar to those of the previous incarnation, while a demonstration of the LiDAR addition reveals it isn't going to offer the same level of sensitivity as the TrueDepth camera array.
The rear camera module removed from the 2020 iPad Pro (via iFixit)
The second within the space of a few days, iFixit's latest customary teardown of the 2020 iPad Pro is an unusual affair, as it was released primarily as a video. Recorded under a coronavirus lockdown, the teardown is brief, and largely follows the same procedure as the repair outfit's earlier publications, with some differences.
Separating the display from the rest of the iPad relied on guitar picks and a hairdryer, rather than a warming pad, to release the adhesive. As with the 2018 iPad Pro, the rear cover has to be twisted free, with cables and screwed-on shields removed before it fully comes away.
The new camera module separates after removing a few screws, and contains a 10-megapixel ultra-wide camera, a 12-megapixel wide camera, and the LiDAR scanner, which as previously revealed is made up of two lens-capped modules stacked on top of each other. The modules are speculated to consist of a VCSEL transmitter and a receiving sensor, with the former putting out an array of infrared dots that are picked up by the latter.
Using an infrared camera, the teardown found the LiDAR system emits a regular pattern of dots, considerably fewer than the TrueDepth camera uses. As it isn't meant for Face ID-style applications, this iteration appears intended for simpler depth mapping over a wider range, rather than the finer measurements needed to map a face.
Infrared dot projections for the LiDAR module (left), TrueDepth camera (right) (via iFixit)
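For developers, the practical payoff of that wider-range depth mapping is scene reconstruction in ARKit 3.5, which shipped alongside the new iPad Pro. As a rough illustration separate from the teardown itself, a minimal sketch of opting into the LiDAR-backed mesh might look like the following; the `startSceneReconstruction` helper and the `arView` parameter are hypothetical names for this example.

```swift
import ARKit
import RealityKit

// A minimal sketch: enabling LiDAR-backed scene reconstruction in
// ARKit 3.5. Assumes an existing ARView (named `arView` here) inside
// a real app.
func startSceneReconstruction(on arView: ARView) {
    let configuration = ARWorldTrackingConfiguration()

    // The mesh option only reports as supported on LiDAR-equipped
    // hardware, such as the 2020 iPad Pro.
    if ARWorldTrackingConfiguration.supportsSceneReconstruction(.mesh) {
        configuration.sceneReconstruction = .mesh
    }

    arView.session.run(configuration)
}
```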
The front-facing cameras are removed as a single assembly, with hardware similar to the previous module. The USB-C port is still modular at the base, rather than a hard-wired component, making it a prime candidate for easy repairs.
The logic board is, as is typical for iPads, glued to the inside with wires running underneath it, and is flanked by the batteries. On the board is the A12Z Bionic chip, paired with 6GB of RAM, up from 4GB in previous models.
Batteries are held in place with stretch-release adhesive, though regular adhesive is still used in some areas, which still makes them tough to replace. The two cells have a total capacity of 36.59 watt-hours, the same as the model it replaces.
In iFixit's summing up of the new model, it is deemed to have "pretty abysmal repair procedures" despite the incremental upgrades for users. The use of adhesives and precarious prying leads to a "repairability" score of just 3 out of 10.
Comments
Is that confirmed to be every new iPad Pro? I thought it was only the model with maximum storage.
Supposedly, every model of this new iPad Pro has 6 GB RAM.
So he's correct. If TrueDepth is indeed better, then we can be sure it will be added to later models, as it complements AR.
I keep seeing LiDAR spelled this way. Just stick with lidar like we have for radar, laser and the uncommon maser (even though it predates laser and follows its pattern).
Probably because the lower case "i" has been all the rage since the early 2000s. I still cringe when I see knockoffs use it though.
I'm still unsure what's stopping Apple from making their lidar system more accurate, or giving TrueDepth a wider range.
I'm thinking in the future there may be Apple cameras that combine the accuracy of TrueDepth and the range of lidar.
Why are you assuming Apple is "stopping"? Clearly their years of work with lidar are being applied to devices to make them better, faster, and more accurate with every iteration. Because they didn't call it TrueDepth, and it doesn't have the exact same feature set as the lidar being used with other sensors for a very specific use case on the front of the device? You don't want the system on the back to work for Face ID, and you don't want Face ID to map a room. That should be clear.
TrueDepth throws a fine grid of laser points (a randomized pattern, for security), then uses parallax to work out where the dots appear to be. This works great at close range, but the accuracy falls dramatically with distance. So it's perfect for mapping a face, but even at 1.5 meters (5 feet) the accuracy has really degraded. This makes it work poorly for AR.
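To put hypothetical numbers on that parallax falloff: with the standard triangulation relation Z = f·b/d, a fixed error in locating each dot produces a depth error that grows with the square of distance. A back-of-envelope sketch, with every figure assumed for illustration rather than taken from Apple's actual hardware:

```swift
import Foundation

// Structured-light depth comes from triangulation: Z = f * b / disparity.
// For a small dot-localization error e, the depth error is roughly
// Z^2 * e / (f * b), so it grows with the *square* of distance.
let focalLengthPx = 1500.0  // assumed focal length, in pixels
let baselineM = 0.025       // assumed 25mm emitter-to-camera baseline
let dotErrorPx = 0.1        // assumed dot-localization error, in pixels

for z in [0.3, 0.5, 1.0, 1.5] {
    let errorMM = (z * z * dotErrorPx) / (focalLengthPx * baselineM) * 1000
    print(String(format: "at %.1f m, depth error ≈ %.2f mm", z, errorMM))
}
```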
LiDAR actually chops (modulates) the intensity of each laser beam and measures how long it takes for the light echo to return. Super cool, and very hard to do. This gives great distance accuracy (millimeters), but the difficulty means you don't get nearly as many spots. Its accuracy does not meaningfully degrade with distance, though eventually the echo off dark surfaces becomes faint enough that it can't measure any more (5 meters, or about 16 feet, for this sensor).
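For comparison, a similarly illustrative sketch of the time-of-flight arithmetic, where distance falls straight out of the echo delay as d = c·t/2 and the error budget is set by timing precision rather than range:

```swift
import Foundation

// Time-of-flight ranging: distance = c * roundTripTime / 2.
// Accuracy hinges on timing precision, not range, so it stays roughly
// constant until the echo becomes too faint to detect.
let c = 299_792_458.0  // speed of light, in m/s

func distance(forRoundTrip seconds: Double) -> Double {
    c * seconds / 2
}

// A target at the sensor's quoted 5-meter limit echoes back in ~33ns:
let roundTrip = 2 * 5.0 / c
print(String(format: "round trip: %.1f ns", roundTrip * 1e9))
print(String(format: "recovered distance: %.3f m", distance(forRoundTrip: roundTrip)))
```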
So they are different sensors for different uses. One is appropriate for Face ID, while the other is appropriate for AR.