Apple to launch iPad Pro with 3D sensing, 'iPhone SE 2' in first half of 2020
Apple is on schedule to debut an iPad Pro model with new 3D sensing equipment, as well as a long-rumored iPhone SE follow-up, in the first half of 2020, according to TF International Securities analyst Ming-Chi Kuo.
In a note to investors on Wednesday, Kuo said Apple will integrate a time-of-flight (ToF) sensor into the iPad Pro's rear-facing camera system, reports MacRumors.
Like the TrueDepth camera system, ToF systems generate highly accurate depth maps using light. But instead of analyzing structured light patterns, like those emitted by TrueDepth's VCSEL dot projector, ToF modules measure the time it takes pulses of light to travel to and from a target surface. Not only is the resulting depth data more accurate, but ToF sensors can operate at longer distances than structured light solutions.
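The arithmetic behind that round-trip measurement is straightforward. Here is a minimal sketch in Swift of the underlying depth calculation, an illustration of the ToF principle rather than anything specific to Apple's hardware:

```swift
import Foundation

// Speed of light in a vacuum, in meters per second.
let speedOfLight = 299_792_458.0

// A ToF sensor times a light pulse's round trip to the target and
// back, so the one-way depth is half the round-trip distance.
func depth(fromRoundTripTime seconds: Double) -> Double {
    speedOfLight * seconds / 2.0
}

// Example: a pulse that returns after ~6.67 nanoseconds has covered
// about 2 meters round trip, i.e. a surface roughly 1 meter away.
print(depth(fromRoundTripTime: 6.67e-9)) // ≈ 1.0
```

The nanosecond scale of those timings is part of why ToF modules need dedicated sensing hardware rather than an ordinary camera.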
Apple's interest in the advanced imaging technology dates back to early 2018 and is expected to see use in augmented reality applications. More recently, reports claimed the company was looking at 3D sensor hardware from existing camera supplier Sony.
Kuo's prediction lines up with a report from Bloomberg, which earlier this week said Apple is planning to release a next-generation iPad with a 3D sensor and dual-camera array in 2020.
Beyond iPad Pro, the analyst reiterated previous forecasts of a so-called "iPhone SE 2" launch sometime in the first half of 2020.
Much to the chagrin of fans of small-form-factor handsets, the budget iPhone is expected to sport a design and screen size borrowed from iPhone 8. Kuo believes Apple will rely on its A13 Bionic processor, introduced with iPhone 11 and 11 Pro, paired with 3GB of RAM, to power the device.
Apple's "iPhone SE 2" is anticipated to carry a price tag of $399 and come in 64GB and 128GB varieties.
Comments
oh no no no. I ain’t playing this game. Sure there’s always something great just six months out, then another six months, then another. The next thing you know it’s 2019 and you’re still running an iPhone 3GS. I am getting an 11 and that’s that.
But the screen resolution needs to be much higher (double?) to get me to buy.
Cost is not an issue.
The one I have is "good enough".
Really hope it comes in the first quarter of 2020.
You can already do 3D scanning with the front-facing VCSEL hardware in the TrueDepth camera system; as I said, I'm already doing it with my iPhone X. The app I mentioned can output common 3D formats that I can open in Xcode for VR, or import into any 3D app, etc. So I'm not sure what you're talking about.
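Something like this with ARKit's public face-tracking API gets you the raw TrueDepth mesh (just a sketch, not the exact app I mentioned, and I'm leaving the export to OBJ or another format out):

```swift
import ARKit

// Sketch: reading the TrueDepth-derived face mesh via ARKit face
// tracking. Each ARFaceAnchor update carries a triangle mesh whose
// vertices could be serialized to a common 3D format such as OBJ.
final class FaceMeshCapture: NSObject, ARSessionDelegate {
    let session = ARSession()

    func start() {
        // Face tracking requires a TrueDepth camera.
        guard ARFaceTrackingConfiguration.isSupported else { return }
        session.delegate = self
        session.run(ARFaceTrackingConfiguration())
    }

    func session(_ session: ARSession, didUpdate anchors: [ARAnchor]) {
        for case let faceAnchor as ARFaceAnchor in anchors {
            let mesh = faceAnchor.geometry
            // mesh.vertices and mesh.triangleIndices describe the
            // scanned surface; writing them to disk is omitted here.
            print("Captured \(mesh.vertices.count) vertices")
        }
    }
}
```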
I'm not "assuming", this is literally from the article:
Now if you're going to generate a 3D map of an area, why would you put those sensors on the front-facing camera array?