Portrait Mode on iPhone SE relies only on machine learning
Apple's new iPhone SE is the company's first -- and thus far only -- iPhone to rely solely on machine learning for Portrait Mode depth estimation.
The iPhone SE can create depth maps from a single 2D image using machine learning. Credit: Halide
The iPhone SE, released in April, appears to be largely a copy of the iPhone 8, right down to the single-lens camera. But, under the hood, there's much more going on for depth estimation than in any iPhone before it.
According to a blog post from the makers of camera app Halide, the iPhone SE is the first in Apple's lineup to use "Single Image Monocular Depth Estimation." That means it's the first iPhone that can create a portrait blur effect using just a single 2D image.
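Halide's post doesn't detail Apple's private model, but the same single-image technique is available to third-party developers through the public Vision and Core ML frameworks. Here's a minimal sketch, assuming a bundled monocular depth model whose generated class is named FCRNDepthPrediction (Apple distributes a similar FCRN depth model in its Core ML gallery; the class name here is hypothetical and is not Apple's Portrait Mode network):

```swift
import CoreGraphics
import CoreML
import Vision

// Sketch of single-image monocular depth estimation with Vision + Core ML.
// "FCRNDepthPrediction" is a stand-in for any bundled depth model; it is
// NOT Apple's private Portrait Mode network.
func estimateDepth(from image: CGImage) throws -> MLMultiArray? {
    let coreMLModel = try FCRNDepthPrediction(configuration: MLModelConfiguration()).model
    let model = try VNCoreMLModel(for: coreMLModel)
    let request = VNCoreMLRequest(model: model)
    request.imageCropAndScaleOption = .scaleFill

    try VNImageRequestHandler(cgImage: image, options: [:]).perform([request])

    // The output is a grid of per-region depth values -- the depth map
    // used to separate the subject from the background before blurring.
    let observation = request.results?.first as? VNCoreMLFeatureValueObservation
    return observation?.featureValue.multiArrayValue
}
```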
In past iPhones, Portrait Mode has required at least two cameras. That's because the best source of depth information has long been the comparison of two images captured from slightly different positions. Once the system compares those images, it can separate the subject of a photo from the background, allowing for the blurred-background or "bokeh" effect.
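The geometry behind that two-camera approach is simple triangulation: nearby objects shift more between the two views than distant ones. A minimal sketch under a pinhole-camera model (the numbers below are illustrative, not Apple's actual camera parameters):

```swift
// Stereo depth from disparity: depth = focalLength * baseline / disparity.
// The smaller the shift (disparity) of a point between the two views,
// the farther away it is.
func depthInMeters(focalLengthPixels: Double,
                   baselineMeters: Double,
                   disparityPixels: Double) -> Double {
    return focalLengthPixels * baselineMeters / disparityPixels
}

// E.g., with a 2,800-pixel focal length and lenses 10 mm apart, a feature
// that shifts 20 pixels between the two images is about 1.4 m away.
let distance = depthInMeters(focalLengthPixels: 2800,
                             baselineMeters: 0.01,
                             disparityPixels: 20)
```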
The iPhone XR changed that, introducing Portrait Mode support through the use of sensor "focus pixels," which could produce a rough depth map. But while the new iPhone SE has focus pixels, its older sensor lacks the coverage required for depth mapping.
"The new iPhone SE can't use focus pixels, because its older sensor doesn't have enough coverage," Halide's Ben Sandofsky wrote. An iFixit teardown revealed on Monday that the iPhone SE's camera sensor is basically interchangeable with the iPhone 8's.
Instead, the entry-level iPhone produces depth maps entirely through machine learning. That also means it can produce Portrait Mode photos from both its front- and rear-facing cameras, a feat undoubtedly made possible by the top-of-the-line A13 Bionic chip inside.
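However the depth map is generated -- dual cameras, focus pixels, or the SE's neural network -- it is embedded in the finished photo as auxiliary data, which is how apps like Halide can read and reuse it. A minimal sketch using the public ImageIO and AVFoundation APIs:

```swift
import AVFoundation
import ImageIO

// Sketch: read the disparity/depth map an iPhone stores inside a
// Portrait photo's auxiliary data.
func depthData(fromImageAt url: URL) -> AVDepthData? {
    guard let source = CGImageSourceCreateWithURL(url as CFURL, nil),
          let info = CGImageSourceCopyAuxiliaryDataInfoAtIndex(
              source, 0, kCGImageAuxiliaryDataTypeDisparity) as? [AnyHashable: Any]
    else { return nil }
    return try? AVDepthData(fromDictionaryRepresentation: info)
}
```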
The depth information isn't perfect, Halide points out, but it's an impressive feat given the relative hardware limitations of a three-year-old, single-sensor camera setup. Additionally, Apple's own Portrait Mode on the iPhone SE works only on people, but Halide says the new version of its app enables bokeh effects on non-human subjects as well.
Comments
Still, it's cool that they added the feature; since it's software-based, I'm hoping they can add it to other older phones as well.
That said, all smartphone portrait modes have mixed results: sometimes they're good, sometimes they suck, and there are some things that can't be properly simulated (e.g., objects that have lensed the background, or a curved reflection).
I did take a couple of photos of old framed pictures last year for another family member, but as they already have realistic bokeh from the time they were taken over 60 years ago, adding more would result in something fake-looking. I can't imagine ever wanting to use it that way myself, but it's there if someone wants to do it anyway.
Now, as far as Apple using machine learning alone (out of necessity) to apply the portrait effect to three-dimensional scenes with just the one lens: that's cool, and very well done too, IMO. But I'd be shocked if it requires the processing power of an A13. It's become a relatively common feature on recent lower-priced smartphones with lesser processors and resources, some running even a Qualcomm 215 processor and just 1GB of RAM.