Apple's first iOS 10.1 beta enables iPhone 7 Plus bokeh 'Portrait' mode
Promised to arrive later this year, the new "Portrait" mode utilizing the dual-lens design of the iPhone 7 Plus camera is available to test now with the newly released iOS 10.1 beta for developers.
Via TechCrunch.
Though it wasn't mentioned in the release notes, "Portrait" is now available to select in the native Camera app on iPhone 7 Plus units running iOS 10.1 beta 1. Initial photos demonstrating the new capabilities were published on Wednesday by TechCrunch.
When shooting photos in "Portrait" mode, users must lock onto their subject to separate it from the background. This simulates what is known as a "bokeh" effect in photography.
Instructions at the bottom of the screen inform the user whether or not there is enough light in their shot, and also whether they are too close or too far from the subject. Photos captured in this mode are labeled with "Depth Effect."
With proprietary range-finding technology, the iPhone 7 Plus dual cameras can produce a selectively out-of-focus portrait. While the feature was demonstrated at the iPhone 7 Plus unveiling earlier this month, it did not ship with the device and is set to arrive in the iOS 10.1 software update later this fall.
Comments
(Not only the subject; they have to mask a tree behind them as another layer as well. This is Hollywood.)
From the Cambridge in Colour web site:
In this picture (shown at small scale), taken with a 100-400 mm Canon L lens, note the bokeh shapes in the background.
EDIT: My comment was simply to point out that Apple is not creating a true bokeh effect, but rather a background blur.
The blur is the easiest part.
I suspect that, just as with real DOF effects, the results will vary by photographer. No amount of tech will make any given photo awesome.
Aaaaand this is why art people are so annoying...
It looks like Apple is using a simple Gaussian Blur effect in this v1 feature. This can be rendered pretty much instantly in software.
Photoshop has a sophisticated Lens Blur filter that is so processor-intensive it still can't be applied to a Smart Object, but it produces a fairly realistic synthetic bokeh. It even allows an alpha channel to be used as a depth map.
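To make that concrete, here is a minimal Core Image sketch (Swift) of the two-layer approach being described: blur the whole frame with a Gaussian, then use a grayscale mask to composite the sharp subject back on top. The function name, the default radius, and the assumption that a subject mask already exists are all mine; this illustrates the technique, not Apple's actual Portrait pipeline.

```swift
import CoreImage

// Illustrative sketch: composite a sharp subject over a Gaussian-blurred
// background using a grayscale mask (white = keep sharp, black = blur).
// Nothing here is Apple's actual Portrait implementation.
func simulatedPortrait(image: CIImage, subjectMask: CIImage, radius: Double = 12) -> CIImage? {
    // Blur the entire frame; this becomes the "background" layer.
    guard let blur = CIFilter(name: "CIGaussianBlur") else { return nil }
    blur.setValue(image, forKey: kCIInputImageKey)
    blur.setValue(radius, forKey: kCIInputRadiusKey)
    // Gaussian blur grows the image extent, so crop back to the original.
    guard let blurred = blur.outputImage?.cropped(to: image.extent) else { return nil }

    // Where the mask is white, keep the sharp image; elsewhere, show the blur.
    guard let blend = CIFilter(name: "CIBlendWithMask") else { return nil }
    blend.setValue(image, forKey: kCIInputImageKey)
    blend.setValue(blurred, forKey: kCIInputBackgroundImageKey)
    blend.setValue(subjectMask, forKey: kCIInputMaskImageKey)
    return blend.outputImage
}
```

Note the hard subject/background split: a plain Gaussian treats every background pixel the same, which is exactly why it lacks the graded, distance-dependent character of real lens bokeh.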
As another poster pointed out, manually drawing a mask takes a skilled artist a lot of time. I do lots of photo editing, so I can back up what that poster says.
The real breakthrough here is being able to automatically create the depth map AT ALL. That is the trick upon which all the rest of this feature is built.
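For intuition about that trick, here is a toy sketch of the stereo principle a dual camera can exploit: a near object shifts more between the two lens positions than a far one, so per-pixel shift (disparity) is a raw depth signal. Apple has not published how Portrait mode actually builds its map, so the 1-D block matching below, with its made-up window and shift limits, is purely illustrative.

```swift
// Toy 1-D disparity sketch (illustrative only; not Apple's method).
// `left` and `right` are matching rows of grayscale pixels (0-255) from two
// horizontally offset lenses. Larger returned shifts mean closer objects.
func disparityRow(left: [Int], right: [Int], window: Int = 3, maxShift: Int = 16) -> [Int] {
    var out = [Int](repeating: 0, count: left.count)
    for x in 0..<left.count {
        var bestShift = 0
        var bestCost = Int.max
        for shift in 0...maxShift {
            // Sum of absolute differences over a small window around x.
            var cost = 0
            for dx in -window...window {
                let lx = x + dx
                let rx = lx - shift
                if lx < 0 || lx >= left.count || rx < 0 || rx >= right.count {
                    cost += 255 // penalize samples that fall off the row
                } else {
                    cost += abs(left[lx] - right[rx])
                }
            }
            if cost < bestCost {
                bestCost = cost
                bestShift = shift
            }
        }
        out[x] = bestShift // the row of shifts is a crude depth map slice
    }
    return out
}
```

Real pipelines add calibration, smoothing, and occlusion handling on top of this, which is where fine detail like the frayed rope ends mentioned below gets hard.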
Obvious improvements to this v1 feature:
- Capturing a more detailed depth map (like being able to resolve the frayed rope ends in the parrot image) would be a huge improvement.
- Finding the GHz to power a more realistic, Photoshop-level Lens Blur seems inevitable, as the A-series chips get so much better every year. I don't want to minimize the complexity (and probably the engineering around Adobe patents) that would be involved in replicating Photoshop's Lens Blur on an iPhone, but Apple seems to find ways to implement things.
These obvious improvements give Apple a clear roadmap to follow as they develop this feature.

What I would LOVE:
Let the iPhone spit out its depth map channel as an 8-bit grayscale image so I can process blur (or some other effect) myself in post. I'm assuming this is something third-party camera apps will make available at some point. There have to be lots of creative ways a quick depth map could be used in image editing.
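Core Image already has a filter pointed at exactly this use: CIMaskedVariableBlur varies the blur radius per pixel according to a grayscale mask. A minimal sketch, assuming a hypothetical depthMap image where brighter pixels mean farther away; how such a map would actually be exported from the iPhone is the speculative part.

```swift
import CoreImage

// Sketch: drive a distance-dependent blur straight from a grayscale depth
// map. `depthMap` is an assumed input (white = far = most blur); nothing
// here reflects a real iPhone depth-map export API.
func depthDrivenBlur(image: CIImage, depthMap: CIImage, maxRadius: Double = 15) -> CIImage? {
    guard let filter = CIFilter(name: "CIMaskedVariableBlur") else { return nil }
    filter.setValue(image, forKey: kCIInputImageKey)
    filter.setValue(depthMap, forKey: "inputMask") // per-pixel radius scale
    filter.setValue(maxRadius, forKey: kCIInputRadiusKey)
    return filter.outputImage?.cropped(to: image.extent)
}
```

Unlike the flat Gaussian composite sketched earlier, this gives a graded fall-off: near-background detail blurs less than the far background, which is closer to how real depth of field behaves.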