Photographer showcases upcoming Portrait mode using Apple's iPhone 7 Plus at wedding
A Reddit user claiming to be a professional photographer has declared the iPhone 7 Plus's Portrait mode in iOS 10.1 an excellent, high-quality photography tool when coupled with the appropriate user skillset.
Reddit member "Rytterfalk" used the iPhone 7 Plus for about 70 percent of the shots at a wedding. The photographer said that some shots still required a DSLR because of how the photos would be used later, but that it was "much harder to get the moment the same way as you can with a phone."
All shots were taken with natural light, according to the photographer.
"Portrait" is now available to select in the native Camera app on iPhone 7 Plus units running any of the iOS 10.1 betas. When shooting photos in "Portrait" mode, users must lock onto their subject to separate it from the background, forcing the iPhone to simulate what is known as a "bokeh" effect in photography.
Instructions at the bottom of the screen inform the user whether or not there is enough light in their shot, and also whether they are too close or too far from the subject. Photos captured in this mode are labeled with "Depth Effect."
With proprietary range finding technology, the iPhone 7 Plus dual cameras can produce a selectively out-of-focus portrait. While the feature was demonstrated at the iPhone 7 Plus unveiling, it did not ship with the device and is set to arrive for all users with the full iOS 10.1 software update later this fall.
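Conceptually, the effect amounts to varying blur strength with each pixel's estimated distance from the focal plane. A minimal sketch of that idea in Python follows, with the caveat that Apple's actual pipeline is proprietary; the plain Gaussian blur here merely stands in for whatever kernel Apple really uses.

```python
# Hypothetical sketch of depth-dependent blur. Apple's real pipeline is
# proprietary; this only illustrates the general idea of blurring pixels
# more the farther they sit from the focal plane.
import cv2
import numpy as np

def depth_blur(image, depth, focus_depth, max_radius=15):
    """Blur each pixel in proportion to its distance from the focal plane.

    image:       HxWx3 uint8 photo
    depth:       HxW float array of estimated per-pixel distances
    focus_depth: distance of the subject the user locked onto
    """
    # Normalize |depth - focus_depth| into a 0..1 blur weight.
    dist = np.abs(depth - focus_depth)
    weight = np.clip(dist / (dist.max() + 1e-6), 0.0, 1.0)[..., None]

    # Cross-fade between the sharp frame and a heavily blurred copy.
    # (A real implementation would vary the kernel size continuously.)
    blurred = cv2.GaussianBlur(image, (2 * max_radius + 1,) * 2, 0)
    out = image * (1.0 - weight) + blurred * weight
    return out.astype(np.uint8)
```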
Comments
I wish people would stop calling Portrait mode "bokeh," because it is not. It is just a blurred background. Bokeh is different: it has individual convolution areas and swirls, not just a Gaussian blur. I have to admit the masking capability is impressive. If I could get the masking feature alone, without the blur, it would be helpful in compositing work.
That picture of the bride and groom together looks terrible.
The depth of field looks OK in these pictures (better than none), but I have not yet seen a bokeh effect anywhere.
Apple's implementation is not a Gaussian blur, and it takes into account distance from the in-focus plane. It most certainly is (simulated) bokeh.
No, that's not the definition of bokeh; that's a description of one attribute of bokeh that can vary (for example, the patterns can take almost any solid shape).
During this testing, someone needs to use a scene with some point-like light sources. Apple's implementation doesn't use a Gaussian blur, so there should be proper bokeh-like effects.
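That test is easy to mock up in software. The rough sketch below (assuming NumPy and SciPy, nothing Apple-specific) contrasts what a disc-shaped aperture kernel and a Gaussian blur each do to a single point light:

```python
# Sketch of the proposed test: apply a disc-shaped aperture kernel and a
# Gaussian blur to the same point light and compare the highlight shapes.
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

# A dark scene containing a single point-like light source.
scene = np.zeros((101, 101))
scene[50, 50] = 255.0

# Disc kernel approximating a circular aperture; convolving with it turns
# point lights into crisp bright circles ("bokeh balls").
r = 10
y, x = np.mgrid[-r:r + 1, -r:r + 1]
disc = (x**2 + y**2 <= r**2).astype(float)
disc /= disc.sum()

bokeh_like = convolve(scene, disc)          # flat, hard-edged bright disc
gaussian = gaussian_filter(scene, sigma=5)  # soft, featureless falloff
```

The disc result is a flat-topped, sharply bounded highlight; the Gaussian result fades smoothly, which is precisely the visual difference being argued over in this thread.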
What some people don't realize is that while bokeh often yields polygonal blur artifacts (five-sided, for example), the shape depends on the number of blades in the lens's aperture diaphragm, not the shutter; the more blades a lens has, the more circular the artifact. Apple is going for a smooth circular highlight rather than a typical five-sided artifact.
This is how Wikipedia illustrates it: 1) no blur, 2) bokeh, 3) Gaussian blur.
P.S. Is it just me, or does the rose gold look a little dated next to the Jet Black color?
The problem is that it's not a Gaussian blur. Just because you can't tell the difference doesn't make it so.
This wouldn't be a problem if you were just ignorant, but you are making a strong claim based on ignorance. Those "examples" you are talking about are just a subset of bokeh. Bokeh itself is just the out-of-focus area of a photograph. Every normal camera has bokeh, even past iPhones. Most iPhone photos just don't show the same qualities as the examples you are referring to.
Those extreme effects you are thinking about are most noticeable when there is a small, bright light source contrasted against a darker area, like with Christmas lights. You don't see that as readily in photos without such light sources.
Another place you find it is when looking through trees where sunlight or bright sky shines through; you can see this in the first photo. If it were just a Gaussian blur, the darker tree area would blur into the bright sky exactly as much as the sky blurs into the tree, creating a simple gradient rather than the true bokeh effect you see in that photo.
As for how it's created, let's think about this: the 7 Plus has two cameras. Couldn't you use the longer lens to capture the main subject (the entire image would probably need to be in frame for this camera), then use the shorter lens to capture the same scene slightly out of focus, and use a mask around the main subject to merge the two images together? From the article there seems to be a fairly strict positioning and distance requirement for taking these photos, which may be due to the need to frame the scene within the longer lens.
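A toy version of that mask-and-merge guess might look like the following. This is entirely hypothetical: Apple has not documented its pipeline, and the subject mask here is assumed to come from some separate segmentation or depth step.

```python
# Toy version of the commenter's mask-and-merge guess. Entirely
# hypothetical: how Apple actually derives the subject mask (presumably
# depth from the dual cameras) has not been made public.
import cv2
import numpy as np

def merge_portrait(sharp, defocused, subject_mask, feather=21):
    """Composite an in-focus subject over an out-of-focus background.

    sharp:        HxWx3 frame with the subject in focus (tele lens)
    defocused:    HxWx3 frame of the same scene, out of focus (wide lens,
                  assumed pre-aligned to the tele frame)
    subject_mask: HxW float array, 1.0 on the subject, 0.0 elsewhere
    """
    # Feather the mask so the subject's edge blends instead of cutting hard.
    mask = cv2.GaussianBlur(subject_mask.astype(np.float32),
                            (feather, feather), 0)[..., None]
    out = sharp * mask + defocused * (1.0 - mask)
    return out.astype(np.uint8)
```

The hard parts, of course, are producing that mask and aligning the two frames in the first place, which is plausibly where the strict framing and distance requirements come from.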
Speaking of which, the photographer's own statements don't really jibe with one another. Sounds like there are still far more limiting factors than with a DSLR.
Don't get me wrong, the iPhone camera has come a long way, and my 6s has finally replaced my point-and-shoot (it's a pretty good point-and-shoot). But if a professional photographer said they were going to shoot my wedding with an iPhone 7 Plus, they would not get the job.
All that is being pushed here is the potential for a bunch of wannabe armchair picture-takers thinking they have what it takes to be a "real" photographer.
I'm just cracking up at the back-and-forth ranting in the "bokeh" discussion. Apple's implementation is nowhere near true bokeh. They're passing off a simple background blur - whatever "Gaussian" or other name you want to call it - and the posters here are claiming it to be bokeh. It is not. It's nice, but it has nothing to do with bokeh. The aperture blades found in cameras also contribute to what real bokeh looks like, in addition to how the depth of field is rendered, in a way only real lenses can do.
https://photographylife.com/what-is-bokeh
That's all that I'm saying. It is bokeh, in spite of what snobs are saying. They don't have to like the bokeh, but it's there (and it's not just a Gaussian blur).
No, it's not just a simple blur. You are confusing "something I like" with "bokeh". It's the No True Scotsman fallacy: "That's not a pizza, it has chicken on it." "That's not bokeh, it doesn't look as nice as my DSLR!"
Bokeh is simply the noticeably out-of-focus area of a photo. That's it. All iPhones have had bokeh, but due to the wide depth of field, it's very limited. This is a software process that simulates bokeh NOT with just a simple blur: it takes distance into account and shows bright areas expanding into darker areas, rather than just blurring the two into each other.
Are SLRs better? Almost universally, yes; you don't even have to look at any test photos to tentatively assume this. No one is saying it's just as good as a nice Nikon or Canon with a fast lens. But it's nice, impressive, not just a Gaussian blur, and a good simulation of bokeh.