Apple's first iOS 10.1 beta enables iPhone 7 Plus bokeh 'Portrait' mode

Promised to arrive later this year, the new "Portrait" mode, which utilizes the dual-lens design of the iPhone 7 Plus camera, is now available to test with the newly released iOS 10.1 developer beta.


[Image via TechCrunch]


Though it wasn't mentioned in the release notes, "Portrait" is now available to select in the native Camera app on iPhone 7 Plus units running iOS 10.1 beta 1. Initial photos demonstrating the new capabilities were published on Wednesday by TechCrunch.

When shooting photos in "Portrait" mode, users must lock onto their subject to separate it from the background. This simulates what is known as a "bokeh" effect in photography.

Instructions at the bottom of the screen inform the user whether there is enough light in the shot, and whether they are too close to or too far from the subject. Photos captured in this mode are labeled with "Depth Effect."

Using proprietary range-finding technology, the dual cameras of the iPhone 7 Plus can produce a selectively out-of-focus portrait. While the feature was demonstrated at the iPhone 7 Plus unveiling earlier this month, it did not ship with the device and is set to arrive in the iOS 10.1 software update later this fall.

Comments

  • Reply 1 of 63
roake Posts: 821 member
    I wonder if we will see features like post-photo refocusing, 3D photos (perhaps with parallax effect), etc.
  • Reply 2 of 63
Not bad. I wasn't sure what to expect, and I was afraid it might not be interesting at all, but that is pretty good. It's no portrait lens on a DSLR, but it is pretty amazing. Given that phones, and the iPhone in particular, are already the most-used cameras, I don't see how point-and-shoot cameras are still being made with photo results like these.
  • Reply 3 of 63
    Woah... how can they mask the subject in real time?

(Not only the subject; they have to mask a tree behind them as another layer as well. This is Hollywood.)
    edited September 2016
  • Reply 4 of 63
MacPro Posts: 19,851 member
    True Bokeh isn't simply a blur.  

From the Cambridge in Colour website:

    "Note that depth of field only sets a maximum value for the circle of confusion, and does not describe what happens to regions once they become out of focus. These regions are also called "bokeh," from Japanese (pronounced bo-ké). Two images with identical depth of field may have significantly different bokeh, as this depends on the shape of the lens diaphragm. In reality, the circle of confusion is usually not actually a circle, but is only approximated as such when it is very small. When it becomes large, most lenses will render it as a polygonal shape with 5-8 sides."

In this picture (shown at small scale), which I took with a Canon 100-400mm L lens, note the shapes of the bokeh in the background.
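For anyone who wants to put numbers to that quote, the textbook thin-lens approximation of the circle-of-confusion diameter is easy to compute. A quick Python sketch (standard optics, nothing iPhone-specific):

```python
# Thin-lens circle-of-confusion diameter on the sensor, in mm.
# f_mm: focal length, f_number: aperture, focus_mm: focused distance,
# subject_mm: distance of the out-of-focus point (same units throughout).
def coc_diameter(f_mm, f_number, focus_mm, subject_mm):
    aperture = f_mm / f_number  # physical aperture diameter
    return aperture * abs(subject_mm - focus_mm) / subject_mm * f_mm / (focus_mm - f_mm)

# A 100 mm lens at f/5.6 focused at 3 m renders a background at 10 m
# as ~0.43 mm discs on the sensor, far beyond any sharpness threshold.
print(coc_diameter(100, 5.6, 3_000, 10_000))
```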

edited September 2016
  • Reply 5 of 63
    The user and all related content has been deleted.
  • Reply 6 of 63
Now that I know how to pronounce "bokeh" I feel better, but please don't bring up another discussion of the maximum value for the circle of confusion. Remember, this is a phone after all.
  • Reply 7 of 63
MacPro Posts: 19,851 member
    Sox129 said:
Now that I know how to pronounce "bokeh" I feel better, but please don't bring up another discussion of the maximum value for the circle of confusion. Remember, this is a phone after all.
I didn't; that was just a quote from an excellent website for photography beginners.

EDIT: My comment was simply to point out that Apple is not creating a bokeh effect, but rather a background blur.
edited September 2016
  • Reply 8 of 63
    matrix077 said:
    Woah... how can they mask the subject in real time?
    I presume the two cameras create a depth map and then software applies blur according to a distance algorithm. Remember years ago when Jony explained that the blur in iOS and macOS was non-obvious and actually took a lot of effort to get right? They may be using the same blur technique here.
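If that presumption is right, the core pipeline is only a few steps: depth map in, mask the subject, blur the rest, composite. A minimal sketch of that idea in Python with NumPy and Pillow (the file names and the mid-range depth threshold are hypothetical; this is a guess at the approach, not Apple's actual pipeline):

```python
import numpy as np
from PIL import Image, ImageFilter

# Hypothetical inputs: the photo plus an 8-bit depth map (bright = near).
photo = Image.open("photo.png").convert("RGB")
depth = np.asarray(Image.open("depth.png").convert("L"))

subject = depth > 128  # everything nearer than mid-depth is "the subject"

# Blur the whole frame once, then composite the sharp subject back on top.
blurred = photo.filter(ImageFilter.GaussianBlur(radius=8))
out = np.where(subject[..., None], np.asarray(photo), np.asarray(blurred))
Image.fromarray(out.astype(np.uint8)).save("portrait.png")
```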
  • Reply 9 of 63
    The TechCrunch article actually explains the effect quite well: https://techcrunch.com/2016/09/21/hands-on-with-the-iphone-7-plus-crazy-new-portrait-mode/
  • Reply 10 of 63
When I took photography in HS decades ago, we just called this shallow depth of field. None of this here fancy-language bokeh stuff (from the Japanese word bo-ke that means what, exactly?).
  • Reply 11 of 63
    MacProMacPro Posts: 19,851member
    shogun said:
    The TechCrunch article actually explains the effect quite well: https://techcrunch.com/2016/09/21/hands-on-with-the-iphone-7-plus-crazy-new-portrait-mode/
Good article, and note that it never used the term 'bokeh'.
    edited September 2016
  • Reply 12 of 63
    matrix077 said:
    Woah... how can they mask the subject in real time?

(Not only the subject; they have to mask a tree behind them as another layer as well. This is Hollywood.)
    100 billion operations in 25ms on the ISP. That's how.
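For scale, that figure, as quoted, works out to about four trillion operations per second sustained:

```python
ops, seconds = 100e9, 25e-3
print(f"{ops / seconds:,.0f} ops/s")  # 4,000,000,000,000, i.e. ~4 teraops sustained
```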
  • Reply 13 of 63
    shogun said:
    matrix077 said:
    Woah... how can they mask the subject in real time?
    I presume the two cameras create a depth map and then software applies blur according to a distance algorithm. Remember years ago when Jony explained that the blur in iOS and macOS was non-obvious and actually took a lot of effort to get right? They may be using the same blur technique here.
Using Photoshop, masking the subject and the tree behind her would take me an hour at least, and that's a rough mask. We're talking about hair and a tree trunk here.
    The blur is the easiest part. 
  • Reply 14 of 63
    Sox129 said:
Now that I know how to pronounce "bokeh" I feel better, but please don't bring up another discussion of the maximum value for the circle of confusion. Remember, this is a phone after all.
I didn't; that was just a quote from an excellent website for photography beginners. I'm sure Apple programmers are more than capable of mimicking true bokeh effects. That example above from AI / Apple is just horrible though; it looks like a bad Photoshop job.
That example is from TechCrunch.

I suspect that, just as with real DOF effects, the results will vary by photographer. No amount of tech will make any given photo awesome.
  • Reply 15 of 63
    I find Apple's "depth effect" to be little different than what software has been doing for decades - just more automated and easier for the everyday user. Even as an amateur photographer, it's usually easy for me to spot the difference between real and simulated bokeh. I suppose it can fool the masses, though. Then again, the masses also think Miley Cyrus = "good music" and Mozart = "bad music". :smile: (DISCLAIMER: I don't care for classical music, but I respect it.)
  • Reply 16 of 63
    shogun said:
    The TechCrunch article actually explains the effect quite well: https://techcrunch.com/2016/09/21/hands-on-with-the-iphone-7-plus-crazy-new-portrait-mode/
Good article, and note that it never used the term 'bokeh'.
Because there is no bokeh. Your parrot photo, however, has plenty.
  • Reply 17 of 63
    zroger73 said:
    I find Apple's "depth effect" to be little different than what software has been doing for decades - just more automated and easier for the everyday user. Even as an amateur photographer, it's usually easy for me to spot the difference between real and simulated bokeh. I suppose it can fool the masses, though. Then again, the masses also think Miley Cyrus = "good music" and Mozart = "bad music". :smile: (DISCLAIMER: I don't care for classical music, but I respect it.)
    Easy. This is just v.1. 
  • Reply 18 of 63
    True Bokeh isn't simply a blur.  

Aaaaand this is why art people are so annoying...
  • Reply 19 of 63
    True Bokeh isn't simply a blur.  

Aaaaand this is why art people are so annoying...
    Well, 'tis true that there's more to bokeh than averaging pixels. :neutral: 

edited September 2016
  • Reply 20 of 63
polymnia Posts: 1,080 member
    True Bokeh isn't simply a blur.  

From the Cambridge in Colour website:

    "Note that depth of field only sets a maximum value for the circle of confusion, and does not describe what happens to regions once they become out of focus. These regions are also called "bokeh," from Japanese (pronounced bo-ké). Two images with identical depth of field may have significantly different bokeh, as this depends on the shape of the lens diaphragm. In reality, the circle of confusion is usually not actually a circle, but is only approximated as such when it is very small. When it becomes large, most lenses will render it as a polygonal shape with 5-8 sides."

In this picture (shown at small scale), which I took with a Canon 100-400mm L lens, note the shapes of the bokeh in the background.

    This image is a great example of what Apple should be aiming for.

    It looks like Apple is using a simple Gaussian Blur effect in this v1 feature. This can be rendered pretty much instantly in software.

Photoshop has a sophisticated Lens Blur filter which is so processor-intensive that it still can't be applied to a Smart Object, but it does produce a fairly realistic synthetic bokeh. It even allows an alpha channel to be used as a depth map.
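The difference polymnia is pointing at comes down to the kernel: a Gaussian falls off smoothly, while a real lens spreads each defocused point into a hard-edged disc shaped like the aperture. A quick way to see the two side by side, assuming NumPy and SciPy are available:

```python
import numpy as np
from scipy.ndimage import convolve, gaussian_filter

# One bright point on a dark field: what does each blur turn it into?
img = np.zeros((65, 65))
img[32, 32] = 1.0

gauss = gaussian_filter(img, sigma=6)  # smooth falloff, no defined edge

# A normalized uniform disc kernel approximates a circular aperture.
r = 12
y, x = np.mgrid[-r:r + 1, -r:r + 1]
disc = (x**2 + y**2 <= r**2).astype(float)
disc /= disc.sum()
lens = convolve(img, disc)  # hard-edged disc: the "bokeh ball" look

# The disc spreads the point's energy evenly; the Gaussian peaks at center.
print(gauss.max(), lens.max())
```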

As another poster pointed out, manually drawing a mask takes a skilled artist a lot of time. I do a lot of photo editing, so I can back up what that poster says.

    The real breakthrough here is being able to automatically create the depth map AT ALL. That is the trick upon which all the rest of this feature is built.
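For the curious, the textbook way two side-by-side cameras recover depth is stereo triangulation: a point's horizontal shift (disparity) between the two views is inversely proportional to its distance. A toy illustration with made-up numbers (Apple's actual calibration and matching are certainly far more involved):

```python
# Classic stereo relation: depth = focal_length * baseline / disparity.
def depth_from_disparity(focal_px, baseline_mm, disparity_px):
    return focal_px * baseline_mm / disparity_px

# Hypothetical numbers: ~2800 px focal length, ~10 mm between the two lenses.
for d_px in (40, 20, 10):  # halving the disparity doubles the estimated depth
    print(d_px, "px ->", depth_from_disparity(2800, 10, d_px), "mm")
```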

    Obvious improvements to this v1 feature:
    • Capturing a more detailed depth map (like being able to resolve the frayed rope ends in the parrot image) would be a huge improvement.
• Finding the GHz to power a more realistic, Photoshop-level Lens Blur is an obvious improvement that seems inevitable as the A-series chips get so much better every year. I don't want to minimize the complexity (and probably the engineering around Adobe patents) that would be involved in replicating the Lens Blur from Photoshop on an iPhone, but Apple seems to find ways to implement things.
    These obvious improvements give Apple a clear roadmap to follow as they develop this feature.

    What I would LOVE:
Let the iPhone spit out its depth map channel as an 8-bit grayscale image so I can process blur (or some other effect) myself in post. I'm assuming this will be something third-party camera apps will make available at some point. There have to be lots of creative ways a quick depth map could be used in image editing.
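If such a grayscale depth channel were ever exposed, using it in post could be as simple as banding the depth into a few blur strengths. A hypothetical sketch (file names invented, Pillow/NumPy assumed):

```python
import numpy as np
from PIL import Image, ImageFilter

photo = Image.open("photo.png").convert("RGB")
depth = np.asarray(Image.open("depth_map.png").convert("L"))  # 0 = far, 255 = near

# Pre-blur the photo at a few strengths, then pick per pixel by depth band.
levels = [photo.filter(ImageFilter.GaussianBlur(r)) for r in (12, 6, 2)] + [photo]
stack = np.stack([np.asarray(im) for im in levels])  # shape (4, H, W, 3)
band = depth // 64                                   # four depth bands, 0-3
h, w = depth.shape
out = stack[band, np.arange(h)[:, None], np.arange(w)[None, :]]
Image.fromarray(out.astype(np.uint8)).save("refocused.png")
```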
edited September 2016