Apple's first iOS 10.1 beta enables iPhone 7 Plus bokeh 'Portrait' mode


Comments

  • Reply 61 of 63
    MacPro Posts: 19,873, member
    foggyhill said:
    matrix077 said:
    shogun said:
    matrix077 said:
    Woah... how can they mask the subject in real time?
    I presume the two cameras create a depth map and then software applies blur according to a distance algorithm. Remember years ago when Jony explained that the blur in iOS and macOS was non-obvious and actually took a lot of effort to get right? They may be using the same blur technique here.
    Using Photoshop, masking the subject and the tree behind her would take me an hour at least, and that's a rough mask. We're talking about hair and a tree trunk here.
    The blur is the easiest part. 
    You could do that picture of the woman in Photoshop in under a minute. If you look closely, the hair is blurred at the edges too and the tree isn't masked at all, so it's not a very accurate mask compared with what some of the add-on plug-ins for Photoshop can achieve. Not to mention that in Photoshop, once the subject is masked, you could use a tilt or field blur to make the DOF look more realistic. That said, I'm sure it's quite good enough for most folks and a pleasing effect for some images.

    This is done in real time, which is quite new. Being able to play with the effect as you shoot (like you could with a DSLR) is something people haven't really had before.
    No argument.  I was simply replying to the poster claiming it would take "an hour at least" to do manually in Photoshop.
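    For what it's worth, the idea sketched in the quoted exchange (two cameras yield a depth map, then software blurs each pixel according to its distance from the focus plane) can be illustrated in a few lines of Swift. This is only a toy sketch with made-up numbers for the focal length, baseline and blur strength, not Apple's actual pipeline:

    ```swift
    import Foundation

    /// Depth from disparity for a calibrated stereo pair:
    /// depth = focalLength * baseline / disparity.
    func depth(fromDisparity disparity: Double,
               focalLengthPixels: Double,
               baselineMeters: Double) -> Double {
        guard disparity > 0 else { return .infinity }   // zero disparity = very far away
        return focalLengthPixels * baselineMeters / disparity
    }

    /// Blur radius grows with distance from the chosen focus plane,
    /// clamped to a maximum so the effect stays bounded.
    func blurRadius(forDepth depth: Double,
                    focusDepth: Double,
                    strength: Double = 4.0,
                    maxRadius: Double = 12.0) -> Double {
        let radius = abs(depth - focusDepth) * strength
        return min(radius, maxRadius)
    }

    // Example: a subject about 1.2 m away stays sharp, the background gets blurred.
    let focalLengthPixels = 2800.0   // assumed, in pixel units
    let baselineMeters = 0.01        // assumed roughly 1 cm between the two lenses
    let subjectDepth = depth(fromDisparity: 23.3,
                             focalLengthPixels: focalLengthPixels,
                             baselineMeters: baselineMeters)
    let backgroundDepth = depth(fromDisparity: 5.0,
                                focalLengthPixels: focalLengthPixels,
                                baselineMeters: baselineMeters)
    print(blurRadius(forDepth: subjectDepth, focusDepth: subjectDepth))      // about 0 (sharp)
    print(blurRadius(forDepth: backgroundDepth, focusDepth: subjectDepth))   // larger (blurred)
    ```

    Whatever Apple actually does under the hood, the gist is the same: the farther a pixel sits from the focus plane, the larger the blur it receives.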
  • Reply 62 of 63
    Marvin Posts: 15,555, moderator
    polymnia said:
    True Bokeh isn't simply a blur.  

    From the Cambridge in Colour website:

    "Note that depth of field only sets a maximum value for the circle of confusion, and does not describe what happens to regions once they become out of focus. These regions are also called "bokeh," from Japanese (pronounced bo-ké). Two images with identical depth of field may have significantly different bokeh, as this depends on the shape of the lens diaphragm. In reality, the circle of confusion is usually not actually a circle, but is only approximated as such when it is very small. When it becomes large, most lenses will render it as a polygonal shape with 5-8 sides."

    In this picture (shown at small scale), which I took with a 100-400 mm Canon L lens, note the bokeh shapes in the background.

    This image is a great example of what Apple should be aiming for.

    It looks like Apple is using a simple Gaussian Blur effect in this v1 feature. This can be rendered pretty much instantly in software.

    Photoshop has a sophisticated Lens Blur filter that is so processor-intensive it still can't be applied to a Smart Object, but it produces a fairly realistic synthetic bokeh. It even allows an alpha channel to be used as a depth map.
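    As a rough illustration of why those two blurs differ so much in cost and look, here is a sketch under my own assumptions (it is not Photoshop's or Apple's actual filter). A Gaussian kernel is separable, so it can run as two cheap 1-D passes, which is why it can be applied almost instantly; an aperture-shaped kernel is not separable, so a naive convolution costs radius squared per pixel, but it is what gives out-of-focus highlights their polygonal bokeh shape:

    ```swift
    import Foundation

    /// 1-D Gaussian weights, normalized to sum to 1. Applying this along rows and
    /// then along columns is equivalent to a full 2-D Gaussian blur (separable, cheap).
    func gaussianKernel1D(radius: Int, sigma: Double) -> [Double] {
        let weights = (-radius...radius).map { x in
            exp(-Double(x * x) / (2 * sigma * sigma))
        }
        let sum = weights.reduce(0, +)
        return weights.map { $0 / sum }
    }

    /// 2-D kernel shaped like a regular polygon (6 sides resembles a 6-blade iris).
    /// Every sample inside the polygon gets equal weight, like a real aperture.
    func apertureKernel2D(radius: Int, sides: Int = 6) -> [[Double]] {
        let r = Double(radius)
        // A point is inside a convex polygon if it lies on the inner side of every edge.
        func inside(_ x: Double, _ y: Double) -> Bool {
            for i in 0..<sides {
                let a0 = 2 * Double.pi * Double(i) / Double(sides)
                let a1 = 2 * Double.pi * Double(i + 1) / Double(sides)
                let (x0, y0) = (r * cos(a0), r * sin(a0))
                let (x1, y1) = (r * cos(a1), r * sin(a1))
                // Cross product tells which side of the edge the point is on.
                if (x1 - x0) * (y - y0) - (y1 - y0) * (x - x0) < 0 { return false }
            }
            return true
        }
        var kernel = [[Double]](repeating: [Double](repeating: 0, count: 2 * radius + 1),
                                count: 2 * radius + 1)
        var total = 0.0
        for y in -radius...radius {
            for x in -radius...radius where inside(Double(x), Double(y)) {
                kernel[y + radius][x + radius] = 1
                total += 1
            }
        }
        return kernel.map { row in row.map { $0 / total } }   // normalize to sum to 1
    }
    ```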

    As another poster pointed out, manually drawing a mask takes a skilled artist a lot of time. I do lots of photo editing, so I can back up what that poster says.

    The real breakthrough here is being able to automatically create the depth map AT ALL. That is the trick upon which all the rest of this feature is built.

    Obvious improvements to this v1 feature:
    • Capturing a more detailed depth map (like being able to resolve the frayed rope ends in the parrot image) would be a huge improvement.
    • Finding the GHz to power a more realistic, Photoshop-level Lens Blur is an obvious improvement that seems inevitable as the A-series chips get so much better every year. I don't want to minimize the complexity (and probably the engineering around Adobe patents) involved in replicating Photoshop's Lens Blur on an iPhone, but Apple seems to find ways to implement things.
    These obvious improvements give Apple a clear roadmap to follow as they develop this feature.

    What I would LOVE:
    Let the iPhone spit out its depth map channel as an 8-bit grayscale image so I can process blur (or some other effect) myself in post. I'm assuming this will be something third-party camera apps will make available at some point. There have to be lots of creative ways a quick depth map could be used in image editing.
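    To make that concrete, exporting the depth channel could be as simple as normalizing whatever floating-point depth buffer the camera produces down to the 0-255 range. This is a hypothetical helper to show the idea, not an actual iOS API:

    ```swift
    import Foundation

    /// Normalize a depth buffer to 8-bit: nearest point becomes 255 (white),
    /// farthest becomes 0 (black), ready to use as a grayscale mask in an editor.
    func depthMapTo8Bit(_ depths: [Float]) -> [UInt8] {
        guard let minDepth = depths.min(), let maxDepth = depths.max(), maxDepth > minDepth else {
            return depths.map { _ in 128 }   // flat scene: return mid-gray
        }
        return depths.map { d in
            let normalized = (maxDepth - d) / (maxDepth - minDepth)   // invert so near = bright
            return UInt8((normalized * 255).rounded())
        }
    }

    // Example: four samples from 0.8 m (subject) out to 6 m (background).
    let sample: [Float] = [0.8, 1.2, 3.5, 6.0]
    print(depthMapTo8Bit(sample))   // [255, 235, 123, 0]
    ```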
    Yes, the Photoshop bokeh effect looks good. Without a depth map, though, it has some unwanted effects:

    http://www.shutterstock.com/blog/how-to-blur-background-photoshop-bokeh-effect

    [Image from the linked article: out-of-focus bokeh lights spilling over the subject's arms]



    The filtered lights there overlap the arms because there is no depth info; just masking that out would make it look fine. The TechCrunch article says the iPhone captures 9 levels of depth, so it's not much to work with, but as long as it can tell foreground from background it can improve those kinds of images significantly and without manual work. Being able to save the depth map would be very useful, especially if it can work for video too, where masking is much more time-consuming.
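    Even a coarse depth map is enough for that kind of masking. Here is a sketch of the idea; the 9-level assumption and the subject threshold are mine for illustration, not anything Apple has documented:

    ```swift
    import Foundation

    /// Split a coarsely quantized depth map (0 = nearest ... 8 = farthest) into a
    /// foreground mask: true where the pixel belongs to the subject.
    func foregroundMask(depthLevels: [UInt8], subjectMaxLevel: UInt8 = 2) -> [Bool] {
        depthLevels.map { $0 <= subjectMaxLevel }
    }

    /// Composite an effect only over the background: where the mask says "subject",
    /// keep the original pixel; elsewhere, use the processed (blurred/overlaid) pixel.
    func applyEffectToBackground(original: [UInt8], processed: [UInt8], mask: [Bool]) -> [UInt8] {
        zip(zip(original, processed), mask).map { pixels, isSubject in
            isSubject ? pixels.0 : pixels.1
        }
    }

    // Example on a 1x5 strip of grayscale pixels: the middle pixel is the subject
    // (depth level 1), everything else is background and receives the effect.
    let original: [UInt8]    = [200, 180, 90, 170, 210]
    let processed: [UInt8]   = [140, 140, 140, 140, 140]   // e.g. a blurred version
    let depthLevels: [UInt8] = [8, 7, 1, 6, 8]
    let mask = foregroundMask(depthLevels: depthLevels)
    print(applyEffectToBackground(original: original, processed: processed, mask: mask))
    // [140, 140, 90, 140, 140]: the subject pixel stays sharp
    ```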

    On a smartphone, features like this are there to improve image quality in an effortless way. They don't have to be perfect, but the ability to adjust the image with the depth map would allow more control where needed. Having depth-of-field blur really makes the subject stand out:

    [Attached example image: cans photographed with shallow depth of field]