Apple's first iOS 10.1 beta enables iPhone 7 Plus bokeh 'Portrait' mode


Comments

  • Reply 41 of 63
    matrix077 said:
    shogun said:
    matrix077 said:
    Woah... how can they mask the subject in real time?
    I presume the two cameras create a depth map and then software applies blur according to a distance algorithm. Remember years ago when Jony explained that the blur in iOS and macOS was non-obvious and actually took a lot of effort to get right? They may be using the same blur technique here.
    Using Photoshop, masking the subject and tree behind her will take me an hour at least, and that's a rough mask. We're talking about hair and trunk here.
    The blur is the easiest part. 
    You could do that picture of the woman in Photoshop in <1 minute. If you look closely, the hair is blurred at the edges too and the tree isn't masked at all; it's not as accurate a mask as some of the add-on plug-ins for Photoshop can achieve. Not to mention that in Photoshop, once masked, you could use a tilt or field blur to make the DOF look more realistic. That said, I'm sure it's quite good enough for most folks and a pleasing effect for some images.

    This is done in real time, which is something quite new. Being able to play with it as you shoot (like you could with a DSLR) is something people haven't really had before.
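    A crude sketch of that idea (not Apple's actual pipeline; OpenCV block matching with placeholder filenames and parameters, assuming the two frames have already been cropped and aligned to the same field of view):

    ```python
    # Estimate a depth map from a stereo pair: disparity between the two views
    # is inversely proportional to distance, which is the principle behind
    # using two cameras to build a depth map.
    import cv2
    import numpy as np

    left = cv2.imread("wide.jpg", cv2.IMREAD_GRAYSCALE)    # one camera's frame
    right = cv2.imread("tele.jpg", cv2.IMREAD_GRAYSCALE)   # the other, pre-aligned

    stereo = cv2.StereoBM_create(numDisparities=64, blockSize=15)
    disparity = stereo.compute(left, right).astype(np.float32) / 16.0  # fixed-point output

    # Normalize to 0..1 so it can later drive a per-pixel blur strength.
    depth_map = cv2.normalize(disparity, None, 0.0, 1.0, cv2.NORM_MINMAX)
    ```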
  • Reply 42 of 63

    matrix077 said:
    shogun said:
    matrix077 said:
    Woah... how can they mask the subject in real time?
    I presume the two cameras create a depth map and then software applies blur according to a distance algorithm. Remember years ago when Jony explained that the blur in iOS and macOS was non-obvious and actually took a lot of effort to get right? They may be using the same blur technique here.
    Using Photoshop, masking the subject and tree behind her will take me an hour at least, and that's a rough mask. We're talking about hair and trunk here.
    The blur is the easiest part. 
    You could do that picture of the woman in Photoshop in <1 minute. If you look closely, the hair is blurred at the edges too and the tree isn't masked at all; it's not as accurate a mask as some of the add-on plug-ins for Photoshop can achieve. Not to mention that in Photoshop, once masked, you could use a tilt or field blur to make the DOF look more realistic. That said, I'm sure it's quite good enough for most folks and a pleasing effect for some images.
    The edge is still a bit rough, yes, but only if you look very closely; it's not very obvious. The two cameras capture the depth, so the capability is there; the software just needs more tuning to differentiate the contrast of depth for tiny details. As for Photoshop, I can assure you that masking something like strands of hair is not a <1-minute job, and it's done manually. This is for the average person who doesn't even bother with Photoshop and wants everything done instantly but still at good quality.
  • Reply 43 of 63
    MacPro Posts: 19,727 member
    True Bokeh isn't simply a blur.  



    I know, right. I can't see how it's possible for them to ever get accurate artificial bokeh, because of the edge problem of masking. If the edge is not sharp on the part that is meant to be sharp, then it just looks like an amateur Photoshop job. This is an effect that *is* worth pursuing, but they should not release it until it's fully formed. They have to engineer fake shutter blade effects as well as some other work. Having said that, even done badly it's an improvement, but it should not be called bokeh; it should just be called something like 'portrait enhancement' mode.
    AI used the term bokeh in this article incorrectly; I'm not sure Apple uses it, but I could be wrong, I'll have to go and check. The irony is that bokeh is a side effect of the mechanics and deficiencies of analog hardware, much like film grain, yet we covet it. Kind of funny, really.
  • Reply 44 of 63
    MacPro Posts: 19,727 member

    kevin kee said:
    Rayz2016 said:
    roake said:
    I wonder if we will see features like post-photo refocusing, 3D photos (perhaps with parallax effect), etc.
    You will not see that. In order to do that, you need to have a bit more than just two cameras.
    Doesn't the Lytro camera do post-photo refocusing?
    Yep.  I was hoping Apple were going that route, maybe they will in the future.
    I thought Lytro used a special hexagonal lens? I don't think it's only the software.
    Right but I recall Apple patented a Lytro-like refocusable camera suitable for iPhone so I was hoping.... :)
  • Reply 45 of 63

    kevin kee said:
    Rayz2016 said:
    roake said:
    I wonder if we will see features like post-photo refocusing, 3D photos (perhaps with parallax effect), etc.
    You will not see that. In order to do that, you need to have a bit more than just two cameras.
    Doesn't the Lytro camera do post-photo refocusing?
    Yep.  I was hoping Apple were going that route, maybe they will in the future.
    I thought Lytro used a special hexagonal lens? I don't think it's only the software.
    Right but I recall Apple patented a Lytro-like refocusable camera suitable for iPhone so I was hoping.... :)
    You mean LinX. LinX Imaging uses an array concept similar to Lytro's, but the similarities end there. It's funny, because the multiple-lens technique is old, yet it's only now being implemented in smartphones. I guess advances in chip power help make that happen.
  • Reply 46 of 63
    MacPro Posts: 19,727 member
    meteora said:
    Wow, I actually don't like that very much. It looks like someone heavily blurred it out in post-processing.
     ... and not very well at that, but it's just in beta so let's wait ... 
  • Reply 47 of 63
    frac Posts: 480 member
    True Bokeh isn't simply a blur.  

    Aaaaand this is why art people are so annoying..
    Aaaaand this is why people prone to gross generalizations are so annoying..
  • Reply 48 of 63
    I just noticed something significant that people have missed. In the image of the woman with a baby in the article, if you check the background, the degree of blurriness varies depending on the distance. For example, the tree is less blurry while the furthest scenery is blurred the most. I think this is a very smart implementation compared to the normal "blur in the background" effect, or even Photoshop post-processing, which just blurs everything to the same degree regardless of distance. Also, another thing: in the image of the two puppies, notice that the iPhone recognises the objects (the puppies) at mid-distance and blurs the foreground slightly as well as the background.
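    To put a number on that observation: in a real lens, the blur circle grows with distance from the focal plane on both sides, near and far. A toy thin-lens calculation (illustrative figures only, not necessarily the iPhone's actual optics):

    ```python
    # Circle-of-confusion diameter for a point at distance d when the lens is
    # focused at distance s. Both foreground (d < s) and background (d > s)
    # points blur, and the blur grows the further they sit from the focal plane.
    def coc_diameter_mm(d, s, focal_mm=6.6, f_number=2.8):
        """All distances in millimetres; returns blur-circle diameter on the sensor."""
        aperture = focal_mm / f_number                     # physical aperture diameter
        return aperture * abs(d - s) / d * focal_mm / (s - focal_mm)

    # Focused at 1 m: a foreground point at 0.5 m blurs more than a background
    # point at 2 m, and far scenery approaches the maximum blur.
    for d in (500.0, 2000.0, 10000.0):
        print(d, round(coc_diameter_mm(d, 1000.0), 4))
    ```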
  • Reply 49 of 63
    frac Posts: 480 member
    meteora said:
    Wow, I actually don't like that very much. It looks like someone heavily blurred it out in post-processing.
     ... and not very well at that, but it's just in beta so let's wait ... 
    I agree though to be honest - web quality size aside, the subject focus is off and nothing in the foreground is sharp enough to pop out of the fairly bland background blurring effect. It's almost like two layered focus planes with both exhibiting some defocussing to a degree. 
    I'll have to try and find some higher quality examples.
    Despite all that, it's encouraging.
  • Reply 50 of 63
    polymnia Posts: 1,080 member
    kevin kee said:
    I just noticed something significant that people have missed. In the image of the woman with a baby in the article, if you check the background, the degree of blurriness varies depending on the distance. For example, the tree is less blurry while the furthest scenery is blurred the most. I think this is a very smart implementation compared to the normal "blur in the background" effect, or even Photoshop post-processing, which just blurs everything to the same degree regardless of distance. Also, another thing: in the image of the two puppies, notice that the iPhone recognises the objects (the puppies) at mid-distance and blurs the foreground slightly as well as the background.
    I don't think anyone missed that. The whole point of the depth map is to apply blur differently according to distance as you have observed.

    Normal "blur in the background" operates in the same way. You may not often notice in pro photography, as photos are often composed with fully separated foreground and background so you cannot follow a shape from tack sharp to fully bokeh'd out. You might see a tack-sharp subject transitioning to super bokeh in artistic photography quite often, though.

    and, though it is a difficult edit to execute, Photoshop is quite capable of doing the same thing as this iPhone feature. And to a significantly greater level of polish. Look up the Lens Blur filter on YouTube. Lens Blur can use a Depth Channel and produces a much more convincing Bokeh, you can simulate different numbers of aperture blades, etc to really give the impression of a natural lens.

    the trick is making a depth map. That's where you may spend hours! Though it's surprising how forgiving the eye is if you make a tight mask and apply a gradual gradient from near to far background. 


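    A rough sketch of what "apply blur differently according to distance" can look like in code: quantize the depth map into slices and composite progressively stronger Gaussian blurs (a toy approximation, not Apple's or Adobe's actual implementation):

    ```python
    import cv2
    import numpy as np

    def depth_graded_blur(image, depth_map, focus_depth=0.1, slices=6, max_radius=25):
        """image: HxWx3 uint8; depth_map: HxW float in 0..1 (0 = near)."""
        result = image.astype(np.float32)
        # How far each pixel sits from the chosen focal plane, normalized to 0..1.
        defocus = np.abs(depth_map - focus_depth)
        defocus /= max(float(defocus.max()), 1e-6)
        for i in range(1, slices + 1):
            ksize = int(max_radius * i / slices) | 1        # odd kernel size, grows per slice
            blurred = cv2.GaussianBlur(image, (ksize, ksize), 0).astype(np.float32)
            # Pixels whose defocus falls into this slice take this blur level.
            mask = ((defocus > (i - 1) / slices) & (defocus <= i / slices)).astype(np.float32)
            result = result * (1 - mask[..., None]) + blurred * mask[..., None]
        return result.astype(np.uint8)
    ```

    Pixels exactly at the focal depth never match a slice and stay sharp; everything else picks up blur graded by its distance from that plane, which is the same falloff described above.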
  • Reply 51 of 63
    polymnia Posts: 1,080 member
    volcan said:
    polymnia said:
     I don't want to minimize the complexity (and probably engineering around Adobe patents) that would be involved in replicating the Lens Blur from Photoshop into an iPhone, but Apple seems to find ways to implement things.
    As you probably know there is a huge debate about software patents. In this case the software is attempting to replicate a natural phenomenon - the visual effect that is created by a lens. That would be like patenting the laws of physics. Copyright is a different matter as that would involve copying code or UI. There are so many ways to skin a cat that it would be unlikely to present a problem. Apple would probably write it in Swift and I'm pretty sure Photoshop is all C++.

    Adobe clearly doesn't own the idea of simulating a real Lens Blur.

    what they own is the specific code implementation they have developed and market as part of Photoshop.

    I really doubt Apple would copy Adobe's tech. They would either seek a license or develop their own techniques.

    the Lens Blur filter is one of the newer Blur filters in Photoshop, while the Gaussian Blur that Apple appears to be using here has been around since Photoshop version 2 (at least, that's as far back as I know). If it took Adobe 20+ years to craft that tech, I'd assume it's not an insignificant task. And even Adobe's implementation is not real-time. It can't even be attached to a Smart Object, one of the few filters that cannot be attached as a live effect that way. Even the famously processor-intensive Liquify filter can be attached to a Smart Object.

    making this technique work in real time will be quite a challenge.

    I was intrigued by the specific mention the ISP chip got in the iPhone event. Apple rarely mentions specific pieces of silicon unless they have decided to make that component a priority (A-series processors, M-series motion processors, W-series wireless audio chips). And if the ISP chip becomes a priority, perhaps they can build it specifically to run a photorealistic lens simulation blur.

    if Apple follows this strategy of optimizing a chip to execute the task, I doubt copying someone else's code would appeal to them. They would write software they know their chip designers can accelerate. It's exciting to think of what they may be able to do in a few years.
  • Reply 52 of 63
    polymnia said:
    kevin kee said:
    I just noticed something significant that people have missed. In the image of the woman with a baby in the article, if you check the background, the degree of blurriness varies depending on the distance. For example, the tree is less blurry while the furthest scenery is blurred the most. I think this is a very smart implementation compared to the normal "blur in the background" effect, or even Photoshop post-processing, which just blurs everything to the same degree regardless of distance. Also, another thing: in the image of the two puppies, notice that the iPhone recognises the objects (the puppies) at mid-distance and blurs the foreground slightly as well as the background.
    I don't think anyone missed that. The whole point of the depth map is to apply blur differently according to distance as you have observed.

    Normal "blur in the background" operates in the same way. You may not often notice in pro photography, as photos are often composed with fully separated foreground and background so you cannot follow a shape from tack sharp to fully bokeh'd out. You might see a tack-sharp subject transitioning to super bokeh in artistic photography quite often, though.

    and, though it is a difficult edit to execute, Photoshop is quite capable of doing the same thing as this iPhone feature. And to a significantly greater level of polish. Look up the Lens Blur filter on YouTube. Lens Blur can use a Depth Channel and produces a much more convincing Bokeh, you can simulate different numbers of aperture blades, etc to really give the impression of a natural lens.

    the trick is making a depth map. That's where you may spend hours! Though it's surprising how forgiving the eye is if you make a tight mask and apply a gradual gradient from near to far background. 


    This is very interesting. Do you know how Adobe detects the distance of the objects? Or does it depend on the user to determine every single object and apply the effect individually and manually? Just curious. Perhaps some cameras can offer this information in RAW data, which software like Photoshop could then utilize, but I doubt it.
  • Reply 53 of 63
    polymnia Posts: 1,080 member
    kevin kee said:
    polymnia said:
    kevin kee said:
    I just noticed something significant that people have missed. In the image of the woman with a baby in the article, if you check the background, the degree of blurriness varies depending on the distance. For example, the tree is less blurry while the furthest scenery is blurred the most. I think this is a very smart implementation compared to the normal "blur in the background" effect, or even Photoshop post-processing, which just blurs everything to the same degree regardless of distance. Also, another thing: in the image of the two puppies, notice that the iPhone recognises the objects (the puppies) at mid-distance and blurs the foreground slightly as well as the background.
    I don't think anyone missed that. The whole point of the depth map is to apply blur differently according to distance as you have observed.

    Normal "blur in the background" operates in the same way. You may not often notice in pro photography, as photos are often composed with fully separated foreground and background so you cannot follow a shape from tack sharp to fully bokeh'd out. You might see a tack-sharp subject transitioning to super bokeh in artistic photography quite often, though.

    and, though it is a difficult edit to execute, Photoshop is quite capable of doing the same thing as this iPhone feature. And to a significantly greater level of polish. Look up the Lens Blur filter on YouTube. Lens Blur can use a Depth Channel and produces a much more convincing Bokeh, you can simulate different numbers of aperture blades, etc to really give the impression of a natural lens.

    the trick is making a depth map. That's where you may spend hours! Though it's surprising how forgiving the eye is if you make a tight mask and apply a gradual gradient from near to far background. 


    This is very interesting. Do you know how Adobe detects the distance of the objects? Or does it depend on the user to determine every single object and apply the effect individually and manually? Just curious. Perhaps some cameras can offer this information in RAW data, which software like Photoshop could then utilize, but I doubt it.

    In the Adobe filter, distance is manually applied, but it is all applied at once as you invoke the Lens Blur filter.

    Adobe's Lens Blur lets you specify a Depth Channel or Depth Mask, I forget the exact term. Basically you make it using similar techniques as you'd use to create a conventional mask. The Depth Channel looks much like the grayscale depth map illustrations you've seen in articles about how the iPhone's feature works, except instead of being auto-generated, they are drawn by hand in post-processing. It's a grayscale image, and on one end of the tonal range (I can't remember whether it's black or white) the blur is not applied at all, while on the other end it is applied heavily. Specular highlights are blown into the 'circles of confusion' you will read about when studying bokeh, and this is based on the Depth Map as well. Basically all the individual lens-simulating features of the filter use the Depth Map in different ways to determine how they are rendered in different areas of the image.

    It's really an impressive effect, though as I mentioned before, it requires a lot of prep work and skill to execute well.
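    For what it's worth, the difference between a plain Gaussian blur and those "circles of confusion" can be approximated by convolving with an aperture-shaped kernel instead of a Gaussian, so bright highlights bloom into polygons. A toy illustration (fixed radius, six blades, not Adobe's or Apple's method):

    ```python
    import cv2
    import numpy as np

    def aperture_kernel(radius=15, blades=6):
        """Filled regular polygon approximating an aperture opening."""
        size = 2 * radius + 1
        kernel = np.zeros((size, size), np.float32)
        angles = np.linspace(0, 2 * np.pi, blades, endpoint=False)
        pts = np.stack([radius + radius * np.cos(angles),
                        radius + radius * np.sin(angles)], axis=1).astype(np.int32)
        cv2.fillConvexPoly(kernel, pts, 1.0)
        return kernel / kernel.sum()                        # normalize so brightness is preserved

    def bokeh_blur(image, radius=15, blades=6):
        k = aperture_kernel(radius, blades)
        # Convolving with the aperture shape makes specular highlights spread
        # into hexagons rather than the soft blobs a Gaussian produces.
        return cv2.filter2D(image.astype(np.float32), -1, k).astype(np.uint8)
    ```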
  • Reply 54 of 63
    kevin kee said:
    I just noticed something significant that people have missed. In the image of the woman with a baby in the article, if you check the background, the degree of blurriness varies depending on the distance. For example, the tree is less blurry while the furthest scenery is blurred the most.
    I didn't miss it. Check post #3. (Yes, sometimes I'd like to boast. :smiley:)
  • Reply 55 of 63
    kevin kee said:
    I just noticed something significant that people have missed. In the image of the woman with a baby in the article, if you check the background, the degree of blurriness varies depending on the distance. For example, the tree is less blurry while the furthest scenery is blurred the most.
    I didn't miss it. Check post #3. (Yes, sometimes I'd like to boast. :smiley:)

    This is so good. Can't wait for v.3 in a couple of years. 
  • Reply 56 of 63
    Here's a photo I took in Chicago that has bokeh. It was a 125mm shot taken pretty close. It's a lovely effect when done right.
    Took it at the main museum near the waterfront.



    It was taken with an SLR (no D) in 2004.
  • Reply 57 of 63
    They should have just used the word "blur" rather than "bokeh". For those that care, even though the Japanese origin has something to do with blur, there's more meaning to it. It's pretty interesting if you're really into photography.
  • Reply 58 of 63
    bigmike said:
    They should have just used the word "blur" rather than "bokeh". For those that care, even though the Japanese origin has something to do with blur, there's more meaning to it. It's pretty interesting if you're really into photography.
    I think the word 'bokeh' was only used by the media. Apple itself never used the term and simply called it ‘Portrait Mode’. Correct me if I’m wrong.
  • Reply 59 of 63
    volcan said:
    polymnia said:
     I don't want to minimize the complexity (and probably engineering around Adobe patents) that would be involved in replicating the Lens Blur from Photoshop into an iPhone, but Apple seems to find ways to implement things.
    As you probably know there is a huge debate about software patents. In this case the software is attempting to replicate a natural phenomenon - the visual effect that is created by a lens. That would be like patenting the laws of physics. Copyright is a different matter as that would involve copying code or UI. There are so many ways to skin a cat that it would be unlikely to present a problem. Apple would probably write it in Swift and I'm pretty sure Photoshop is all C++.
    Probably explains why Apple is apparently using some type of custom blur. Matthew Panzarino, the author of the linked article, spoke of confirming this with Apple in a series of tweets.

    ETA: he also commented on how warm the A10 got, so real-time processing was hitting the processor hard.
  • Reply 60 of 63
    kevin kee said:
    I just noticed something significant that people have missed. In the image of the woman with a baby in the article, if you check the background, the degree of blurriness varies depending on the distance. For example, the tree is less blurry while the furthest scenery is blurred the most. I think this is a very smart implementation compared to the normal "blur in the background" effect, or even Photoshop post-processing, which just blurs everything to the same degree regardless of distance. Also, another thing: in the image of the two puppies, notice that the iPhone recognises the objects (the puppies) at mid-distance and blurs the foreground slightly as well as the background.
    You really should read the linked article. It does a good job of explaining the "slices" and how they allow selective processing of the background. 