Apple's first iOS 10.1 beta enables iPhone 7 Plus bokeh 'Portrait' mode

24 Comments

  • Reply 21 of 63
    zroger73 said:
    True Bokeh isn't simply a blur.  

    Aaaaand this is why art people are so annoying..
    Well, 'tis true that there's more to bokeh than averaging pixels.

    Yes, we are all annoying in our own way. I would say the new Portrait mode simulates depth of field but not necessarily bokeh, as the author of this article claims. Bokeh is a specific type of background blur, usually involving circles (like the ones in digitalclips' parrot photo), which can be manipulated using depth of field and different types of lenses, but they are not the same thing. Anyway, I think Portrait mode is a great new feature that I've been looking forward to. Yes, it's not perfect and is prone to glitches, but I'd still rather have the feature on a point-and-shoot camera than not. Here is an article that does a pretty good job of explaining both depth of field and bokeh for those not completely familiar with the terms and how they differ.

    http://digital-photography-school.com/understanding-depth-field-beginners/
    edited September 2016 nolamacguy
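Since the thread keeps circling back to why a phone lens can't produce much optical background blur on its own, here is a rough thin-lens sketch of the circle-of-confusion math. The focal length and f-number figures below are approximate, and `coc_diameter_mm` is a hypothetical helper written for illustration, not any real API:

```python
def coc_diameter_mm(f_mm, n_stop, focus_mm, subject_mm):
    """Approximate circle-of-confusion diameter (mm) for a point at
    subject_mm when a lens of focal length f_mm at f/n_stop is
    focused at focus_mm. Thin-lens approximation:
        c = (f / N) * f * |s - s_f| / (s * (s_f - f))
    """
    aperture = f_mm / n_stop  # physical aperture diameter
    return aperture * f_mm * abs(subject_mm - focus_mm) / (
        subject_mm * (focus_mm - f_mm))

# iPhone 7 Plus telephoto module: roughly 6.6 mm focal length at f/2.8.
# Background 3 m behind a subject focused at 2 m:
phone = coc_diameter_mm(6.6, 2.8, 2000, 5000)

# A 50 mm f/1.8 lens on a full-frame body in the same situation:
slr = coc_diameter_mm(50.0, 1.8, 2000, 5000)
```

Under these assumptions the phone's blur circle comes out well under a hundredth of a millimetre while the 50 mm lens produces one nearly a hundred times larger, which is why the Portrait effect has to be synthesized in software at all.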
  • Reply 22 of 63
    mike1mike1 Posts: 3,275member
    The fact that there is even a discussion like this regarding a camera in A PHONE is amazing in itself.
    The fact that an average non-hobbyist is being exposed to this technique/feature and will be able to take better portraits where the subject pops is pretty cool.
    edited September 2016 matrix077
  • Reply 23 of 63
    Hope Apple will work on bringing the blur closer to real bokeh, and hope it is possible to do in software.
  • Reply 24 of 63
    MacProMacPro Posts: 19,718member
    matrix077 said:
    shogun said:
    The TechCrunch article actually explains the effect quite well: https://techcrunch.com/2016/09/21/hands-on-with-the-iphone-7-plus-crazy-new-portrait-mode/
    Good article; note it never used the term 'bokeh'. 
    Because there is no bokeh. Your parrot photo however has many. 
    My point exactly.  I wasn't criticizing Apple, rather AI for incorrectly using the term bokeh when they meant background blur.
    argonaut
  • Reply 25 of 63
    MacProMacPro Posts: 19,718member

    True Bokeh isn't simply a blur.  

    Aaaaand this is why art people are so annoying..
    I assume you don't like to learn.
    argonaut
  • Reply 26 of 63
    volcanvolcan Posts: 1,799member
    polymnia said:
     I don't want to minimize the complexity (and probably engineering around Adobe patents) that would be involved in replicating the Lens Blur from Photoshop into an iPhone, but Apple seems to find ways to implement things.
    As you probably know, there is a huge debate about software patents. In this case the software is attempting to replicate a natural phenomenon: the visual effect created by a lens. That would be like patenting the laws of physics. Copyright is a different matter, as that would involve copying code or UI. There are so many ways to skin a cat that it would be unlikely to present a problem. Apple would probably write it in Swift, and I'm pretty sure Photoshop is all C++.
    edited September 2016 nolamacguy
  • Reply 27 of 63
    roake said:
    I wonder if we will see features like post-photo refocusing, 3D photos (perhaps with parallax effect), etc.
    You will not see that. In order to do that, you need to have a bit more than just two cameras.
  • Reply 28 of 63
    I want it to be able to do "Instant Alpha" like Keynote can - make the background transparent so the foreground can be easily composited onto another background.
  • Reply 29 of 63
    volcan said:
    polymnia said:
     I don't want to minimize the complexity (and probably engineering around Adobe patents) that would be involved in replicating the Lens Blur from Photoshop into an iPhone, but Apple seems to find ways to implement things.
    As you probably know, there is a huge debate about software patents. In this case the software is attempting to replicate a natural phenomenon: the visual effect created by a lens. That would be like patenting the laws of physics. Copyright is a different matter, as that would involve copying code or UI. There are so many ways to skin a cat that it would be unlikely to present a problem. Apple would probably write it in Swift, and I'm pretty sure Photoshop is all C++.
    Bingo. If only the courts and the USPTO understood this simple concept. Code is speech, and speech is protected by copyright. 
  • Reply 30 of 63
    Rayz2016Rayz2016 Posts: 6,957member
    roake said:
    I wonder if we will see features like post-photo refocusing, 3D photos (perhaps with parallax effect), etc.
    You will not see that. In order to do that, you need to have a bit more than just two cameras.
    Doesn't the Lytro camera do post-photo refocusing?
  • Reply 31 of 63
    lepton said:
    I want it to be able to do "Instant Alpha" like Keynote can - make the background transparent so the foreground can be easily composited onto another background.
    Chroma keying, maybe?
  • Reply 32 of 63
    volcanvolcan Posts: 1,799member
    anton zuykov said:

    I want it to be able to do "Instant Alpha" like Keynote can - make the background transparent so the foreground can be easily composited onto another background.
    They could probably offer that feature as they have apparently mastered the mask part.
    anton zuykov said:

    Chroma keying, maybe?
    Chroma keying with a green screen is usually used to mask people in motion, and it does a pretty good job of knocking out the background for video, but the edges are a little ragged. In this case you want to keep the background, only altered. Anyone who has used Photoshop to create a mask, whether with channels, the pen tool, the magnetic lasso, or the magic wand, knows how difficult it is to cut around frizzy hair. The mask in the photo of the dogs is almost perfect, with only very slight blurring of the flyaway hairs of the dog on the left. As others have mentioned, they now need to work on more realistic circles of confusion in the background, as shown in the parrot image. 
    edited September 2016
  • Reply 33 of 63
    MacProMacPro Posts: 19,718member
    Rayz2016 said:
    roake said:
    I wonder if we will see features like post-photo refocusing, 3D photos (perhaps with parallax effect), etc.
    You will not see that. In order to do that, you need to have a bit more than just two cameras.
    Doesn't the Lytro camera do post-photo refocusing?
    Yep.  I was hoping Apple were going that route, maybe they will in the future.
  • Reply 34 of 63
    MacProMacPro Posts: 19,718member
    matrix077 said:
    shogun said:
    matrix077 said:
    Woah... how can they mask the subject in real time?
    I presume the two cameras create a depth map and then software applies blur according to a distance algorithm. Remember years ago when Jony explained that the blur in iOS and macOS was non-obvious and actually took a lot of effort to get right? They may be using the same blur technique here.
    Using Photoshop, masking the subject and the tree behind her would take me an hour at least, and that's a rough mask. We're talking about hair and a tree trunk here.
    The blur is the easiest part. 
    You could do that picture of the woman in Photoshop in under a minute. If you look closely, the hair is blurred at the edges too, and the tree isn't masked at all; it's not as accurate a mask as some of the add-on plug-ins for Photoshop can achieve. And in Photoshop, once masked, you could use a tilt or field blur to make the DOF look more realistic. That said, I'm sure it's quite good enough for most folks and a pleasing effect for some images.
    edited September 2016
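shogun's guess above (a depth map plus a distance-driven blur) can be sketched as a toy in a few lines of pure Python. This is a 1-D illustration of the general idea only, not a reflection of Apple's actual pipeline; `depth_blur` and its parameters are made up for the example:

```python
def depth_blur(pixels, depth, focus, strength=2.0):
    """Box-blur each pixel with a radius that grows with its
    distance from the focal plane (|depth - focus|)."""
    out = []
    for i, d in enumerate(depth):
        r = int(strength * abs(d - focus))  # blur radius from depth error
        lo, hi = max(0, i - r), min(len(pixels), i + r + 1)
        window = pixels[lo:hi]
        out.append(sum(window) / len(window))
    return out

pixels = [0, 0, 100, 100, 0, 0]
depth  = [1, 1, 1,   1,   3, 3]   # last two samples are far away
result = depth_blur(pixels, depth, focus=1)
# in-focus samples pass through untouched; distant ones get averaged
```

A real implementation would work on 2-D images with disc-shaped kernels (to get round highlight circles rather than a plain smear), but the depth-to-radius mapping is the core of the trick.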
  • Reply 35 of 63
    sog35 said:
    This is amazing stuff. Look at some of the pictures on the techcrunch article. wow.
    I want to see how it performs with (1) a person with a lot of loose hair strands in front of a background to be blurred out, and (2) something semi-transparent in front of a blurred background. Since there's no real way for the software to determine the true depth of what it's looking at, I think it will have trouble with both of these examples.

    EDIT: Yes, it does pretty well, all things considered. There are still problem areas that trip up the effect, and since there is no actual z-depth information being collected and it's relying on machine intelligence to determine what is in front and what is behind, it's obviously not going to be perfect. Here's the link to the article cited above:

    https://techcrunch.com/2016/09/21/hands-on-with-the-iphone-7-plus-crazy-new-portrait-mode/
    edited September 2016
  • Reply 36 of 63
    lepton said:
    I want it to be able to do "Instant Alpha" like Keynote can - make the background transparent so the foreground can be easily composited onto another background.
    Chroma keying, maybe?
    I think the commenter is suggesting Depth Keying.

    Another interesting idea for something that could be done with the depth map.

    Like I mentioned in an earlier comment, this Depth Map could be used in all kinds of creative ways.
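The depth keying being suggested here is, at its simplest, a threshold on the depth map: pixels nearer than a cutoff keep their value, everything else goes transparent. A hypothetical toy sketch (`None` stands in for a zero-alpha pixel; none of these names come from any real API):

```python
def depth_key(pixels, depth, cutoff):
    """Alpha-mask by depth: foreground (depth < cutoff) keeps its
    value, background becomes fully transparent."""
    return [p if d < cutoff else None for p, d in zip(pixels, depth)]

def composite(fg, bg):
    """Drop a keyed foreground onto a new background."""
    return [b if f is None else f for f, b in zip(fg, bg)]

keyed = depth_key([10, 20, 30], [1.0, 1.5, 4.0], cutoff=2.0)  # [10, 20, None]
scene = composite(keyed, [7, 7, 7])                            # [10, 20, 7]
```

Unlike chroma keying, this needs no special backdrop; the hard part in practice is the soft, per-pixel alpha around hair and other fine edges, which a hard threshold like this one can't capture.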
  • Reply 37 of 63
    Wow, I actually don't like that very much. It looks like someone heavily blurred it out in post-processing.
  • Reply 38 of 63
    True Bokeh isn't simply a blur.  
    I know, right? I can't see how it's possible for them to ever get accurate artificial bokeh, because of the edge problem of masking. If the edge isn't sharp on the part that is meant to be sharp, it just looks like an amateur Photoshop job. This is an effect that *is* worth pursuing, but they should not release it until it's fully formed. They would have to engineer fake shutter-blade effects as well, among other work. Having said that, even done badly it's an improvement, but it should not be called bokeh; it should just be called something like 'portrait enhancement' mode.
    argonaut
  • Reply 39 of 63
    Rayz2016 said:
    roake said:
    I wonder if we will see features like post-photo refocusing, 3D photos (perhaps with parallax effect), etc.
    You will not see that. In order to do that, you need to have a bit more than just two cameras.
    Doesn't the Lytro camera do post-photo refocusing?
    Yep.  I was hoping Apple were going that route, maybe they will in the future.
    I thought Lytro used a special hexagonal lens? I don't think it's only the software.
  • Reply 40 of 63
    lepton said:
    I want it to be able to do "Instant Alpha" like Keynote can - make the background transparent so the foreground can be easily composited onto another background.
    Chroma keying, maybe?
    Yeah, if they can do this, they can chroma-key the background and do compositing, even inserting extra fake light sources and having them light the scene realistically (since they have a depth map, you can introduce realistic lighting after the photo is taken).

    In fact, with a depth map you could do a lot of very cool effects, many of which would make current photo filters look sophomoric.

    Apple or third parties could even create "Scenes" with their own depth maps; you could then be mapped into the environment, a kind of very interesting augmented reality.
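The virtual-light idea above sketches out the same way: given a per-pixel depth, you can scale brightness by an inverse-square falloff from a light placed somewhere along the depth axis. A toy 1-D illustration under those assumptions, not a real relighting pipeline:

```python
def relight(pixels, depth, light_depth, intensity=4.0):
    """Scale each pixel by an inverse-square falloff from a virtual
    light placed at light_depth along the depth axis."""
    out = []
    for p, d in zip(pixels, depth):
        dist = abs(d - light_depth) + 1.0  # +1 avoids division by zero
        gain = intensity / (dist * dist)   # inverse-square falloff
        out.append(min(255, p * gain))     # clamp to 8-bit range
    return out

lit = relight([100, 100, 100], [1.0, 2.0, 5.0], light_depth=1.0)
# nearest pixel is brightened most, farthest is dimmed
```

A production version would also need surface normals (which can themselves be estimated from the depth map) to get directional shading, but distance falloff alone already gives a convincing "studio light" feel.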