Hands on: Apple's iPhone XS and XS Max are gorgeous, and a boon for photographers


Comments

  • Reply 181 of 187
    gatorguygatorguy Posts: 24,176member
    melgross said:
    gatorguy said:
    tmay said:

    Soli said:
    avon b7 said:
    Soli said:
    avon b7 said:
    Is there a difference between the camera mode you are describing and what Huawei phones have done for years?

    The reason I ask is that when they announced the feature I automatically assumed it would be different but what you describe is practically identical to how Huawei phones have operated since the P9. I thought it was already possible on Apple's dual lens phones.

    [video]

    Or is it not so much what it does as how it does it?
    It looks like it's doing the same type of adjustments, but I'm not sure why you mention it. For starters, when has Apple ever been first to anything? And yet with every new release you have to claim that some other company—which now seems to be Huawei, a company that makes Samsung look honest—did something first.

    Is Apple's algorithm really as bad as Huawei's? First of all, here's a screenshot of the video you posted where the presenter is showing how you can blur the foreground to make the background come into view, yet the background is still blurry as fuck. It seems like it's better for adjusting bokeh, but that's not what the video is doing, as shown by the coffee cup getting blurry.

    Finally, what's YOUR point? Why do you keep bringing up other, shittier products in conversations about Apple where nobody asked about them? You're the THIRD reply in this thread! Is your goal just to hijack the thread with your nonsense?

    [image]
    My point was in that third post. Asked, explained, asked again, explained again, and again, and again.

    Why don't YOU get it?

    You know, if I explain it again, you still won't get it. Of that I'm sure.
    Let me ask you a question: What's the point of a computational bokeh effect if it's going to blur the primary object you're trying to photograph?

    Here's an example I just took* since I don't think you're understanding the difference between bokeh...

    In this first photograph, Schiller shows the original photo with their bokeh effect in play:


    In this second photograph, Schiller shows how they've adjusted the bokeh effect so that the background stands out more WITHOUT altering the primary subject in the foreground:

    So, why in the world would someone want the subject of their photo to become as blurry as the background? These phones don't have the physical components to mimic Lytro! Bottom line: Huawei ain't got nothing but their typical snake oil to sell.


    * I took screenshots from the September event. I was being cheeky with my word usage.
    Funny thing; I was asking myself that. Being able to add a focus point doesn't really solve any problems, at least that I can see.

    Good point.
    It looks as though Apple's "bokeh effect" also introduces some softness in the subject, softening the overall image and not just the background. Look at the hair detail and eyes in the two images.
    Tough to tell exactly with those images, but the eyes look fine. The hair is supposed to be out of focus if it's set to the equivalent of f/1.4. The depth of field on full frame with a 50mm f/1.4 is about 1/2" at that distance. So, about right.
    The eyes should never go out of focus. And they do. I'm pretty surprised that, if you can't see it in the eyes (obvious to me), you also can't see the very noticeable blurring of the eyeLASHES. I'm quite particular about my images, and assume you are too.

    Did you mention you already have one of the new phones? Easy enough to do a bit of testing using your photographer's eye.
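    The half-inch figure quoted above can be sanity-checked with the standard thin-lens depth-of-field formulas. A quick sketch; the 0.03 mm circle of confusion and the ~0.6 m head-shot distance are my assumptions, not values from the thread:

    ```python
    # Depth of field via the standard hyperfocal-distance formulas.
    # All distances in mm; values below are illustrative assumptions.

    def depth_of_field(f_mm, n, s_mm, coc_mm=0.03):
        """Return (near, far, total) in-focus limits in mm.

        f_mm   -- focal length
        n      -- f-number
        s_mm   -- subject distance
        coc_mm -- circle of confusion (0.03 mm is a common full-frame value)
        """
        h = f_mm + f_mm ** 2 / (n * coc_mm)            # hyperfocal distance
        near = s_mm * (h - f_mm) / (h + s_mm - 2 * f_mm)
        far = s_mm * (h - f_mm) / (h - s_mm)           # valid while s < h
        return near, far, far - near

    # 50 mm f/1.4 on full frame, subject at ~0.6 m (tight head shot)
    near, far, total = depth_of_field(50, 1.4, 600)
    print(f"in-focus zone: {total:.1f} mm (~{total / 25.4:.2f} in)")
    ```

    With these assumptions the in-focus zone comes out around 11 mm, which is close to the half inch quoted.
    
    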
    edited September 2018
  • Reply 182 of 187
    SoliSoli Posts: 10,035member
    avon b7 said:
    I wasn't claiming Huawei had anything 'first'. Jesus, I thought all iPhones had the feature!
    My mistake. You didn't say Huawei was first in your initial post, only that they were doing it years before Apple, implying Apple is woefully behind Huawei despite the clear differences in the very example you posted.

    You are comparing the results of a three day old phone to the result of a phone released in April 2016! I'd say you weren't exactly very quick on the uptake!
    Now all of a sudden your "this did the same thing years ago and here's my comparison to prove it" has turned into "it's not the same thing because this iPhone is newer so you shouldn't compare them." JFC, Man, stick to a fucking claim. 🤦‍♂️
    watto_cobra
  • Reply 183 of 187
    avon b7avon b7 Posts: 7,622member
    Soli said:
    avon b7 said:
    I wasn't claiming Huawei had anything 'first'. Jesus, I thought all iPhones had the feature!
    My mistake. You didn't say Huawei was first in your initial post, only that they were doing it years before Apple, implying Apple is woefully behind Huawei despite the clear differences in the very example you posted.

    You are comparing the results of a three day old phone to the result of a phone released in April 2016! I'd say you weren't exactly very quick on the uptake!
    Now all of a sudden your "this did the same thing years ago and here's my comparison to prove it" has turned into "it's not the same thing because this iPhone is newer so you shouldn't compare them." JFC, Man, stick to a fucking claim. 🤦‍♂️
    I haven't changed my claim!

    You* directly compared the quality of what you posted to the video I linked to. You missed the point. I posted the video to illustrate the feature. Not the quality. It was simply the first link that came back. Obviously, almost three years later, things have got better. Comparing the results from a late 2018 phone to an early 2016 phone is pointless.

    I wasn't implying anything.

    *Apologies if it wasn't you. The Post Editor sometimes confuses who is being quoted as you draft a reply.




    edited September 2018 spheric
  • Reply 184 of 187
    tmaytmay Posts: 6,309member
    spheric said:
    avon b7 said:
    […]
    Can you see anything there that speaks about what you just posted - and in the context of what you said? Is what you just highlighted anywhere in the article?

    NO!

    My question was related to the article, not what Schiller said. You know that because I said - right here in this thread - that I didn't see the presentation.

    So, what is here - in the article?

    Aperture feature. Changing depth of field.

    What is in my question and link?

    Aperture mode. Changing depth of field.

    That is how I read it anyway.

    Maybe you are starting to understand now?

    I mentioned my 'surprise'. Why do you think that was?

    Well, did you check the date of the video I linked to? Early 2017. Do you know when the P9 was released? April 2016.

    No NPU. No Bionic.

    So, when I read this piece, I say to myself, what's new in that - the feature? Hadn't all dual lens iPhones had that feature from the start?

    I just took for granted they had.

    So, what did I do? I asked a question!

    Brilliant eh?

    Now let me ask you a question.

    Do you agree that it was possible to simply give an answer and clear things up in ONE post?

    Easy, right?

    Okay, so it turns out that apps have existed to exploit the depth of field of dual-camera iPhones for some time, such as Focos:
    https://itunes.apple.com/de/app/focos/id1274938524?mt=8

    What’s new here isn’t so much that the Xs can do this, it’s that Apple is embracing it and adding the effect to the viewfinder IN REAL TIME in a coming update. That’s where they’re going with this, and why they’re only adding it to the default camera on the Xs at this point. 
    Good points.

    I did find an iOS app for focus stacking, but it was essentially unusable due to the requirement to use the developer's cloud to compute the stack, and the reviews were poor. I have Zerene Stacker on my Mac for such things, but never got around to finding a suitable macro lens that I could afford. Irix has announced a 150mm macro that looks to be just what I want; no pricing yet.

    https://nikonrumors.com/2018/09/24/just-announced-irix-150mm-f-2-8-macro-11-lens.aspx/
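    Apps like Focos (and Apple's adjustable depth effect) rest on the same basic recipe: pair the image with a per-pixel depth map and blur each pixel in proportion to its distance from the chosen focal plane. A toy numpy sketch of that idea; the data is synthetic, and `refocus`, `box_blur`, and their parameters are invented for illustration, not any app's real API:

    ```python
    import numpy as np

    def box_blur(img, r):
        """Mean filter of radius r (square window, edge-padded)."""
        if r == 0:
            return img.astype(float)
        p = np.pad(img.astype(float), r, mode="edge")
        k = 2 * r + 1
        acc = np.zeros(img.shape, dtype=float)
        for dy in range(k):
            for dx in range(k):
                acc += p[dy:dy + img.shape[0], dx:dx + img.shape[1]]
        return acc / k ** 2

    def refocus(img, depth, focus, max_r=4):
        """Blur each pixel in proportion to |depth - focus| (depth in [0, 1])."""
        radii = np.clip(np.round(np.abs(depth - focus) * 2 * max_r),
                        0, max_r).astype(int)
        levels = [box_blur(img, r) for r in range(max_r + 1)]
        out = np.zeros(img.shape, dtype=float)
        for r in range(max_r + 1):
            out[radii == r] = levels[r][radii == r]
        return out

    # Toy scene: striped "background" everywhere, with a near subject in
    # the middle of the depth map. Focusing on the subject leaves its
    # pixels untouched while the stripes smear out.
    img = np.zeros((16, 16))
    img[:, ::2] = 1.0
    depth = np.ones((16, 16))
    depth[4:12, 4:12] = 0.0
    out = refocus(img, depth, focus=0.0)
    ```

    With the depth map focused on the near subject, the subject's pixels come back unchanged while the striped background is averaged toward gray; sliding `focus` toward 1.0 would instead sharpen the background and smear the subject, which is exactly the refocus control being argued about in this thread.
    
    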

  • Reply 185 of 187
    sphericspheric Posts: 2,544member
    melgross said:
    spheric said:
    Bokeh is not “artificial”. 

    It is a natural result of wide apertures having narrow focal depth-of-field. 

    The smaller the aperture, the deeper the in-focus range, and the less blurry the background. 

    Bokeh is as natural as focus or motion blur or lens flare. It can be minimised or provoked within what circumstances allow through photographic skill, but it is a natural physical phenomenon. 

    What these phones do is to FAKE the effect to be more in tune with what we feel to be natural, and to allow us to shift the viewers’ attention more towards the subject of the photo. 
    No, bokeh is artificial. It's an artifact of an artificial device designed and made by people. Motion blur and lens flare are both artificial. Anything that comes from a device we make is artificial, and fake. The point is that we don't see that way, and so anything we make that provides an image with those artifacts is, by definition, artificial.

    It doesn't matter whether the effect is from a lens or a computer; it's artificial. The important question is whether it's pleasing. And there, different people have different opinions. None of those opinions are either right or wrong. It's art, after all, whether it looks good or bad.
    Yeah, I can go with that. (Only if you agree that poodles are artificial, though: we're harnessing natural physical phenomena — evolution — to create an effect that doesn't occur naturally… I kid, I kid. ;) )

    There is a difference, though, between a physical object creating certain artefacts, and using a computer to simulate the way a physical lens would create them.  
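    The wide-aperture/shallow-depth-of-field relationship described above has a simple closed form: a point at distance d, with the lens focused at s, casts a blur disc of diameter roughly f²·|d − s| / (N·d·(s − f)) on the sensor, so the blur shrinks in direct proportion to the f-number. A quick sketch with illustrative numbers of my choosing (50 mm lens, subject at 1 m, background at 3 m):

    ```python
    # Diameter of the blur disc ("bokeh ball") a defocused point casts on
    # the sensor: b = f^2 * |d - s| / (N * d * (s - f)). Distances in mm.

    def blur_disc_mm(f_mm, n, focus_mm, point_mm):
        return (f_mm ** 2 / n) * abs(point_mm - focus_mm) / (
            point_mm * (focus_mm - f_mm))

    # 50 mm lens focused at 1 m, background point at 3 m:
    for n in (1.4, 2.8, 8.0):
        print(f"f/{n}: {blur_disc_mm(50, n, 1000, 3000):.2f} mm")
    ```

    At f/1.4 that works out to about 1.25 mm on the sensor, a very large blur on a 24×36 mm frame, while stopping down to f/8 cuts it by almost a factor of six.
    
    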
  • Reply 186 of 187
    melgrossmelgross Posts: 33,510member
    gatorguy said:
    melgross said:
    gatorguy said:
    tmay said:

    […]
    It looks as though Apple's "bokeh effect" also introduces some softness in the subject, softening the overall image and not just the background. Look at the hair detail and eyes in the two images.
    Tough to tell exactly with those images, but the eyes look fine. The hair is supposed to be out of focus if it's set to the equivalent of f/1.4. The depth of field on full frame with a 50mm f/1.4 is about 1/2" at that distance. So, about right.
    The eyes should never go out of focus. And they do. I'm pretty surprised that, if you can't see it in the eyes (obvious to me), you also can't see the very noticeable blurring of the eyeLASHES. I'm quite particular about my images, and assume you are too.

    Did you mention you already have one of the new phones? Easy enough to do a bit of testing using your photographer's eye.
    In the small image shown here, it's hard to tell, as I said. The hair is obvious. It's funny that almost all lenses have a focus shift when stopping down. With fast lenses, focus shift can be more than the depth of field wide open. I've often seen an eye go out of focus when stopping down. It's not as though that isn't a real-world problem. But with this small image, it's really too hard to tell. I don't see any reason the focus should shift, or the in-focus area should lose sharpness, unless the focus was on the nose. In that case, the eye is just on the edge of the focus field, and would change slightly.

    I ordered the new watch, but not the phone. We're waiting until things calm down. I'm just looking at those pics, which aren't direct captures but were taken off the screen by a camera that may not have been in perfect focus itself between the two shots; I'm surprised you didn't think of that.
    edited September 2018
  • Reply 187 of 187
    melgrossmelgross Posts: 33,510member
    tmay said:
    spheric said:
    […]
    Okay, so it turns out that apps have existed to exploit the depth of field of dual-camera iPhones for some time, such as Focos:
    https://itunes.apple.com/de/app/focos/id1274938524?mt=8

    What’s new here isn’t so much that the Xs can do this, it’s that Apple is embracing it and adding the effect to the viewfinder IN REAL TIME in a coming update. That’s where they’re going with this, and why they’re only adding it to the default camera on the Xs at this point. 
    Good points.

    I did find an iOS app for focus stacking, but it was essentially unusable due to the requirement to use the developer's cloud to compute the stack, and the reviews were poor. I have Zerene Stacker on my Mac for such things, but never got around to finding a suitable macro lens that I could afford. Irix has announced a 150mm macro that looks to be just what I want; no pricing yet.

    https://nikonrumors.com/2018/09/24/just-announced-irix-150mm-f-2-8-macro-11-lens.aspx/

    I tried several apps like that, but none were all that great. One even supposedly gave a higher resolution, but really didn’t.

    Lenses for my iPhones have been rather meh. The best macro has been the Moment 25mm macro. The case has two places for the lens, one for each camera. It's a pretty high-quality lens, considering, but has no lights around the lens as my older macro did, for which I can't seem to get a case anymore. I forget the name.

    Is the Irix for your Nikon, or the iPhone? It seems long for the iPhone, but I can't place the name IRIX.
    edited September 2018