Comparing photography: iPhone XS Max versus Google Pixel 3 XL

Comments

  • Reply 21 of 45
    saltyzip said:
From all the reviews I've seen the iPhone photos are just fake, all the colour reproductions are wrong. Black tops look grey, thirsty grass looks green. iPhone is just auto adding filters and beautifying.

    When I take a picture I want it to represent what I am seeing through my own eyes, not something some apple nerd thinks will look good on Instagram.
This is hilarious. Also, nice to see a few 1-post bandits show up. 
    watto_cobra
  • Reply 22 of 45
gatorguy Posts: 21,106 member
    Madtiger said:
    Night sight...good luck with moving subjects...and better keep steady hands...AI can’t correct all the shakes from the pics I have seen.

    Since it is available on all Pixels...Night Sight is nothing special or requiring anything hard to do.
Of course not. That's why everyone has it, only the name changes, right? 
  • Reply 23 of 45
gatorguy Posts: 21,106 member
This is definitely a hot take given how committed (nearly) everyone on the internet is to the Pixel camera. I think the review video is really balanced and honest. Both cameras take great shots, but I think the 56mm framing of the portrait lens is a very important feature that few reviewers discuss, whereas the wide-angle lens on the Pixel's front is nice but daaamn, it produces some seriously squished faces. Pixel is better at low light, no doubt. I don't think this is due to hardware; they've just put more attention into their software algorithms for low-light cases.
    Being a photographer, I must whole heartedly disagree about the Pixel 3 XL performing better than the XS Max in low light, as the Pixel is still tending to blow out highlights much more than the XS Max. If the highlights get blown, there is no recovery in post. The XS Max is delivering much nicer, balanced images, with better dynamic range.

    The only thing I see disappointing with the XS Max is the low light flash assisted performance, where definitely the attempt at delivering the widest possible dynamic range is working against the phone there, as it delivers dark, washed out images.
I doubt that most folks "post-process" their smartphone photos other than applying pre-defined recipes, if anything at all. (Yeah, I'm a photographer too.) Capturing a low-light image at all inside some club or out downtown at night will be appreciated. 
    muthuk_vanalingam
  • Reply 24 of 45
Blunt Posts: 224 member
    saltyzip said:
From all the reviews I've seen the iPhone photos are just fake, all the colour reproductions are wrong. Black tops look grey, thirsty grass looks green. iPhone is just auto adding filters and beautifying.

    When I take a picture I want it to represent what I am seeing through my own eyes, not something some apple nerd thinks will look good on Instagram.

    Loser.
    watto_cobra
  • Reply 25 of 45
chasm Posts: 1,667 member
    Given that Apple has acknowledged, and will fix, a bug in the post processing software in iOS 12.1, shouldn’t this video have waited until that happens? Seems like a lot of wasted effort on your part, since the tests will have to be re-done when iOS 12.1 comes out, for at least portrait/selfie/closeup shots.

And while we’re on the topic of wasted effort ... Apple has also said that portrait mode works primarily with human faces, so the point of the portrait of a weed was ... ? Sure, it’s definitely worth noting that the Pixel 3’s camera does a better job with non-human portraits, but that particular photo comparison seemed unfair to me.

Serious photographers (the kind who also work with expensive “real” cameras; I’m one of those) will dicker on and on about which compromises made by smartphone cameras work best for different situations. Looking at these, I think it’s clear that on a perceptual front the iOS cameras do better in most situations than the Google phones, but in a few areas the perceptual advantage goes to the Pixel 3.

    If you really prioritize a camera that can take the best photos in all situations, get a DSLR and really learn to use it (especially manual mode).

If you really want to do a useful video while you wait for 12.1 to come out, maybe showing off Smart HDR compared to (however something like that is done with the Note 9 and Pixel 3) would be interesting, especially in video. I’d personally also be interested in seeing a comparison of a given smartphone (take your pick) up against a real DSLR, since phones cost as much as or more than a basic DSLR these days!
    watto_cobra
  • Reply 26 of 45
KITA Posts: 197 member
    clarker99 said:
    As a XS Max user, I am pleasantly surprised given that the Pixel camera is very highly regarded.

    I am interested to see the XR vs. Pixel 3 XL comparison. 
The Pixel line of phones, the 3 XL included, is only highly regarded by paid-off tech media shills and YouTubers who don't have a fucking clue what they're talking about. Google was caught red-handed during their keynote putting up a photo from the Pixel 3 XL vs the XS Max that was obviously fake to make themselves look better, and now it's proven by these real-world tests.
    The only person here that "doesn't have a ..." appears to be yourself.

    Google was not "caught red-handed" in their keynote during a comparison between the Pixel 3 and iPhone XS, so don't make things up. 

    As for the comparison:



The feature they showed off was Night Sight, a feature that AppleInsider staff did not test in their comparison, as it, along with a number of other camera features, has not yet rolled out to the Pixel 3.

    Night Sight, however, has already been shown to be very real in pre-release software. See for yourself: 

    https://www.xda-developers.com/google-pixel-night-sight-google-camera-review/

    https://www.macrumors.com/2018/10/26/google-pixel-night-sight-photos/

    Without Night Sight:


    With Night Sight:

edited October 2018 saltyzip muthuk_vanalingam
  • Reply 27 of 45
Rayz2016 Posts: 4,764 member
AI needs to rethink its button strategy for forum postings:

    Like
    Informative
    Butthurt

Jeffwin watto_cobra
  • Reply 28 of 45
    Any Night Sight with moving subjects?  It’s good for stationary things...a little soft because AI has to compensate for handshake.  
    edited October 2018 watto_cobra
  • Reply 29 of 45
KITA Posts: 197 member
    Madtiger said:
    Any Night Sight with moving subjects?  It’s good for stationary things...a little soft because AI has to compensate for handshake.  But how about moving subjects, which is more important?
    It will blur moving objects such as a car.

    Not sure about slow moving objects, but I would imagine it would blur. 
    muthuk_vanalingam
  • Reply 30 of 45
STiK Posts: 2 unconfirmed, member
    I'm both an Apple and Google fanboy. I see advantages of both Apple products and Google/Android. I have to say that expecting a fair and objective review from Apple Insider or its readers is like expecting a Trump fan to give an honest assessment of Obama. This comparison played out exactly as I expected and so did the comments  
    I know right? I personally trust my Pixel to take consistently great shots over my iPhone. Don't get me wrong, the iPhone takes great photos but the Pixel is way more consistent. I'm not sure how they pulled off the dynamic range bias here as my Pixel is much better in this regard. With that said, I use my iPhone for video as it is light years better than what comes from my Pixel. 
    gatorguy
  • Reply 31 of 45
    STiK said:
    I'm both an Apple and Google fanboy. I see advantages of both Apple products and Google/Android. I have to say that expecting a fair and objective review from Apple Insider or its readers is like expecting a Trump fan to give an honest assessment of Obama. This comparison played out exactly as I expected and so did the comments  
    I know right? I personally trust my Pixel to take consistently great shots over my iPhone. Don't get me wrong, the iPhone takes great photos but the Pixel is way more consistent. I'm not sure how they pulled off the dynamic range bias here as my Pixel is much better in this regard. With that said, I use my iPhone for video as it is light years better than what comes from my Pixel. 
Yes, the XS has better dynamic range than the Pixel: shadow detail and highlight control. The Pixel only has highlight control and lots of contrast. 

The XS is able to do both extremes thanks to the A12. 
    watto_cobra
  • Reply 32 of 45
    KITA said:
    Madtiger said:
    Any Night Sight with moving subjects?  It’s good for stationary things...a little soft because AI has to compensate for handshake.  But how about moving subjects, which is more important?
    It will blur moving objects such as a car.

    Not sure about slow moving objects, but I would imagine it would blur. 
From CNET: When taking a Night Sight photo, you need to keep the phone as stable as possible to get the best results. If you're moving around too much, there's a good chance the image will be blurry.

    Verge: People wanted to know how Night Sight handles moving objects, and the answer is it doesn’t. Although it’s not one single long exposure, Google’s night mode still gathers light over a period of a few seconds, and anything moving through the frame in that time will turn into a blur of motion.
    edited October 2018 watto_cobra
  • Reply 33 of 45
    You learn the most interesting things on the internet:

    https://www.nytimes.com/2018/10/25/technology/google-sexual-harassment-andy-rubin.html?module=inline

    Turns out the guy who created Android for Google was kicked out because of misconduct.
    watto_cobra
  • Reply 34 of 45
entropys Posts: 1,850 member
So what is Night Sight, just a Google take on keeping the ‘shutter’ open for longer? Or maybe combining light from multiple images?
Cool if you have your phone on a tripod with a Bluetooth clicker, and the subject is verrry, verrry still.

I suspect in iOS there is an app for that. Like this one
    edited October 2018 watto_cobra
  • Reply 35 of 45
    Good alternative review of these two phones

TrustedReviews: Google Pixel 3 vs Apple iPhone XS: which smartphone has the best camera?
  • Reply 36 of 45
    entropys said:
So what is Night Sight, just a Google take on keeping the ‘shutter’ open for longer? Or maybe combining light from multiple images?
Cool if you have your phone on a tripod with a Bluetooth clicker, and the subject is verrry, verrry still.

I suspect in iOS there is an app for that. Like this one
Somehow, I am not able to click on your app link.

Night Sight is more than just long exposure.  To me, it sounds like long exposure + HDR processing, so that the bright lights are not blown out in the long exposure.  It also uses a touch of AI to brighten up the colors and get rid of minor hand shake.  (Huawei came out with this first, but I think Google made it better by having more colors.)

This is something that Apple can easily do (and improve upon) because it does not require much hardware...even the 1st-gen Pixel can do this.  This goes against what the media is saying about Google AI being the magic behind this...not really, because you can have this on the Pixel 1, which does NOT have the Pixel Visual Core chip.

It is a nice feature to have...but would you use it often?  It does take 3-5 seconds (depending on how dark it is).  So forget about using this to take a pic of your kids or pets, i.e. things that cannot hold perfectly still for more than a few seconds...using the flash on your iPhone XS would be better.  For close to medium distance, a flash is going to be better: faster to take and a sharper pic.  But for longer distances, or if you want to take a pic of your entire backyard (a vast space), then yeah, Night Sight is a “game-changer.”

Now, what would be a real game-changer is if someone could do Night Sight and, at the same time, isolate slow-MOVING subjects (like kids) and keep them relatively sharp.  And I think this is where having the Apple A12 and Smart HDR may come in handy...the A12 (ISP + NN) already scans the frame pixel by pixel every time you open the Camera app.  This is where Apple could really show off the A12 (A13?) capability.  I wonder if Apple could use the telephoto lens during the “Night Sight” to help isolate the subjects in focus at the same time...maybe even create a segmentation map to do better isolation and know what needs to be in focus.
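[Editor's note: Google hasn't published the full Night Sight pipeline, but the "long exposure built from many short frames" idea described above can be sketched in a few lines. This is a simplified illustration on synthetic data; the `merge_burst` helper is hypothetical, and real burst pipelines also align frames and handle subject motion, which this sketch skips entirely.]

```python
import numpy as np

def merge_burst(frames, gamma=0.45):
    """Average a burst of short, noisy exposures, then brighten the result.

    Averaging N frames cuts random sensor noise by roughly sqrt(N), which is
    the core idea behind burst low-light modes. The gamma lift at the end
    stands in for the tone mapping that keeps bright lights from blowing out.
    """
    stack = np.stack([np.asarray(f, dtype=np.float64) for f in frames])
    merged = stack.mean(axis=0)        # random noise averages out
    merged = merged / merged.max()     # normalize to [0, 1]
    return merged ** gamma             # gamma lift brightens the shadows

# Synthetic demo: a dim, uniform "scene" plus independent per-frame noise.
rng = np.random.default_rng(0)
scene = np.full((4, 4), 0.05)          # very dark scene
frames = [scene + rng.normal(0.0, 0.02, scene.shape) for _ in range(16)]
result = merge_burst(frames)
```

With 16 frames the residual noise in the merged image is about a quarter of a single frame's, which is why these modes need a few seconds of capture, and why anything that moves during that window smears.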
    watto_cobra
  • Reply 37 of 45
    clarker99 said:
    saltyzip said:
From all the reviews I've seen the iPhone photos are just fake, all the colour reproductions are wrong. Black tops look grey, thirsty grass looks green. iPhone is just auto adding filters and beautifying.

    When I take a picture I want it to represent what I am seeing through my own eyes, not something some apple nerd thinks will look good on Instagram.
This is hilarious. Also, nice to see a few 1-post bandits show up. 
Just look at the difference in the colours of the photos; it's like they were taken at a different time of day. Nobody mentioned in the review which is most realistic to the original setting, just which one looks nicer, hence my point.
    edited October 2018
  • Reply 38 of 45
    Blunt said:
    saltyzip said:
From all the reviews I've seen the iPhone photos are just fake, all the colour reproductions are wrong. Black tops look grey, thirsty grass looks green. iPhone is just auto adding filters and beautifying.

    When I take a picture I want it to represent what I am seeing through my own eyes, not something some apple nerd thinks will look good on Instagram.

    Loser.
Is the jacket grey, or is the jacket black? I'm betting it's black, which to me means the iPhone picture is fake!


  • Reply 39 of 45
    Blunt said:
    saltyzip said:
    From all the reviews I've seen the iPhone photos are just fake, all the colour reproductions are wrong. Black tops look grey, thirsty grass looks green. iPhone is just auto adding filters and beautifying.

    When I take a picture I want it to represent what I am seeing through my own eyes, not something some apple nerd thinks will look good on Instagram.

    Loser.

Exhibit 2 below: night and day difference. It's not all about looks, it's about accuracy. Hope you're better educated now, as your comment added no value.


  • Reply 40 of 45
KITA Posts: 197 member
    Madtiger said:
    entropys said:
    So what is night sight, just a google take on keeping the ‘shutter’ open for longer? Or maybe combining light from multiple images?
    cool if you have your phone on a tripod with a Bluetooth clicker, and the subject is verrry, verrry still.

    i suspect in iOS there is an app for that. Like this one
Somehow, I am not able to click on your app link.

Night Sight is more than just long exposure.  To me, it sounds like long exposure + HDR processing, so that the bright lights are not blown out in the long exposure.  It also uses a touch of AI to brighten up the colors and get rid of minor hand shake.  (Huawei came out with this first, but I think Google made it better by having more colors.)

This is something that Apple can easily do (and improve upon) because it does not require much hardware...even the 1st-gen Pixel can do this.  This goes against what the media is saying about Google AI being the magic behind this...not really, because you can have this on the Pixel 1, which does NOT have the Pixel Visual Core chip.

It is a nice feature to have...but would you use it often?  It does take 3-5 seconds (depending on how dark it is).  So forget about using this to take a pic of your kids or pets, i.e. things that cannot hold perfectly still for more than a few seconds...using the flash on your iPhone XS would be better.  For close to medium distance, a flash is going to be better: faster to take and a sharper pic.  But for longer distances, or if you want to take a pic of your entire backyard (a vast space), then yeah, Night Sight is a “game-changer.”

Now, what would be a real game-changer is if someone could do Night Sight and, at the same time, isolate slow-MOVING subjects (like kids) and keep them relatively sharp.  And I think this is where having the Apple A12 and Smart HDR may come in handy...the A12 (ISP + NN) already scans the frame pixel by pixel every time you open the Camera app.  This is where Apple could really show off the A12 (A13?) capability.  I wonder if Apple could use the telephoto lens during the “Night Sight” to help isolate the subjects in focus at the same time...maybe even create a segmentation map to do better isolation and know what needs to be in focus.
The Pixel Visual Core and A12 are being used for inference (not training). Google's ability to design and train a model is a major part of their success. The computing performance required for inference of a number of these features doesn't have to be particularly high if the model has been well trained. Features such as Google's real-time offline music recognition run purely on the Snapdragon SoC using a local library. Prior to the Pixel 3, the only use Google had for the Pixel Visual Core was to apply HDR+ to 3rd-party camera applications (Snapchat, Instagram, etc.).

It's worth noting that Night Sight is something Google has been working on for a while. Huawei also takes a different approach to the problem, using more than just the single RGB sensor that Google uses.