Halide is a very good investment in even more camera control for a mere $5.99.
It's nice to have the option, but I was wondering if the hedgehog shot could be done in Aperture mode instead of Portrait mode to get the blurred background using the stock camera app.
Aperture is a realtime adjustment within Portrait mode and only supports humans on the iPhone XR. Halide supports other animals and objects, though not necessarily successfully. YMMV
I don't have an XR (single lens) to confirm any of this.
Halide works with features as available from each iPhone generation, so I can't do everything that newer models can.
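To make the Portrait-vs-Halide distinction a bit more concrete: as far as I know, a third-party app has to explicitly opt in to Apple's depth and portrait-matte delivery, and on the XR the system-generated matte only covers people, so anything else (hedgehogs included) is down to the app's own depth handling and blur. A rough Swift sketch of that opt-in, with the capture session and delegate plumbing assumed to exist elsewhere:

```swift
import AVFoundation

// Sketch: enable depth and portrait-matte delivery on an already-configured
// AVCapturePhotoOutput. Property names are the real AVFoundation ones; the
// surrounding session/device setup is omitted.
func enableDepthDelivery(on photoOutput: AVCapturePhotoOutput) {
    if photoOutput.isDepthDataDeliverySupported {
        photoOutput.isDepthDataDeliveryEnabled = true
    }
    // The person matte (iOS 12+) is what the XR's single rear lens relies on,
    // which is why stock Portrait mode there is people-only.
    if photoOutput.isPortraitEffectsMatteDeliverySupported {
        photoOutput.isPortraitEffectsMatteDeliveryEnabled = true
    }
}

// Per-shot settings must mirror what the output has enabled.
func makeDepthSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings {
    let settings = AVCapturePhotoSettings()
    settings.isDepthDataDeliveryEnabled = photoOutput.isDepthDataDeliveryEnabled
    settings.isPortraitEffectsMatteDeliveryEnabled = photoOutput.isPortraitEffectsMatteDeliveryEnabled
    return settings
}
```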
In this blind A-B test, there are 15 images shot with each phone. If you want to play along, make a note of which images you prefer from each pair. The devices are identified below the gallery, so be careful if you don't want to spoil the surprise.
On that YouTube vid immediately before this paragraph, the preview image spoils that the left-hand (A) image is the iPhone, and the right-hand (B) image is the Google Pixel 3. Really, guys. If you're seriously trying to do an A-B comparison, at least make sure you frame the photos identically and switch the sides from time to time. <hugeeyeroll.jpg>
I don’t get why blurring the background is a positive. I’d want to turn that “feature” off.
It must be some “artistic” crap, like black & white images vs. color. Color is better... period. Picture clarity is better... period. Realistic colors vs. extra vibrant is subjective. The default should be as accurate as possible...
The guy on stage (zoom) is the only place where the iPhone falls short in an obvious way.
If you're just joking, please ignore my post. If not, the reason blur is used is that it simulates how we see the real world. If you are looking at a person directly, you are focused on them and not the background. When you say the default should be "accurate", deciding what that means isn't always as simple as it may initially appear: e.g. does that mean a wide angle should never be used, since that's not the perspective we saw the moment in? You can't focus on a whole mountain in one glance. What about white balance? Our brain compensates for differently coloured light, so if you took an accurate picture under fluorescent light, or even at midday, it wouldn't look anything like the scene we thought we saw. Some techniques are for the sake of art; others manipulate what was actually there so it looks more like what we thought we saw.
“Comparing the iPhone XS Max's camera to the shooter on Google's Pixel 3 XL is a bit unfair, given the iPhone has a telephoto lens and costs $200 more.”
A similar preamble wasn’t provided when the Galaxy S9 was compared to the iPhone X earlier this year, even though the iPhone X was nearly a year old. Why do competitors get shown a little mercy while Apple doesn’t?
Moving on: for me, the iPhone XR received 8 votes while the Pixel 3 received 4.
I struggled with images 4 and 5. They both received an asterisk. I chose B for image 5, A for image 6.
Aperture is a realtime adjustment within Portrait mode and only supports humans on the iPhone XR. Halide supports other animals and objects, though not necessarily successfully. YMMV
Ah. Ok. I thought Aperture mode was a different mode to Portrait Mode.
It must be some “artistic” crap, like black & white images vs. color. Color is better... period. Picture clarity is better... period. Realistic colors vs. extra vibrant is subjective. The default should be as accurate as possible...
With computational imaging, a "plausible" image is valued above an "accurate" one in the marketplace of smartphones.
White balance on the iPhone XR is really lacking compared to the Pixel. It feels more like an iPhone 5s, or an old Sony TV, with that bias and the attempt to correct it. Some low-light shots are good, but others... hmmm.
And it is not just “blur”, it is finding the subject and taking the picture properly. People too frequently do not understand photography and pay no attention to depth and background details. Are you really creating wallpaper, or subject-focused photography? Did you take any classes in photography, or do you just take meaningless snapshots? What is the message and story you want to tell with your photographs?
The XS may be more expensive, but it is also more premium in many ways other than the extra camera. It has a better screen, a much, much faster processor, Face ID, a steel frame, better/stronger glass, and greater battery life. The Pixel, meanwhile, is marketed as a phone mostly for those who care about the still camera as a #1 or #2 priority.
The images are very similar aside from the zoom capability of the Pixel (the XS is better at this, though) and the fact that the XR can't do Portrait mode on subjects other than faces (the XS is better at this as well). The XR is cheaper, has iOS, a full-screen design, Face ID, better video, and is faster... so what if its camera is missing a couple of features? It's called product differentiation. That's what the XS is for.
My wife has an iPhone XS and I just switched from a long line of Nexus phones to an iPhone XR. I don't own a Pixel 3 or 3 XL, but the comparison between the two iPhones makes me think that the XR's camera deficiencies are actually software related, not hardware related. So here are my quick findings.
The main difference between the two phones is that the XR requires an evenly dispersed light source to create a great photo. The XS will handle whatever light source is present, and this doesn't only mean great night shots. Even during the day, any brighter shopfront neon sign or window will throw the XR's exposure off balance and result in washed-out photos. My XR night shots with a couple of bright lights in the frame were, quite honestly, embarrassing. The XS handled the same scenes like a champ.
Today I took a picture of a bright sky with clouds from inside a normally lit atrium: the XS captured accurate blue color and clear, sharp edges on the different light reflections within the window frame, while the XR showed blue-to-light-green-to-white smudges.
One interesting thing I saw: sometimes, for kicks, I will open Google Photos and run auto enhance on a photo, which often does make it look nicer, at least to my untrained eye. I took two photos at the same time, holding the two phones next to each other. When I ran auto enhance on the picture the XR made, it brightened the shadows and overall made the picture pop more. But the same command also significantly changed the overall hue of the XS's photo. I'm not an expert, but this made me think that the information the XS and the XR are capturing is not quite the same.
Finally, the XR's photos are much less sharp than the ones the XS makes.
All settings, including Smart HDR, were set the same on both phones.
Given that the Pixel 3 has a nearly identical camera but handles these situations much better (IMHO!) than the XR, I hope this is something Apple can address with a software update.
I don’t get why blurring the background is a positive. I’d want to turn that “feature” off.
In photography it depends on what the photographer wants. A photo doesn’t have to equal what the naked eye sees; it depends on the choice of the person holding the camera. If the photographer wants greater depth of field (more things in focus), then lenses and f-stops can be used to create that greater depth of field.

As an example of depth of field at work, take films. In “Citizen Kane” there are multiple portrait shots with extended depth of field, and that film is considered to have brilliant camera work.

What’s important here is that you are not wrong about what applies to you. But every photographer can wish for whatever they want. Every portrait doesn’t have to fit the standard short telephoto lens with background blur (bokeh). Seanismorris wants the background blur off. That is a sensible request for that person.

One nice thing about the new iPhones is that the amount of blur can be adjusted before the picture is taken.
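For anyone who wants the textbook version of the f-stop point: depth of field grows roughly in proportion to the f-number, which is why stopping down gives the deep-focus “Citizen Kane” look and opening up melts the background. The usual approximation (N the f-number, c the circle of confusion, s the focus distance, f the focal length, valid when s is well inside the hyperfocal distance H):

$$\mathrm{DOF} \approx \frac{2\,N\,c\,s^{2}}{f^{2}}, \qquad s \ll H \approx \frac{f^{2}}{N\,c}$$

A phone's tiny lens has a short f and a fixed aperture, so its native depth of field is huge; the “portrait” blur everyone is arguing about is estimated depth plus software blur, not optics.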
White balance on the iPhone XR is really lacking compared to the Pixel.
Are you blind, or is it opposite day? Get a clue. It's even the opposite of what's said in the review.
It must be some “artistic” crap, like black & white images vs. color. Color is better... period. Picture clarity is better... period. Realistic colors vs. extra vibrant is subjective. The default should be as accurate as possible...
I think we agree on most of what you're saying, but there is a difference between a sensible request and calling blur “artistic” crap and exclaiming that picture clarity is better... period.
Most of the differences seem to relate to post-processing/software choices.
RAW shots and third-party apps should balance it out.
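To spell out what “third-party apps” buys you here: with AVFoundation an app can request the Bayer RAW straight off the sensor instead of the Smart HDR-processed output, which is where most of these software choices get baked in. A minimal Swift sketch (the AVCapturePhotoOutput configuration and the delegate that actually writes out the DNG are assumed):

```swift
import AVFoundation

// Sketch: build photo settings that capture RAW plus a processed JPEG preview.
// Returns nil when the current device/configuration offers no RAW format.
func makeRawSettings(for photoOutput: AVCapturePhotoOutput) -> AVCapturePhotoSettings? {
    guard let rawFormat = photoOutput.availableRawPhotoPixelFormatTypes.first else {
        return nil  // e.g. front camera, or RAW unsupported in this mode
    }
    return AVCapturePhotoSettings(rawPixelFormatType: rawFormat,
                                  processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg])
}
```

Whether the XR's and the Pixel's RAW files actually look alike is another question, but at least then you're comparing sensors instead of each company's processing.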
One nice thing about the new iPhones is that the amount of blur can be adjusted before the picture is taken.
And in this comparison, the XR does a better job with that.