Nice article, but how do you explain away the smoothing with Smart HDR off? Chinese phones have had this heavy noise reduction for years; they call it beauty mode. I guess you call it a new feature?
You’re quite wrong. This photo app developer has written all about it: his own camera app doesn’t use Smart HDR, so it gets noisy RAWs and then applies its own processing. The XS doesn’t apply the noise reduction (“smoothing”) when Smart HDR is disabled.
I have a gut feeling that the explanations given by that developer should be taken with a grain of salt.
What doesn't fit is his claim that "high ISO introduces a lot of noise, then Apple smooths the image to remove that noise." I don't think he's working from an Apple white paper; those are just his interpretations of the photos. It doesn't fit because, with such a powerful CPU, Neural Engine, and a brand-new image signal processor, noise might not carry enough weight to dramatically affect the underlying algorithms and the resulting photos. Besides, with an Apple-developed image signal processor, talking about noise makes no sense at all.
"The image signal processor features a refined depth engine, which captures extraordinary detail in Portrait mode. And with Smart HDR, you’ll notice far greater dynamic range in your photos." https://www.apple.com/iphone-xs/a12-bionic/
There is no noise and no image smoothing at all, IMHO. Enlarge the XS photos to the max and watch how the pixels line up at the edges. There is no fuzziness, no stray pixels, no "unsharp mask" outline. All the pixels align perfectly and smoothly in the most natural way, as if drawn by an artist's hand, which is nothing more than the A12's image signal processor at work.
Noise is inherent in CMOS and CCD sensors, especially at high ISO. Having no noise would be a holy grail of sorts.
The flaws you are looking for would arise from post-processing corrections, not from signal/image processing hardware like Apple's, Canon's DIGIC, or Nikon's Expeed, which are tied directly to the sensors and are an inherently integral part of the A/D conversion process.
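To put rough numbers on the high-ISO point, here is a toy simulation (made-up sensor parameters, nothing specific to Apple's or anyone else's hardware) of why dim scenes end up noisy: the gain itself doesn't hurt much, the photon shortfall does.

```python
import numpy as np

def pixel_snr(photons_per_pixel, read_noise_e=2.0, gain=1.0, n_pixels=100_000):
    """Toy sensor model (made-up numbers): shot noise is Poisson on the
    collected photons, read noise is Gaussian in electrons, and gain (a
    stand-in for ISO) amplifies signal and noise alike."""
    rng = np.random.default_rng(0)
    electrons = rng.poisson(photons_per_pixel, n_pixels).astype(float)
    electrons += rng.normal(0.0, read_noise_e, n_pixels)
    out = electrons * gain
    return out.mean() / out.std()   # signal-to-noise ratio

# A bright scene at base ISO vs. a dim scene pushed with high gain: the gain
# multiplies signal and noise equally, so the damage comes from having far
# fewer photons to work with, which is exactly when high ISO gets used.
print(round(pixel_snr(5000, gain=1.0), 1))   # high SNR
print(round(pixel_snr(50, gain=16.0), 1))    # low SNR
```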
Noise is inherent in CMOS and CCD sensors; then what? The image is supplied to you by the A12's image signal processor, not by the sensor; no one has access to the sensor to get "its noise". Between the sensor and the ISP there is no image yet, just an array of bits, and Apple's ISP is not tied directly to the sensor: it resides in the A12. Of the flaws I listed as examples, I am almost certain that in the case of old cameras the local contrast (the unsharp mask outline) is introduced during the very early capture/build-up of the image by the old ISPs, since it is so widespread.
My point is, the lack of local contrast that is presented as a flaw is actually just the opposite: the local contrast itself is a flaw inherent to old, outdated camera/ISP tech.
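For anyone unsure what the "unsharp mask outline" being argued about actually is, here is a minimal sketch of the generic sharpening technique (not any particular camera's ISP) showing where the halo around edges comes from:

```python
import numpy as np

def unsharp_mask_1d(row, radius=2, amount=1.0):
    """Classic unsharp mask on one row of pixels: blur, subtract the blur
    to isolate detail, add the detail back scaled by `amount`. Real
    pipelines would clamp the result to the valid pixel range."""
    kernel = np.ones(2 * radius + 1) / (2 * radius + 1)   # simple box blur
    blurred = np.convolve(row, kernel, mode="same")
    return row + amount * (row - blurred)

# A hard edge: dark pixels on the left, bright pixels on the right.
edge = np.array([10.0] * 8 + [200.0] * 8)
print(unsharp_mask_1d(edge).round(1))
# The output dips below 10 just before the edge and overshoots 200 just
# after it; that over/undershoot is the halo ("outline") around edges.
```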
The film analogy would be reducing grain (noise) in the film itself, in the film processing (using different developers or different chemical temperatures), or in the enlargement process (aka making prints).
Absolute highest quality will come with better film (better sensors), but the development process (processing hardware) allows direct chemical action on a latent image, so it is inherently a better time to address shortcomings than the "making prints" stage (aka Photoshop, Photos, etc.).
Most people mix up dithering and noise. Noise is random, chaotic, stray, and apparently no camera sensor is immune to it. Dithering, although basically noise, is intentional, algorithmic, and ordered, and is a way to represent tonal detail. You can remove noise with filters. You cannot remove dithering, or rather you wouldn't want to, because if you try you get banding effects, also called "posterization" in older jargon.
So what happens in low-light conditions? Your camera sensor may show a lot of noise: true noise, "bad noise", random and chaotic. But your image will probably include much more "noise" than that, in the form of dithering. Dithering is "good noise"; it serves to suppress banding and to represent smooth tonal transitions. But people see dithering in a low-light photo and scream "there!... a lot of noise!..."
To keep it short, if one does not pay attention to the difference between dithering and noise, the whole discourse becomes meaningless.
Dithering is traditionally prevented by switching to a different color palette, but that can still cost detail and tonal values. The most efficient way to prevent (or reduce) dithering is HDR.
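To make the dithering-versus-banding distinction concrete, here is a minimal sketch (a generic quantization example, nothing XS-specific) of a smooth gradient posterizing when quantized directly, and surviving when a little intentional noise is added first:

```python
import numpy as np

# A smooth 0-to-1 gradient, e.g. a patch of evening sky, quantized to very
# few tonal levels to exaggerate the effect.
gradient = np.linspace(0.0, 1.0, 1000)
levels = 8

# Plain quantization: the gradient collapses into flat bands (posterization).
banded = np.round(gradient * (levels - 1)) / (levels - 1)
print(np.unique(banded).size)   # 8 distinct values, i.e. visible banding

# Dithering: add intentional sub-step noise before rounding. Individual
# pixels now jump between adjacent levels, but their local average follows
# the original gradient, so the tonal transition survives quantization.
rng = np.random.default_rng(0)
dither = rng.uniform(-0.5, 0.5, gradient.size)
dithered = np.round(gradient * (levels - 1) + dither) / (levels - 1)

# Compare how well a small neighborhood average reconstructs the gradient.
def local_mean(x, w=50):
    return np.convolve(x, np.ones(w) / w, mode="same")

print(abs(local_mean(banded) - gradient).mean())    # larger: bands persist
print(abs(local_mean(dithered) - gradient).mean())  # smaller: noise averages out
```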
It would be nice to see a camera review where the reviewer shot the kinds of subjects normal consumers shoot - like group shots in really bad lighting, blowing out candles in bad lighting, children that won't stop moving, etc.
There is nothing to be done about children who won't stop moving in bad lighting, short of using a flash. It's going to be blurry.
How do you think the camera can expose the scene correctly and also expose something that would need a much longer exposure?
A short exposure that freezes the motion will leave a mess of noise on the subject, though I guess HDR could blend it with a longer exposure of the whole scene so the overall shot comes out all right, and the subject would just end up really soft (if they stack both exposures and smooth the subject so it isn't a pixelated mess).
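Something like this, I'd guess; a very rough sketch of the blend-then-smooth idea with made-up, already-aligned frames, a hand-made motion mask, and a naive box blur standing in for whatever the real pipeline does:

```python
import numpy as np

def blend_exposures(short_exp, long_exp, motion_mask, radius=1):
    """Rough sketch: keep the long, clean exposure where the scene is
    static, use the short (motion-freezing but noisy) exposure where
    movement was detected, and box-blur those noisy regions, which is
    why a moving subject can come out soft."""
    k = 2 * radius + 1
    padded = np.pad(short_exp, radius, mode="edge")
    smoothed = np.empty_like(short_exp)
    h, w = short_exp.shape
    for y in range(h):
        for x in range(w):
            smoothed[y, x] = padded[y:y + k, x:x + k].mean()
    return motion_mask * smoothed + (1 - motion_mask) * long_exp

# Hypothetical tiny frames and a hand-made mask of where the subject moved.
rng = np.random.default_rng(0)
long_exp = np.full((4, 4), 100.0)                               # clean, subject blurred
short_exp = np.full((4, 4), 100.0) + rng.normal(0, 20, (4, 4))  # frozen, noisy
motion_mask = np.zeros((4, 4))
motion_mask[1:3, 1:3] = 1.0
print(blend_exposures(short_exp, long_exp, motion_mask).round(1))
```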