Photography expert debunks iPhone XS 'beautygate,' details Apple's software-driven camera ...
Sebastiaan de With, the developer behind popular manual photography app Halide, has been putting the iPhone XS camera system through its paces. In a comprehensive look into Apple's latest shooters, de With provides an overview of computational photography, and explains how noise reduction technology might generate selfie photos that appear artificially enhanced.

iPhone XS and XS Max
In the extensive deep dive, de With outlines why the new iPhone's images look so vastly different to those taken by the iPhone X, even though much of the hardware was carried over from last year. Notably, the developer offers an explanation for an alleged skin-smoothing effect that made headlines over the weekend.
Computational photography is the future
Apple long ago realized that the iPhone's future is not necessarily in its hardware. That is why the company generally shies away from publicizing "tech specs" such as processor speed or RAM, and instead focuses on real-world performance.
The same holds true for the camera. Instead of simply cramming more pixels into its cameras each year, Apple focuses on improving existing components and, importantly, the software that drives them. As de With notes, pure physics is quickly becoming the main obstacle to how far Apple can take its pocket-sized camera platform. To push the art further, the company is increasingly reliant on its software-making chops.

Halide photo app
It is this software that has made such a substantial difference for iPhone XS.
During last month's iPhone unveiling, SVP of Worldwide Marketing Phil Schiller discussed the lengths to which the Apple-designed image signal processor (ISP) goes when processing a photo.
With the iPhone XS, the camera app starts capturing photos as soon as it is opened, before the user even presses the shutter button. Once a photo is snapped, the handset gathers a series of separate images, from underexposed and overexposed frames to those captured at fast shutter speeds. A selection process then chooses the best candidate frames and merges them to create a photo with incredibly high dynamic range while retaining detail. The process clearly has some unintended consequences.
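Apple's Smart HDR pipeline itself is private, but the general idea of grabbing an exposure bracket is something any app can do through AVFoundation. The Swift sketch below shows roughly what that looks like; it is not Apple's implementation, the merge step is left to the app, and the photoOutput and delegate objects are assumed to be configured elsewhere.

```swift
import AVFoundation

// A minimal sketch of bracketed capture, not Apple's actual Smart HDR pipeline.
// Three frames are requested at different exposure biases; the app would then
// merge them itself. `photoOutput` and `delegate` are assumed to exist already.
func captureExposureBracket(photoOutput: AVCapturePhotoOutput,
                            delegate: AVCapturePhotoCaptureDelegate) {
    // EV biases: the dark frame protects highlights, the bright frame lifts
    // shadows, and the middle frame anchors the merge.
    let biases: [Float] = [-2.0, 0.0, 2.0]
    let bracketedSettings = biases.map {
        AVCaptureAutoExposureBracketedStillImageSettings
            .autoExposureSettings(exposureTargetBias: $0)
    }
    let settings = AVCapturePhotoBracketSettings(
        rawPixelFormatType: 0,  // 0 = no RAW frames in this bracket
        processedFormat: [AVVideoCodecKey: AVVideoCodecType.jpeg],
        bracketedSettings: bracketedSettings)
    photoOutput.capturePhoto(with: settings, delegate: delegate)
}
```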
Bring the noise
To back up all that computational photography, the iPhone XS' camera app needs to take a lot of photos, very fast.
In service of this goal, the new camera favors a quicker shutter speed and higher ISO. When the shutter speed is increased, less light is captured by the sensor. To compensate for the decrease in light, the camera app increases the ISO, which determines how sensitive the sensor is to light. So less light comes in, but the camera is more sensitive to light and is able to output a properly exposed image.
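That trade-off boils down to a back-of-the-envelope rule: at a fixed aperture, apparent brightness scales roughly with shutter time multiplied by ISO, so every halving of the shutter time has to be bought back with a doubling of gain. The snippet below is only that toy model, not the camera's real metering.

```swift
import Foundation

// Toy model (assumed, fixed aperture): apparent exposure ~ shutterSeconds * iso.
// Halving the shutter time requires doubling the ISO to keep brightness constant.
func isoToMatchExposure(referenceShutter: Double, referenceISO: Double,
                        newShutter: Double) -> Double {
    return referenceISO * (referenceShutter / newShutter)
}

// e.g. 1/60 s at ISO 100 -> shooting at 1/240 s needs ISO 400,
// and that extra gain is what amplifies sensor noise.
let neededISO = isoToMatchExposure(referenceShutter: 1.0 / 60, referenceISO: 100,
                                   newShutter: 1.0 / 240)
print(neededISO)  // 400.0
```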

Sample RAW photos from iPhone X (left, top and bottom) and iPhone XS. | Source: Sebastiaan de With
However, as the ISO increases, more noise starts to appear in the photo. That noise needs to be removed somehow, which is where Apple's Smart HDR and computational photography come into play.
More noise reduction leads to a slightly smoother-looking image. This is part of the issue behind the selfie-smoothing "problem" people have been reporting.
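Apple has not published its noise-reduction algorithm, but the smoothing side effect is easy to reproduce with even the crudest denoiser. In the toy Swift example below, averaging each sample with its neighbors suppresses random noise and flattens a genuine edge at the same time.

```swift
import Foundation

// A deliberately crude stand-in for noise reduction (not Apple's algorithm):
// averaging each sample with its neighbours suppresses random noise, but it
// also flattens genuine fine detail -- the "smoothing" effect described above.
func boxDenoise(_ samples: [Double], radius: Int = 1) -> [Double] {
    samples.indices.map { i -> Double in
        let lo = max(0, i - radius), hi = min(samples.count - 1, i + radius)
        let window = samples[lo...hi]
        return window.reduce(0, +) / Double(window.count)
    }
}

// The small wiggles (noise) vanish, but the sharp 0.20 -> 0.79 step
// becomes a soft ramp -- texture and noise are averaged away together.
let noisyEdge = [0.21, 0.19, 0.20, 0.79, 0.81, 0.80]
print(boxDenoise(noisyEdge))
```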
The second reason has to do with contrast.
Less local contrast
Local contrast is what most people recognize as sharpness in a photo.
"Put simply, a dark or light outline adjacent to a contrasting light or a dark shape. That local contrast is what makes things look sharp," de With says.
When a sharpening effect is applied to a photo, no detail is actually being added; instead, the light and dark edges are boosted, creating more contrast and thus the illusion that the image is sharper.
Smart HDR on the iPhone XS and XS Max does a far better job of exposing an image, which decreases local contrast and results in a photo that looks smoother than it otherwise would.
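De With's point about sharpening is easy to see in a one-dimensional toy example: an unsharp mask adds no new detail, it only exaggerates the difference between each sample and a blurred copy of itself. The sketch below is that textbook technique, not anything taken from Apple's pipeline.

```swift
import Foundation

// A minimal 1-D unsharp mask: boost the difference between each sample and a
// blurred copy of itself, which exaggerates light/dark edges without adding
// any real detail.
func unsharpMask(_ samples: [Double], amount: Double = 1.0) -> [Double] {
    let blurred = samples.indices.map { i -> Double in
        let lo = max(0, i - 1), hi = min(samples.count - 1, i + 1)
        let window = samples[lo...hi]
        return window.reduce(0, +) / Double(window.count)
    }
    return zip(samples, blurred).map { sample, blur in
        sample + amount * (sample - blur)
    }
}

// A soft edge (0.4 -> 0.6) gains a darker dark side and a brighter bright
// side -- more local contrast, hence the illusion of sharpness.
print(unsharpMask([0.4, 0.4, 0.5, 0.6, 0.6]))
```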
Cameras for the masses
The biggest issues with this method of photography arise when shooting RAW. De With hypothesizes that the iPhone XS still prioritizes shorter exposure times and higher ISO to get the best Smart HDR photo, even when shooting RAW -- which gets no benefit from Smart HDR. This leaves overexposed and sometimes blown-out photos with details clipped beyond recovery.
RAW shots don't get noise reduction applied either, and with the added noise from the higher ISO, this makes for extremely poor-looking images. That means when shooting RAW, you must shoot in manual mode and purposefully underexpose the image.
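For those using a manual camera app, that workaround maps onto AVFoundation's custom exposure mode. The Swift sketch below shows one way an app might lock in a short duration and the sensor's base ISO before a RAW capture; the device object, error handling, and range clamping are assumed to be handled elsewhere, and the exact values are illustrative only.

```swift
import AVFoundation

// A sketch of the manual underexposure workaround described above: lock
// exposure to a shorter duration and base ISO so the RAW frame comes out
// slightly dark rather than clipped. `device` is an already-configured
// AVCaptureDevice; duration should also be clamped to the format's
// supported exposure range in real code.
func underexposeForRAW(_ device: AVCaptureDevice) throws {
    guard device.isExposureModeSupported(.custom) else { return }
    try device.lockForConfiguration()
    defer { device.unlockForConfiguration() }

    let baseISO = device.activeFormat.minISO            // lowest gain = least noise
    let shortDuration = CMTime(value: 1, timescale: 500) // 1/500 s, for example
    device.setExposureModeCustom(duration: shortDuration,
                                 iso: baseISO,
                                 completionHandler: nil)
}
```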

Sample RAW photos from iPhone X (left) and iPhone XS Max. | Source: Sebastiaan de With
Apple's apparent camera app decisions affect third-party apps -- such as Halide -- which is why de With says his team has been working on an upcoming feature called Smart RAW. It uses a bit of Halide's own computational magic to get more detail out of RAW photos while reducing noise. The feature will be included in a forthcoming update to Halide.
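Halide has not detailed how Smart RAW works, so the snippet below should not be read as its implementation. It only illustrates one well-known computational trick -- stacking and averaging several frames -- which cuts random noise roughly with the square root of the frame count while leaving real detail intact.

```swift
import Foundation

// Generic frame stacking (not Halide's Smart RAW): averaging several aligned
// frames of the same scene cancels random noise while keeping shared detail.
func stackFrames(_ frames: [[Double]]) -> [Double] {
    guard let first = frames.first else { return [] }
    let count = Double(frames.count)
    return first.indices.map { i in
        frames.reduce(0.0) { $0 + $1[i] } / count
    }
}

// Three noisy readings of the same two pixels average toward the true values.
print(stackFrames([[0.9, 0.1], [1.1, 0.3], [1.0, 0.2]]))  // ≈ [1.0, 0.2]
```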
While much of the analysis sounds critical, issues caused by Apple's algorithms are, so far, outliers. Most selfies are taken in very unflattering light, and without the changes Apple has made, they would be poorly exposed and full of noise. Apple has likely erred on the side of caution by over-removing noise and creating too-smooth images, but this can be dialed back.
More importantly, Apple has the ability to tinker with its firmware to solve the problems through subsequent updates if it so chooses.
As we saw in our recent iPhone XS vs. X photo comparison, the iPhone XS and XS Max with Smart HDR have significantly improved photo-taking capabilities, and the vast majority of users are already seeing the benefits.



Comments
This video demonstrates the issue perfectly
It'll be interesting to see what DxOMark makes of the new iPhones, as the premium Android phones have outshone the iPhone camera for a while now, with the iPhone X in 8th place.
https://www.dxomark.com/category/mobile-reviews/
Photography expert explains iPhone XS 'beautygate,'
I don't see how this explanation debunks the problem
The fact is that the iPhone takes great pictures a higher percentage of the time than any other smartphone, and has for a long time.
When I take photographs, I try to mitigate the problems that may occur in difficult situations. That’s what pros, and knowledgeable amateurs, do. The difficulty with smartphone cameras is that, because of the tiny cameras, what may not be an edge condition for a large camera may be one for a smartphone. Each company deals with these problems differently. Samsung, for example, takes sharp pictures, but only in high-light conditions. And even then, looking at grass and foliage will show hopelessly smeary images there. Apple’s is better. Not quite as sharp in some areas, but not nearly as smeared in others. The Pixel can take some great pictures too, but it really blows out highlights, and even with the software fix, has some of the worst flare I’ve ever seen. But again, flare can be easily mitigated. Just don’t point your camera at the sun. Even $6,000 lenses can have a lot of flare.
I'm seeing some incredibly good pictures from professionals on the web, taken with Apple’s new cameras, and that’s what matters to me, because it shows what they are capable of when used by people who know what they’re doing.
As a photographer myself, I see no fault. When I use my thousands-of-dollars camera, I use the noise reducer to soften faces or places, and people love that. I think what is pissing people off is the fact that they CAN'T CONTROL the feature.
People are so used to adding filters to their faces and places that I imagine half the people complaining are pissed because there is no way to manipulate the way this phone takes pictures. I suggest something: invest in some decent apps which do so and shut up about the "beautygate"!
This is a very good article, and I appreciate the fact that FINALLY someone took the time to try to explain or "debunk" this trend.
People REALLY have too much time on their hands to look for faults the phone may or may not have. You buy a NEW piece of technology, you have to deal with the issues of the NEW features. And Apple, being the company it is, will fix them. They don't charge you a pretty penny for nothing.
I got a DOA phone, and they didn't even question me; they replaced it on the spot.
Yes, I spend a lot of money on Apple because I am deeply invested, but I don't regret it. I've had a couple of Samsungs break on me a day after the one-year warranty expired, and what they said to me was "Oh well, your warranty expired, too bad!" Not in those words, but that's basically what I heard.
This phone is great and probably next year's will be even better. That's what companies do and how they keep their customers happy.
As a photographer, I am always hoping for more manual control. I think Smart HDR is great, though. And to those complaining, you can always go into Settings and turn it off. I also think the computational DOF control in Portrait mode is amazing. Purists may scoff, but the background blur has really been improved. And it will keep on being improved. The fact that you can apply it live... or, more incredibly, after you take a photo blows my mind. I hope Apple keeps giving us manual controls, like for shutter speed, etc. In the meantime, there are third-party photo apps that can provide those controls.
What excites me as a photographer is that the “s” iPhone updates are usually minor ones. I can’t help but think what improvements to the camera and smartphone photography Apple will bring to next year’s iPhone. Like a TrueDepth camera system on the back cameras for even more precise DOF control. Until then, happy shooting!
So I really think that what Apple and other smartphone companies are doing, even with these early teething problems, represents a significant advance. We just need to give it a bit of time and not be so critical right now. The pluses are greater than the minuses, even now.
And I appreciate AI interviewing someone who has expertise in photography to discuss what the XS camera software is doing.
I am much less impressed by some people pulling in YouTube channels which make money by trashing Apple products.
YT channels which get views by bashing Apple are extremely biased in the selection of content and are worthless imo in comparing the performance of tech across different OEMs.
This unbox idiot is the same iKnockoff Knight who lied about iPhone XR's resolution.
Anyone who references a youtuber as a source should be laughed at. lol
"Beautygate".... this anti-Apple propaganda is getting worse every year.
https://m.imore.com/beautygate
Nice report, really. Beyond noise reduction and picking the sharpest buffered frame, imo it does not explain why skin tones and white balance change as soon as a face is detected during capture. This happens when capturing photos with the front camera or the rear camera, and even when shooting video. Just try to cover your face with your hand during a selfie and pay attention to how your skin color changes. Such white balance shifts and weird skin tones happen whether Smart HDR is active or not, and whether manual HDR is active or not. So iOS 12 is doing something in the background by default. And yes, iOS 12.1 still does. Looking at the pictures some time later, the orangey faces can look kind of normal, but it’s just not fine to me; it is far enough from real colors that it causes a kind of uncanny-valley effect. Why should HDR vary white balance? Why is it especially noticeable in faces? This effect, imo, is contributing to the porcelain effect some users have complained about.
Furthermore, I noticed erratic global white balance during camera preview that I didn’t observe in previous models, at least the 5s and 6s that I own, which has resulted in more than a few photos over the last few days with evident magenta-biased coloration. Curiously, I found the same issue mentioned in just a single post on the web in recent weeks (in Spanish): https://www.xataka.com/analisis/iphone-xr-analisis-caracteristicas-precio-especificaciones
BTW I have the XR model.
I hope Apple rectifies these weird camera issues. To me it is simply unacceptable at the price of any of the X-family devices, to such an extent that I’m seriously thinking of going for an older model and saving some bucks.
Cheers
Further reading: https://www.cultofmac.com/571739/iphone-camera-white-balance/