Photography expert debunks iPhone XS 'beautygate,' details Apple's software-driven camera ...


Comments

  • Reply 21 of 22
    kevin kee said:
    valillon said:
    I’m re-posting here my recent comment from iMore.
    https://m.imore.com/beautygate

    Nice report, really. But beyond noise reduction and picking the sharpest buffered frame, IMO it doesn't explain why skin tones and white balance change as soon as a face is detected during capture. This happens when shooting photos with the front camera or the rear camera, and even when shooting video. Just try covering your face with your hand during a selfie and watch how your skin color changes. These white balance shifts and weird skin tones happen whether Smart HDR is active or not, and whether manual HDR is active or not, so iOS 12 is doing something in the background by default. And yes, iOS 12.1 still does. Looking at the pictures some time later, the orangey faces can look kind of normal, but it's just not right to me: they're far enough from the real colors to produce a sort of uncanny valley effect. Why should HDR vary the white balance? Why is it especially noticeable on faces? IMO this effect is contributing to the porcelain effect some users have complained about.

    Furthermore, I've noticed erratic global white balance in the camera preview that I didn't observe on previous models, at least on the 5s and 6s that I own, and it has resulted in quite a few photos over the last few days with an obvious magenta-biased cast. Curiously, I found the same issue mentioned in only a single post on the web in recent weeks. [in Spanish] https://www.xataka.com/analisis/iphone-xr-analisis-caracteristicas-precio-especificaciones

    BTW, I have the XR model.

    I hope Apple rectifies these weird camera issues. To me it's simply unacceptable at the price of any of the X-family devices, to the point that I'm seriously thinking of going for an older model and saving some bucks.

    Cheers

    Are you sure it's not just your screen because of True Tone? Your iPhone features sensors that measure the ambient light colour and brightness so it can correct the white point and illumination based on your environmental lighting, in order to render the right kind of white under any conditions. Try turning it off and taking the photo again.

    Further reading: https://www.cultofmac.com/571739/iphone-camera-white-balance/
    That crossed my mind too. However, even if the iPhone's screen adapts its color temperature to the ambient light, that doesn't explain these erratic white balance changes: the color temperature of the light in a room, or outdoors, doesn't change that fast. I insist, it's very obvious that the color flickering happens when a face or a person comes into the scene, regardless of whether Smart HDR is activated or not. Just try it. See a real example I just captured of myself: the right-hand shot (without a detected face) is not only more faithful to the light I actually saw in the room, it also looks sharper.

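    If anyone wants to rule out auto white balance during capture (as opposed to the display) as the cause, here's a rough AVFoundation sketch of the kind of test I mean. It's my own test code, assuming the front camera and that camera permission has already been granted; it isn't anything Apple documents about Smart HDR.

    ```swift
    import AVFoundation

    // Minimal sketch: lock the front camera's white balance at whatever gains
    // AWB has chosen right now. With the gains frozen, any color shift that
    // still appears when a face enters the frame is not coming from AWB.
    func lockCurrentWhiteBalance() {
        guard let camera = AVCaptureDevice.default(.builtInWideAngleCamera,
                                                   for: .video,
                                                   position: .front),
              camera.isWhiteBalanceModeSupported(.locked) else { return }
        do {
            try camera.lockForConfiguration()
            let gains = camera.deviceWhiteBalanceGains    // current AWB result
            camera.setWhiteBalanceModeLocked(with: gains) { _ in
                // Gains stay frozen until the mode is set back to
                // .continuousAutoWhiteBalance.
            }
            camera.unlockForConfiguration()
        } catch {
            print("Could not lock white balance: \(error)")
        }
    }
    ```
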
    Another issue is the magenta bias. I'm attaching two more examples.
    The label under the monster mask below was completely white in real life.


    The bald man below didn't have magenta skin on top of his head, nor did the TWINSET sunblind.


    I'm aware that auto white balance is a tough task. Computer vision algorithms simply fail in some cases, since getting it right requires a deep understanding of the scene beyond mere statistics of the pixel colors. Maybe AI will push AWB forward in the near future, but in the meantime, IMO, Smart HDR in its current state is unstable enough to be bothersome.
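
    By "mere statistics of the pixel colors" I mean heuristics like the classic gray-world assumption. Here is a rough sketch of it (my own illustration, assuming an RGBA8 bitmap, not Apple's actual AWB) which also shows why plain statistics break on scenes dominated by skin tones or a colored sunblind: the channel averages get dragged toward the dominant color and the "correction" overshoots.

    ```swift
    import CoreGraphics
    import Foundation

    // Minimal gray-world sketch: average each channel and scale red and blue
    // so their means match green. Assumes an RGBA8 bitmap; purely illustrative.
    func grayWorldGains(for image: CGImage) -> (r: Double, g: Double, b: Double)? {
        let width = image.width, height = image.height
        let bytesPerPixel = 4
        var pixels = [UInt8](repeating: 0, count: width * height * bytesPerPixel)

        let drawn = pixels.withUnsafeMutableBytes { buffer -> Bool in
            guard let ctx = CGContext(data: buffer.baseAddress,
                                      width: width, height: height,
                                      bitsPerComponent: 8,
                                      bytesPerRow: width * bytesPerPixel,
                                      space: CGColorSpaceCreateDeviceRGB(),
                                      bitmapInfo: CGImageAlphaInfo.premultipliedLast.rawValue)
            else { return false }
            ctx.draw(image, in: CGRect(x: 0, y: 0, width: width, height: height))
            return true
        }
        guard drawn else { return nil }

        var sumR = 0.0, sumG = 0.0, sumB = 0.0
        for i in stride(from: 0, to: pixels.count, by: bytesPerPixel) {
            sumR += Double(pixels[i])       // R
            sumG += Double(pixels[i + 1])   // G
            sumB += Double(pixels[i + 2])   // B
        }
        let count = Double(width * height)
        let (avgR, avgG, avgB) = (sumR / count, sumG / count, sumB / count)
        // A skin-dominated frame drags avgR up, so the "correction" pulls the
        // whole image toward cyan/green: the failure mode of pure statistics.
        return (r: avgG / avgR, g: 1.0, b: avgG / avgB)
    }
    ```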


  • Reply 22 of 22

    It’s been a while since iOS 12.1.2 was released, and I'm still hoping Apple will include some camera fixes for its neural HDR camera system, which it seems reluctant to do. Smart HDR has proved to resolve some challenging lighting situations quite brilliantly, and Apple is now promoting astonishing pictures from its new line of iPhones worldwide, with a new campaign out recently for Christmas. However, nothing has been said since the beautygate issue hit the internet a few days after the X-series phones went on sale. It seems like a fine layer of makeup is being spread over it to make the issue vanish: the fine-tuning shipped in iOS 12.1 is almost nothing compared to the severe facial artifacts Smart HDR generates.

    HDR can produce flat-contrast images, since compressing the dynamic range is what HDR is meant to do, and taken to the limit HDR methods can produce very unrealistic images. I personally don't like those extreme HDR pictures, though that's a matter of taste, and on the iPhones a tiny contrast boost can sometimes fix it. What disturbs me is not the contrast, however, but the reddish skin tone Smart HDR produces. Earlier in this thread I commented on the coloring issues the X-series phones suffer as soon as a face comes into the scene. In this example my face is not only exposed very flat, but the skin is so red that as soon as I look at the image it feels like it's not me; it's as if I'd been made up, or had spent hours in the sun with no skin protection at all. With the previous auto HDR method, by contrast, some parts of the face are overexposed, but that bothers me much less, since the skin tones stay pretty close to what my skin actually looks like. Since HDR should deal only with exposure, my question is: why the heck does Smart HDR modify not only the local contrast but also the color tones of the image, especially on faces?
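
    By a "tiny contrast boost" I mean nothing more sophisticated than a Core Image color-controls pass like the sketch below. The 1.05/0.95 values and the JPEG round trip are my own assumptions for illustration, not anything from Apple's pipeline.

    ```swift
    import CoreImage
    import CoreImage.CIFilterBuiltins  // typed filter API, iOS 13+/macOS 10.15+

    // Minimal sketch of a post-capture contrast/saturation tweak.
    func boostContrast(of image: CIImage,
                       contrast: Float = 1.05,
                       saturation: Float = 0.95) -> CIImage {
        let filter = CIFilter.colorControls()
        filter.inputImage = image
        filter.contrast = contrast      // >1.0 adds a little punch back
        filter.saturation = saturation  // <1.0 tames over-red skin slightly
        filter.brightness = 0.0         // leave brightness untouched
        return filter.outputImage ?? image
    }

    // Usage: render the adjusted image back to data with a CIContext, e.g.
    // let context = CIContext()
    // let adjusted = boostContrast(of: CIImage(contentsOf: photoURL)!)
    // let jpeg = context.jpegRepresentation(of: adjusted,
    //                                       colorSpace: CGColorSpaceCreateDeviceRGB())
    ```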