New 4K & 5K iMacs support 10-bit screen color for improved image accuracy


  • Reply 41 of 74
    polymnia Posts: 1,080 member

    Does anyone besides me give a shit that after 7 years of Windows having this capability, MacOS is finally getting it?

     

    Is this really going to become just another excuse to bash each other over glossy vs matte?

     

    This 10 bit thing is pretty big for anyone who intends to calibrate their display.

     

    This guy has been complaining long and hard about this exact issue, and here are his thoughts about this development:

     

    http://macperformanceguide.com/blog/2015/20151030_1036-OSX_ElCapitan-10bit.html

     

    While I find this guy just as incorrigibly curmudgeonly as the worst characters here, he usually has a point.

  • Reply 42 of 74
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by polymnia View Post

     

    Does anyone besides me give a shit that after 7 years of Windows having this capability, MacOS is finally getting it?

     

    Is this really going to become just another excuse to bash each other over glossy vs matte?

     

    This 10 bit thing is pretty big for anyone who intends to calibrate their display.

     

    This guy has been complaining long and hard about this exact issue, and here are his thoughts about this development:

     

    http://macperformanceguide.com/blog/2015/20151030_1036-OSX_ElCapitan-10bit.html

     

    While I find this guy just as incorrigibly curmudgeonly as the worst characters here, he usually has a point.




    You're a little bit off there. The 2180WG may have come out in early 2006. No software or drivers supported 10 bit data paths at that time. Photoshop and a few other applications have supported it since around 2010-2011 with specific graphics cards. It requires certain features, and they have to be validated. It's not something that is easily just left up to the OS.

     

    Second, it's not big for anyone who intends to calibrate their display. All you can do with any display offered by Apple is assign it a new profile. That is basically it, aside from possibly adjusting the backlight. I'm not sure how Apple achieves that part. I haven't experienced great results running their displays at half brightness or lower, which makes me think it's a rather high-level implementation. Some other displays do offer more in the way of lower-level controls through a software interface. They can apply an LUT or matrix-based profile that is stored in the display. You cannot do that with any Apple display. This just places your intensity values closer together. It has no impact on the accuracy of any color. It may help reduce banding. It may allow the use of panels with broader gamuts without introducing banding. It does not make the output of one display more reliable than another when gauging whether something appears as intended.
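
    A minimal numpy sketch of that last point about intensity values being pushed closer together (the correction curve and numbers are made up for illustration, not anything Apple- or display-specific): the same modest tone correction is applied to an 8-bit gray ramp and then quantized to either an 8-bit or a 10-bit pipeline, counting how many of the 256 source levels survive as distinct codes.

        # Hypothetical correction curve; the point is the counting, not the curve itself.
        import numpy as np

        def distinct_after_correction(pipe_bits, source_bits=8, gamma_fix=1.18):
            src = np.arange(2 ** source_bits) / (2 ** source_bits - 1)   # ideal 0..1 gray ramp
            corrected = src ** gamma_fix                                 # a modest profile-style correction
            codes = np.round(corrected * (2 ** pipe_bits - 1))           # quantize to the pipeline's depth
            return len(np.unique(codes))

        for pipe_bits in (8, 10):
            print(f"{pipe_bits}-bit pipeline keeps {distinct_after_correction(pipe_bits)} of 256 source levels")

    In the 8-bit case some neighboring source levels collapse into the same output code, which is where banding comes from; the 10-bit pipeline keeps all 256 apart.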

     

     

    Quote:

    Originally Posted by wizard69 View Post





    Because a matte screen effectively removes resolution and sharpness. Why would Apple ship an incredibly high quality screen and then smear honey all over the screen? Seriously, why? To keep a few idiots that don't understand the technology happy? I think not. I can honestly say that buying a matte screen was one of the worst things I've done in my years of purchasing technology. Anybody that actually thinks a matte screen is the right solution to a problem is grossly out of touch.

    They have improved considerably, but I wonder whether you ever sat in front of one of the earlier glossy iMacs. They were useless in many lighting conditions. That aside, you're used to LG's coatings, which aren't very good. There have been panels produced with exceptional matte coatings. Specifically, Hitachi pioneered IPS technology, and their panels were really excellent. The matte coatings didn't give that annoying sparkly effect. It's too bad they're no longer in that market.

     

    Quote:
    Originally Posted by ophello View Post

     

    Just in case anyone is wondering...



    8-bit color: 2^(8×3) = 16,777,216 unique colors

    10-bit color: 2^(10×3) = 1,073,741,824 unique colors. That's a 64-fold increase.


    You should be able to use a broader gamut without banding, assuming the display can genuinely reproduce 1024 levels per channel, or at least significantly more than 256. You should be able to expect something that doesn't rely heavily on dithering, although some displays with 10-bit panels have still shown obvious signs of dithering. I can dig up the reviews if you like. Apple usually does an okay job on displays. They wouldn't always be my recommendation, but that doesn't matter.

     

    Just be careful not to misinterpret this: color differences are typically measured as Euclidean distance with respect to a reference color model when the material is intended for an audience with some background in the subject. People say things like "64-fold increase" when they want to emphasize spec porn.
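
    For anyone unfamiliar with what Euclidean distance in a reference color model looks like in practice, here is a tiny sketch of the classic CIE76 delta-E between two Lab colors (the Lab values are made up for illustration):

        import math

        def delta_e_76(lab1, lab2):
            """CIE76 color difference: plain Euclidean distance in L*a*b* space."""
            return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

        # Hypothetical intended vs. measured Lab values for one color patch.
        intended = (52.0, 18.0, -33.0)
        measured = (51.2, 19.5, -31.0)
        print(round(delta_e_76(intended, measured), 2))  # about 2.6; a delta-E near 1 is often quoted as barely visible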

     

    Quote:
    Originally Posted by bitmod View Post

     



    Exactly!

    Professionals should stop being idiots and out of touch with technology - - and accept the troglodyte / cave dwelling lifestyle of zero light sources other than your display that Apple has mapped out for them.

     

    If you're not capable of being a vampire mimic / xeroderma pigmentosum shunned aphotic recluse... you should probably use inferior products like all the other surface dwellers.




    It's not that atypical to minimize ambient light with a matte display. In fact, there are ISO standards related to matching various things, such as printed media and a display. They specify a recommended ambient light level that is quite dim. If you're doing anything that requires a really critical eye, you are best off working in darkness. Just make sure you have a stake available if you deem it necessary.

  • Reply 43 of 74
    elroth Posts: 1,201 member
    Quote:

    Originally Posted by Haggar View Post

     

     

    Before Apple started selling glossy displays, a lot of Mac users at the time were complaining about glossy displays on PC laptops.




    And some of them complained even louder when Apple started selling ONLY glossy displays - that was pretty ugly for a while. There were a bunch of people here at the time who were saying the complainers were stupid and crazy, and it was a new world, and everything would be glossy from then on.

     

    Then Apple started putting an anti-reflective coating on MacBook Pros, and then started making everything else anti-reflective. Apple even featured it as part of their advertising for a while. All the "glossy forever" people were pretty quiet then - the anti-reflective screens were so much better than the full glossy ones. (Anti-reflective is not the same as matte).

  • Reply 44 of 74
    Quote:

    Originally Posted by mstone View Post

     



    How would that be possible? We already can see every possible frequency of visible light from infrared to ultra violet. 


     

    You can see IR? Life must be weird.

  • Reply 45 of 74
    indyfx Posts: 321 member
    Quote:
    Originally Posted by polymnia View Post

     

    Does anyone besides me give a shit that after 7 years of Windows having this capability, MacOS is finally getting it?

     

    Is this really going to become just another excuse to bash each other over glossy vs matte?

     

    This 10 bit thing is pretty big for anyone who intends to calibrate their display.

     

    This guy has been complaining long and hard about this exact issue, and here are his thoughts about this development:

     

    http://macperformanceguide.com/blog/2015/20151030_1036-OSX_ElCapitan-10bit.html

     

    While I find this guy just as incorrigibly curmudgeonly as the worst characters here, he usually has a point.


    We had Mac OS X running 10-bit displays in Shake (many) years ago (editing 16-bit image files). Second, you don't seem to understand what 10-bit displays do or what color calibration actually does (or that the two are neither mutually inclusive nor exclusive). 10-bit displays prevent chroma and luma banding on the display (something that is easy to demonstrate, particularly in a luma ramp) and allow a wider gamut. It is NOT the same as editing 10-bit (or 16-bit) image files, though it is useful when you are; I'll explain.

    If you are working with 8-bit files, this isn't really necessary because you have a "what you see is what you get" scenario. The problem occurs when you are editing deeper images, in that you may do some manipulation that causes banding. If you are looking at it on an 8-bit display, you are blind to that (because everything looks banded).

    This is the same argument I made regarding "matte" monitors being the only "pro" choice: first, most (and when I say most I'm talking over 90%) PC monitors are just garbage; second, just "matting" the screen causes (among other visual artifacts) luma bloom. Because everything you see on that ("matted") screen has bloom, you can't tell when your source actually has bloom (or if it is just the monitor adding it). This is unacceptable for nearly any "pro".

  • Reply 46 of 74
    polymnia Posts: 1,080 member

    Quote:

    Originally Posted by IndyFX View Post

     

    We had Mac OS X running 10-bit displays in Shake (many) years ago (editing 16-bit image files). Second, you don't seem to understand what 10-bit displays do or what color calibration actually does (or that the two are neither mutually inclusive nor exclusive). 10-bit displays prevent chroma and luma banding on the display (something that is easy to demonstrate, particularly in a luma ramp) and allow a wider gamut. It is NOT the same as editing 10-bit (or 16-bit) image files, though it is useful when you are; I'll explain.

    If you are working with 8-bit files, this isn't really necessary because you have a "what you see is what you get" scenario. The problem occurs when you are editing deeper images, in that you may do some manipulation that causes banding. If you are looking at it on an 8-bit display, you are blind to that (because everything looks banded).

    This is the same argument I made regarding "matte" monitors being the only "pro" choice: first, most (and when I say most I'm talking over 90%) PC monitors are just garbage; second, just "matting" the screen causes (among other visual artifacts) luma bloom. Because everything you see on that ("matted") screen has bloom, you can't tell when your source actually has bloom (or if it is just the monitor adding it). This is unacceptable for nearly any "pro".




    I think I have a pretty solid understanding of color profiling. I understand that 8 or 10 bit makes little difference regarding the bit depth of the image in Photoshop. I'm not really sure what made you think I was conflating Photoshop bit depth with color calibration and profiling. Like you say, more bits are better, generally. I understand that the 10-bit stream to the display is helpful for color profiling, yet hardly required.

     

    To back up a bit, the 'workflow' of a display is this, starting from the Mac and working up to the screen:


    1. Mac video system - This is where the graphics are rendered into the bitstream that is sent to Step 2. A software profile is applied at this point; it transforms the values of the colors. If all you do is software profiling, this is the end of active color management.

    2. Bitstream between the Mac and display - Bit depth has varied over the years; now we are discussing the jump from 8 bits to 10 bits per channel.

    3. Display LUT - Better displays have an LUT you can load the results of your color calibration & profiling into. This is where hardware (or device) profiling lives.

    4. Display - Each display has a native gamut that can be achieved, and when color management is employed, transformations are applied in the above steps to match the defined input values.

     

    EXAMPLE 1: At each stage where there is a transformation, the number of distinct values you can actually use tends to decrease. Oversimplified example: imagine we are applying a transformation at Step 1 above. Let's say you have four possible color values (1-4), but when you measure the display's output for color value 2 it is actually closer to color value 3, so you might transform color value 2 to color value 1 to compensate. If color values 1, 3 & 4 measure as expected, your color profile would map the colors like this:


    1. Input value 1 = Output value 1

    2. Input value 2 = Output value 1

    3. Input value 3 = Output value 3

    4. Input value 4 = Output value 4

     

    Now you effectively have 3 possible output values with the profile applied. Let's say your display can only show 4 values as well. This is where banding is introduced. You cannot display a smooth gradient from 1 to 4 with the profile applied.

     

    EXAMPLE 2: Now let's say you have original artwork at a 4-value bit depth and you are editing in 4-value mode, but Step 2 above (the bitstream) can carry 8 values rather than 4, and the display can output 8 values. Assuming the error in the measurement of value 2 isn't super extreme, the extra possible values in Step 2 could allow the profile to assign input value 2 its own output value, distinct from the one assigned to input value 1. Obviously these are very rough bit depths, and in an 8-to-10-bit workflow these differences would be much more subtle. But here is what you might end up with:


    1. Input value 1 = Output value 2

    2. Input value 2 = Output value 3

    3. Input value 3 = Output value 6

    4. Input value 4 = Output value 8

     

    All other things being equal, input value 1 & input value 4 in both examples would display the same color. Greater bit depth won't magically make for a wider gamut on the display. What happens in between is what's important. With greater resolution in Step 2 of the workflow, input value 2 can be mapped to a different output value than input value 1. This gives each input color value a better chance of occupying its own output value as transformations are applied during the color management process. Once two input values are forced to the same output value, information is lost, which is destructive to the desired result of seeing all the details of the input artwork transformed into a defined and predictable image on screen.
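
    A tiny sketch of those two toy examples (same made-up values as above), just counting how many inputs keep their own output slot:

        # Toy version of EXAMPLE 1 and EXAMPLE 2 above; the numbers are illustrative only.
        example_1 = {1: 1, 2: 1, 3: 3, 4: 4}   # 4-value pipeline: inputs 1 and 2 collide on output 1
        example_2 = {1: 2, 2: 3, 3: 6, 4: 8}   # 8-value pipeline: every input keeps a distinct output

        for name, mapping in (("EXAMPLE 1", example_1), ("EXAMPLE 2", example_2)):
            distinct = len(set(mapping.values()))
            note = "banding risk" if distinct < len(mapping) else "no values lost"
            print(f"{name}: {distinct} distinct outputs for {len(mapping)} inputs ({note})")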

     

    So if I am working in 8-bit mode, applying a Mac-hosted software color profile (which transforms the bits), outputting an 8-bit stream to either an 8- or 10-bit display, whether or not said display hosts its own hardware LUT, I am probably truncating that 8-bit stream, increasing the chances of banding or other problems.

     

    Rather than make more examples, I think it's worth pointing out that in digital workflows like this, the weakest link limits the rest of the system. Photoshop allows for quite deep bit depths, which leaves lots of headroom for editing transformations. Monitors with LUTs offer quite deep bit depths for the final output transformations. What sits in the middle is the Mac video system, which has sat at 8 bits and is the lowest-resolution step in the process. Any transformation applied at this stage will take that 8-bit input and smash it into something effectively lesser, with the degree of squishing depending on how radical a transformation is required.

     

    I can point to a case study in my own office. I have a cheap Monoprice display (one of the 90% garbage displays you mention in your comment) that I calibrated (in software, of course), and even after calibration (and adjusting monitor settings to get the white point as close as possible prior to measurement), it is quite full of artifacts and generally disappointing. I also calibrated my 27-inch iMac display, which is quite a bit higher quality to start with, and the results are much more satisfactory. Seems to me this goes back to the more extreme transformations required to attempt to bring the cheap display into compliance. The more transformation required, the more colors are squished into sharing output values.

     

    I may be wrong on some of this. I'm more of a color practitioner and much less a color scientist. But I've done calibration & profiling and have experience with software-only versus hardware-LUT solutions, and that experience makes me pretty confident in my understanding.

     

    Anyway, that's my dissertation for the day. I'm glad someone else is at least interested in the original topic of this article, though :)

  • Reply 47 of 74
    Quote:



    Originally Posted by Suddenly Newton View Post





    I'm hoping future spec bumps will give us more RGB colors than there are in nature.



    That is not possible even theoretically. What you said shows that you have no idea what color is. 

    Color is just a particular spectrum of EM waves of various wavelengths (or frequencies, if you like).

    You can't make something that is not possible in nature, because everything IS part of nature or the Universe. And it already HAS all wavelengths present, or at least the possibility of them existing. The range of EM wave frequencies is [0 .. infinity] Hz and there is no limit on how complex a spectrum can be. All of this is already included in the existing universe. How one could go outside that [0..inf] range is beyond my understanding.

    On top of that, you can only see a very, VERY tiny portion of those wavelengths, and even within that, the bare human eye can't distinguish the wavelength of yellow from a spectrum of red wavelength + green wavelength, instead treating them as if they are identical.



    So, if you consider all these, I don't really see why would you need "more RGB colors than there are in nature". 

  • Reply 48 of 74
    indyfx Posts: 321 member
    Quote:
    Originally Posted by Anton Zuykov View Post

     



    That is not possible even theoretically. What you said shows that you have no idea what color is. 

    Color is just a particular spectrum of EM waves of various wavelengths (or frequencies, if you like).

    You can't make something that is not possible in nature, because everything IS part of nature or the Universe. And it already HAS all wavelengths present, or at least the possibility of them existing. The range of EM wave frequencies is [0 .. infinity] Hz and there is no limit on how complex a spectrum can be. All of this is already included in the existing universe. How one could go outside that [0..inf] range is beyond my understanding.

    On top of that, you can only see a very, VERY tiny portion of those wavelengths, and even within that, the bare human eye can't distinguish the wavelength of yellow from a spectrum of red wavelength + green wavelength, instead treating them as if they are identical.



    So, if you consider all these, I don't really see why would you need "more RGB colors than there are in nature". 




    OK, here is the deal: with 8 bits per channel (R, G & B each have 2^8 or 256 discrete steps between on and off) the eye can see discrete "jumps" (these can occur in luma or saturation, particularly with chroma shift). At 10 bits per channel (each of R, G & B has 2^10 or 1024 discrete steps) your eye can no longer see the difference between the discrete steps.

    I don't know where all this infrared-to-ultraviolet BS is coming from; we are talking about visible light, and we are also talking about what a monitor can produce (in multiple dimensions, also called its gamut). 8 bits per channel cannot produce steps fine enough that the human eye can't see the difference between them; with 10 bits it can (and the eye perceives a smooth ramp).
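
    To put rough numbers on that (assuming, for illustration, a gray ramp stretched across the 5120-pixel width of a 5K iMac):

        # Back-of-the-envelope only: how coarse the steps of a full-width gray ramp are.
        width_px = 5120                      # 5K iMac horizontal resolution
        for bits in (8, 10):
            steps = 2 ** bits
            print(f"{bits}-bit: {steps} steps, each band about {width_px / steps:.0f} px wide, "
                  f"{100 / steps:.2f}% of full scale per step")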

     

    Is it necessary? Yes, for work with extended-bit-depth image files it is. It has been around for many years (in proprietary workstations, Windows and OS X), but those systems were generally "pro only" as they required proprietary software, drivers, a 10-bit video card and, of course, a suitable monitor.

    That was fine; most non-pros weren't working with high-bit-depth images anyway. Now, however, things are different, and many photographers are keeping (and editing) "raw" images (essentially the raw data from the Bayer filter in the camera, but basically equivalent to a 12- or 14-bit RGB image file), so the new iMacs will have an advantage.

     

    This is a good thing, folks (sorry to burst the Apple haters' bubble). It isn't irrelevant (or beyond human perception), and it is a definite asset to photographers who edit raw images and to video editors (Apple's ProRes codecs have extended bit depth, and many "prosumer" cameras now have the capability to shoot in extended-depth or RAW formats, so it is there if video editors/producers want it).

     

    -edited for typos-

  • Reply 49 of 74
    Quote:

    Originally Posted by IndyFX View Post

     

    I don't know where all this infrared-to-ultraviolet BS is coming from,


    Where did I say anything in my post regarding UV or IR?

     

  • Reply 50 of 74
    indyfx Posts: 321 member
    Quote:
    Originally Posted by Anton Zuykov View Post

     

    Where did I say anything in my post regarding UV or IR?

     




    No, you said: "So, if you consider all these, I don't really see why would you need 'more RGB colors than there are in nature'." Others went down the infrared and ultraviolet nonsense path.

     

    I wasn't trying to single you out, yours was just the last post, and I am done for the day so I decided to sit down and give the group the straight poop. 

  • Reply 51 of 74
    Quote:

    Originally Posted by IndyFX View Post

     

    it was the last post and the "I don't really see why would you need 'more RGB colors than there are in nature'" nonsense.


    nonsense, huh? ))

    Let me ask you one simple question. How well do you know physics, in particular, how well do you know what light is?



    I am asking this because the "more RGB colors than in nature" statement and my attempt to show that it is incorrect have NOTHING to do with your long and unnecessary explanation of a quantization problem. You missed the point by a mile.



    Regardless of how many bits you have for expressing colors, you will neither increase the native color gamut of a device with more bits (color gamut doesn't depend on quantization), NOR will you be able to express "more RGB colors than present in nature" with them, because there is only a limited number of electrons (with which the RGB info will be sent) in the universe; hence, at most, you will be able to have exactly the same number of colors that nature can provide and no more than that.



    If something appears to be nonsense, it might be just your lack of knowledge or insight that prevents you from understanding.

  • Reply 52 of 74
    indyfx Posts: 321 member
    Quote:
    Originally Posted by Anton Zuykov View Post

     

    nonsense, huh? ))

    Let me ask you one simple question. How well do you know physics, in particular, how well do you know what light is?



    I am asking this because the "more RGB colors than in nature" statement and my attempt to show that it is incorrect have NOTHING to do with your long and unnecessary explanation of a quantization problem. You missed the point by a mile.



    Regardless of how many bits you have for expressing colors, you will neither increase the native color gamut of a device with more bits (color gamut doesn't depend on quantization), NOR will you be able to express "more RGB colors than present in nature" with them, because there is only a limited number of electrons (with which the RGB info will be sent) in the universe; hence, at most, you will be able to have exactly the same number of colors that nature can provide and no more than that.



    If something appears to be nonsense, it might be just your lack of knowledge or insight that prevents you from understanding.




    I'm sorry I have inadvertently caused you great butthurt, but the fact is 10-bit DOES make a visible difference even within a monitor's limited gamut.

    And yes, your "So, if you consider all these, I don't really see why would you need 'more RGB colors than there are in nature'" is just crap.

  • Reply 53 of 74
    Quote:
    Originally Posted by IndyFX View Post

     



    I'm sorry I have inadvertently caused you great butthurt, but the fact is 10-bit DOES make a visible difference even within a monitor's limited gamut.

    And yes, your "So, if you consider all these, I don't really see why would you need 'more RGB colors than there are in nature'" is just crap.




    It appears that I have prematurely asked you about physics and I apologize for that.

    Instead, I should have questioned your reading comprehension skills.



    Where did I say that 10 bits will not make a visible difference? I didn't say that.

    I stated only that you can't change the native gamut of any device by simply supplying a signal with more color information in it.

    The native max device gamut is determined by the manufacturing process and materials used and has nothing to do with quantization of the signal.



    "is just crap"

    Also notice that you made unsubstantiated claims and provided no factual support for them.

  • Reply 54 of 74
    indyfx Posts: 321 member
    Quote:
    Originally Posted by Anton Zuykov View Post

     



    It appears that I have prematurely asked you about physics and I apologize for that.

    Instead, I should have questioned your reading comprehension skills.



    Where did I say that 10 bits will not make a visible difference? I didn't say that.

    I stated only that you can't change the native gamut of any device by simply supplying a signal with more color information in it.

    The native max device gamut is determined by the manufacturing process and materials used and has nothing to do with quantization of the signal.



     




    Alright... this is my last post because you seem to be talking in circles. The bit depth determines the number of steps a monitor's gamut is divided into. 8 bits is not enough: even with a monitor's limited gamut (they can't produce all colors), a human can perceive "steps" in color gradients with 8-bit RGB input. With 10 bits the eye can no longer detect the difference between those steps.

  • Reply 55 of 74
    Quote:

    Originally Posted by Anton Zuykov View Post

     



    That is not possible even theoretically. What you said shows that you have no idea what color is. 

    Color is just a particular spectrum of EM waves of various wavelengths (or frequencies, if you like).

    You can't make something that is not possible in nature, because everything IS part of nature or the Universe. And it already HAS all wavelengths present, or at least the possibility of them existing. The range of EM wave frequencies is [0 .. infinity] Hz and there is no limit on how complex a spectrum can be. All of this is already included in the existing universe. How one could go outside that [0..inf] range is beyond my understanding.

    On top of that, you can only see a very, VERY tiny portion of those wavelengths, and even within that, the bare human eye can't distinguish the wavelength of yellow from a spectrum of red wavelength + green wavelength, instead treating them as if they are identical.



    So, if you consider all these, I don't really see why would you need "more RGB colors than there are in nature". 




    My post was a joke about marketing hyperbole and spec chasing. Stop with the physics lesson and calling me ignorant.

  • Reply 56 of 74
    dacloo Posts: 890 member
    That is true, but the "technology to human interpretation" doesn't work like that in a literal sense. Even though the human brain/eye has limits in detecting 10-bit colors, it does make a difference. When you're looking at a photo, you focus on one specific area at a time. Within that area you only see a subset of the photo's total color spectrum. There, the difference between an 8-bit and a 10-bit image matters significantly.

    joelsalt wrote: »
    The human eye is estimated to be only able to see 10 000 000 colours anyway.  There are other reasons for going higher depth though, like alpha channels and stuff I don't understand.
  • Reply 57 of 74
    Marvin Posts: 15,309 moderator
    indyfx wrote: »
    OK, here is the deal: with 8 bits per channel (R, G & B each have 2^8 or 256 discrete steps between on and off) the eye can see discrete "jumps" (these can occur in luma or saturation, particularly with chroma shift). At 10 bits per channel (each of R, G & B has 2^10 or 1024 discrete steps) your eye can no longer see the difference between the discrete steps.

    Yeah, people should be able to see banding artifacts with 8-bit gradients (these aren't compression artifacts as it's a lossless image):

    [attached image: 8-bit gradient test]

    With uniform colors like blue sky or green grass, the banding will be more noticeable, although some of it in photos is from compression:

    https://upload.wikimedia.org/wikipedia/commons/c/ce/Hutton_in_the_Forest_4K.jpg

    TVs are moving to HDR imagery so the authoring needed to improve for that to be worthwhile:

    http://www.flatpanelshd.com/focus.php?subaction=showfull&id=1435052975

    "Dolby says that we need 12-bit per channel (36 bit total) whereas the rest of the industry seems to think that 10-bit per channel (30-bit) is enough for now. As you probably know, today’s TVs typically use 8-bit per channel. If we really, really wanted to use the old EOTF gamma, developed for analog CRTs, we would need something like 14 or even 16-bit for HDR. That is impossible and unpractical."

    It's surprising that it's taken so long to get so little in the way of mainstream bit-depth improvements. 8-bit was clearly not the endpoint.
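
    If anyone wants to generate their own banding test image rather than trust a forum attachment, a minimal sketch with numpy and Pillow (the dimensions are arbitrary):

        # Write a plain 8-bit horizontal gray ramp as a lossless PNG; on an 8-bit
        # display the steps should be visible, especially toward the dark end.
        import numpy as np
        from PIL import Image

        width, height = 1024, 256
        ramp = np.linspace(0, 255, width)                          # smooth 0..255 gradient
        img = np.tile(ramp, (height, 1)).round().astype(np.uint8)  # repeat the row vertically
        Image.fromarray(img).save("ramp_8bit.png")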
  • Reply 58 of 74
    polymnia Posts: 1,080 member
    Marvin wrote: »
    Yeah, people should be able to see banding artifacts with 8-bit gradients (these aren't compression artifacts as it's a lossless image):

    [attached image: 8-bit gradient test]

    With uniform colors like blue sky or green grass, the banding will be more noticeable, although some of it in photos is from compression:

    https://upload.wikimedia.org/wikipedia/commons/c/ce/Hutton_in_the_Forest_4K.jpg

    TVs are moving to HDR imagery so the authoring needed to improve for that to be worthwhile:

    http://www.flatpanelshd.com/focus.php?subaction=showfull&id=1435052975

    "Dolby says that we need 12-bit per channel (36 bit total) whereas the rest of the industry seems to think that 10-bit per channel (30-bit) is enough for now. As you probably know, today’s TVs typically use 8-bit per channel. If we really, really wanted to use the old EOTF gamma, developed for analog CRTs, we would need something like 14 or even 16-bit for HDR. That is impossible and unpractical."

    It's surprising that it's taken so long to get so little in the way of mainstream bit-depth improvements. 8-bit was clearly not the endpoint.

    Just playing devil's advocate here.

    The gradient you posted doesn't include any dithering, even though dithering is pretty standard practice in image editing. In addition to optically masking the shortcomings of the limited bits available in the display system, I generally find a small bit of noise looks much more natural. Whenever I use a brush in Photoshop, I enable a bit of noise. On many adjustments, especially long smooth gradients, I add a 50% gray layer in Overlay mode and apply some Noise to it to introduce dithering if the original artwork doesn't include it.

    Not that I disagree with the ideal that a smooth, noise-free gradient should render without visible banding, but there are techniques to mitigate this particular issue. And since it is only apparent in synthetic artwork (all video/photographic captures come with natural noise, for free!) it's easy to author such images that display nicely.
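
    A minimal sketch of that kind of dithering (just additive noise before quantizing; the half-level amplitude and uniform noise are arbitrary choices, not a Photoshop setting):

        # Add sub-level noise to a smooth float ramp before quantizing to 8 bits,
        # a crude stand-in for the 'gray layer + noise' trick described above.
        import numpy as np
        from PIL import Image

        width, height = 1024, 256
        ramp = np.tile(np.linspace(0.0, 255.0, width), (height, 1))  # smooth float gradient
        noise = np.random.uniform(-0.5, 0.5, ramp.shape)             # less than one code value of noise
        dithered = np.clip(np.round(ramp + noise), 0, 255).astype(np.uint8)
        Image.fromarray(dithered).save("ramp_8bit_dithered.png")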
  • Reply 59 of 74
    hmm Posts: 3,405 member
    Quote:
    Originally Posted by Marvin View Post





    Yeah, people should be able to see banding artifacts with 8-bit gradients (these aren't compression artifacts as it's a lossless image):







    With uniform colors like blue sky or green grass, the banding will be more noticeable, although some of it in photos is from compression:



    https://upload.wikimedia.org/wikipedia/commons/c/ce/Hutton_in_the_Forest_4K.jpg



    TVs are moving to HDR imagery so the authoring needed to improve for that to be worthwhile:



    http://www.flatpanelshd.com/focus.php?subaction=showfull&id=1435052975



    "Dolby says that we need 12-bit per channel (36 bit total) whereas the rest of the industry seems to think that 10-bit per channel (30-bit) is enough for now. As you probably know, today’s TVs typically use 8-bit per channel. If we really, really wanted to use the old EOTF gamma, developed for analog CRTs, we would need something like 14 or even 16-bit for HDR. That is impossible and unpractical."



    It's surprising that it's taken so long to get so little in the way of mainstream bit-depth improvements. 8-bit was clearly not the endpoint.

    None of these things are absolute. If you consider the discrete gamut of the device, where each visually distinct coordinate is plotted with respect to some suitable reference color model (e.g. LAB, although it's not perfect), the number of bits required for smoothness is at least somewhat correlated with the smallest difference between two unique coordinate values plotted in that reference model. I mentioned Euclidean distance before, but there are more sophisticated formulas which are still used with existing color models.

     

    Anyway, I suspect part of it is an issue of bandwidth. These are serial connections, so they're streaming data. I don't think anyone claimed 10 bits was the end of everything. I actually claimed that it's arbitrary unless the implementation makes good use of those bits. Otherwise it's spec porn. You may recall that during the early 2000s consumer scanners advertised 48-bit color (spec porn), while higher-end ones only advertised 32-36 bit color to avoid digitizing additional noise.

     

    Also, 16-bit HDR is entirely possible, but I suspect they would move to a different type of encoding. Note that HDR encodings tend to be gamma 1.0 rather than 2.2-2.4. They're also predominantly floating-point types, so a number of the bits dictate the upper and lower bounds of the other bits. EXR's half format uses 10 of its 16 bits for the mantissa; the others are sign and exponent. They wouldn't use the same thing here, though.
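
    If the sign/exponent/mantissa split sounds abstract, here is a quick way to peek at the bit layout of a 16-bit half float (the same 1 + 5 + 10 split used by OpenEXR's half type); the sample value is arbitrary:

        # Pack a number as an IEEE 754 half float and pull the raw bits back apart.
        import struct

        value = 0.72                                              # arbitrary sample value
        (bits,) = struct.unpack('<H', struct.pack('<e', value))   # '<e' = little-endian half float
        sign     = (bits >> 15) & 0x1                             # 1 sign bit
        exponent = (bits >> 10) & 0x1F                            # 5 exponent bits
        mantissa = bits & 0x3FF                                   # 10 mantissa bits
        print(f"{bits:016b} -> sign={sign} exponent={exponent} mantissa={mantissa}")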

  • Reply 60 of 74
    polymnia Posts: 1,080 member
    Quote:

    Originally Posted by Marvin View Post





    Yeah, people should be able to see banding artifacts with 8-bit gradients (these aren't compression artifacts as it's a lossless image):







    With uniform colors like blue sky or green grass, the banding will be more noticeable, although some of it in photos is from compression:



    https://upload.wikimedia.org/wikipedia/commons/c/ce/Hutton_in_the_Forest_4K.jpg



    TVs are moving to HDR imagery so the authoring needed to improve for that to be worthwhile:



    http://www.flatpanelshd.com/focus.php?subaction=showfull&id=1435052975



    "Dolby says that we need 12-bit per channel (36 bit total) whereas the rest of the industry seems to think that 10-bit per channel (30-bit) is enough for now. As you probably know, today’s TVs typically use 8-bit per channel. If we really, really wanted to use the old EOTF gamma, developed for analog CRTs, we would need something like 14 or even 16-bit for HDR. That is impossible and unpractical."



    It's surprising that it's taken so long to get so little in the way of mainstream bit-depth improvements. 8-bit was clearly not the endpoint.



    I made some samples based on your gradient image showing various amounts of dither to mitigate the banding. These techniques would be standard operating procedure if I were tasked with building a gradient like this.

     



    It's worth noting that the above inline image is NOT the original as saved out of Photoshop in both my & Marvin's posts. You need to go into the image gallery tool and click on the 'Original' link to get the actual, unprocessed original file.
