Resolution Independence: what are the implications?

Posted in Future Apple Hardware, edited January 2014
If Leopard does indeed feature a resolution-independent GUI, what implications will this have for future hardware? I'm assuming that future displays will feature higher PPIs (like the iPhone), but what will this mean for future graphics cards? Is it reasonable to expect that a new 23-ish" Cinema Display might have to adopt the dual-link interface that its bigger 30" brother uses?



Or will Apple be forced to develop yet another proprietary video interface?
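
As a back-of-the-envelope check on that question, here is a rough sketch. The numbers are my own assumptions, not Apple specs: the original iPhone's ~163 ppi, a 16:10 panel, 60 Hz, and roughly 20% blanking overhead.

```python
# What would a 23" display at iPhone-like density ask of the video link?
import math

ppi = 163                      # assumed: original iPhone density
diagonal_in = 23.0             # hypothetical Cinema Display size
aspect_w, aspect_h = 16, 10    # Cinema Display aspect ratio

diag_factor = math.hypot(aspect_w, aspect_h)
h_px = round(diagonal_in * aspect_w / diag_factor * ppi)   # ~3179
v_px = round(diagonal_in * aspect_h / diag_factor * ppi)   # ~1987

# Very rough pixel clock at 60 Hz with ~20% blanking overhead
pixel_clock_mhz = h_px * v_px * 60 * 1.20 / 1e6
print(f"{h_px}x{v_px} -> ~{pixel_clock_mhz:.0f} MHz")
```

At roughly 455 MHz, that would blow past not just single-link DVI (165 MHz) but dual-link as well (about 330 MHz), which is what makes the interface question interesting.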

Comments

  • Reply 1 of 15
    chucker Posts: 5,089 member
    Quote:
    Originally Posted by Messiah View Post


    Or will Apple be forced to develop yet another proprietary video interface?



    Apple will move to UDI, possibly even by the end of this year, but quite likely some time in 2008.
  • Reply 2 of 15
    karelia Posts: 525 member
    Quote:
    Originally Posted by Messiah View Post


    Or will Apple be forced to develop yet another proprietary video interface?



    I don't care what you say: ADC is my favorite video interface of all time; there just aren't enough monitors for it.
  • Reply 3 of 15
    Quote:
    Originally Posted by Karelia View Post


    I don't care what you say: ADC is my favorite video interface of all time; there just aren't enough monitors for it.



    I concur.
  • Reply 4 of 15
    ianjoh Posts: 4 member
    Quote:
    Originally Posted by Chucker View Post


    Apple will move to UDI, possibly even by the end of this year, but quite likely some time in 2008.



    Intel and Samsung (the two biggest companies in the UDI SIG) officially announced that they're going to fight HDMI with DisplayPort rather than UDI. The Web's consensus is that UDI is dead.



    Higher PPIs can be achieved with either video interface; you only need dual-link DVI to run the 30" Apple Cinema Display!

    It displays 2560x1600 pixels; the same number of pixels on a 20" LCD would come to about 150 ppi.
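
    For anyone who wants to check that figure, pixel density is just the diagonal pixel count divided by the diagonal size. A quick sketch (assuming both panels share the same aspect ratio):

    ```python
    # ppi = diagonal resolution in pixels / diagonal size in inches
    import math

    def ppi(h_px, v_px, diagonal_in):
        return math.hypot(h_px, v_px) / diagonal_in

    print(ppi(2560, 1600, 30))  # 30" Cinema Display: ~101 ppi
    print(ppi(2560, 1600, 20))  # same pixels on a 20" panel: ~151 ppi
    ```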
  • Reply 5 of 15
    hmurchison Posts: 12,425 member
    Quote:
    Originally Posted by ianjoh View Post


    Intel and Samsung (the two biggest companies in the UDI SIG) officially announced that they're going to fight HDMI with DisplayPort rather than UDI. The Web's consensus is that UDI is dead.



    Higher PPIs can be achieved with either video interface; you only need dual-link DVI to run the 30" Apple Cinema Display!

    It displays 2560x1600 pixels; the same number of pixels on a 20" LCD would come to about 150 ppi.



    Interesting. Intel and Samsung are still listed on the UDI website:



    http://www.udiwg.org/home



    Unfreaking believable. I realize that connection standards are boring, but the absolute dearth of reporting on Intel and Samsung moving to DisplayPort is astounding. I found virtually nothing on Google News, but a basic web search brought these up:



    http://www.vr-zone.com/index.php?i=4463



    http://www.vesa.org/press/DP_CES_PR_Finalz.htm



    Yup... UDI is dead. I wonder how the UDI group lost Intel and Samsung. Well, this is one format battle that ended before it really began.



    For the record, I think both technologies suck. Consumers have already been burned by HDMI; you wouldn't believe the number of problems I see on a daily basis with HDMI handshaking and other anomalies. Do I really think DisplayPort, with its heightened encryption, is going to be better? Meh



    Quote:

    DisplayPort includes optional DPCP (DisplayPort Content Protection) copy-protection from Philips, which uses 128-bit AES encryption, with modern cryptography ciphers. It also features full authentication and session key establishment (each encryption session is independent). There is an independent revocation system. This portion of the standard is licensed separately.



    Shudder
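
    For what it's worth, here is a toy sketch of the one idea in that quote that is easy to demonstrate: a fresh, independent 128-bit AES key per session. To be clear, this is not the actual DPCP protocol (whose details are licensed separately); it just illustrates per-session keying, using the third-party Python "cryptography" package:

    ```python
    # Toy illustration of per-session 128-bit AES keying, NOT real DPCP.
    # Requires the third-party 'cryptography' package.
    import os
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM

    def new_session():
        """Each session gets its own fresh 128-bit key, so sessions are
        cryptographically independent of one another."""
        return AESGCM(AESGCM.generate_key(bit_length=128))

    session = new_session()
    nonce = os.urandom(12)                       # unique per message
    protected = session.encrypt(nonce, b"pixel data", None)

    # Compromising or revoking one session's key reveals nothing about
    # any other session, because keys are never reused across sessions.
    ```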
  • Reply 6 of 15
    I think Apple will either stick with DVI or move to HDMI. The former is a rock-solid video interface for high-resolution displays, but the latter is better tailored to media systems because it carries audio along with video. DVI sucks if you want to connect your Mac to a TV, as you have to run both a video cable (DVI to whatever) and an audio cable (optical or analog).



    I vote for HDMI across the board, especially considering you can connect an HDMI-out port to standard DVI (with an adapter), thus maintaining compatibility with all existing DVI displays.
  • Reply 7 of 15
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by ianjoh View Post


    Higher PPIs can be achieved with either video interface; you only need dual-link DVI to run the 30" Apple Cinema Display!

    It displays 2560x1600 pixels; the same number of pixels on a 20" LCD would come to about 150 ppi.



    That hypothetical 20" LCD would then need dual-link DVI in order to drive all those pixels.
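
    To put numbers on that: single-link DVI tops out at a 165 MHz pixel clock, and 2560x1600 at 60 Hz needs well over that. A rough sketch, using approximate reduced-blanking figures rather than official timing specs:

    ```python
    # Approximate pixel clock for 2560x1600 @ 60 Hz.
    # Blanking values below are assumed reduced-blanking-style figures.
    h_active, v_active, refresh_hz = 2560, 1600, 60
    h_blank, v_blank = 160, 46

    pixel_clock_mhz = (h_active + h_blank) * (v_active + v_blank) * refresh_hz / 1e6
    print(f"~{pixel_clock_mhz:.0f} MHz needed vs. the 165 MHz single-link limit")
    # -> roughly 269 MHz, hence the second DVI link
    ```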
  • Reply 8 of 15
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by Karelia View Post


    I don't care what you say: ADC is my favorite video interface of all time; there just aren't enough monitors for it.



    It's a neat concept, but because no one else bothered with it, it was a pointless exercise. I don't think Apple should have made it without good third-party support; now it's a relic that's hard to find adapters for, which made me angry. I was lucky to find one a couple of months ago so I could use my old PowerMac as my HTPC, since my projector doesn't have any ADC nonsense. ADC was also limited in power delivery, and it didn't take long for monitors to become too power-hungry to justify keeping it.



    I've also found an old Apple graphics card that had some sort of pre-DVI/ADC graphics port (with a Silicon Image chip, no less); I wonder if anything ever used that.
  • Reply 9 of 15
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by kscherer View Post


    I vote for HDMI across the board, especially considering you can connect an HDMI-out port to standard DVI (with an adapter), thus maintaining compatibility with all existing DVI displays.



    HDMI might be fine for TVs, but for computer monitors, where the cable is more likely to be bumped or adjusted, I don't think it's acceptable: there is no positive retention or locking mechanism to keep it from working its way out of the socket.
  • Reply 10 of 15
    pippin Posts: 91 member
    Apple had better hurry up and put out a 4K display... I mean, I don't know how I'm going to do online edits of RAW Red One footage otherwise...
  • Reply 11 of 15
    I am a little confused as to why resolution independence would have major implications for future hardware.
  • Reply 12 of 15
    sthiede Posts: 307 member
    Will resolution independence help on lower-resolution monitors as well as on really high-resolution monitors?
  • Reply 13 of 15
    Quote:
    Originally Posted by sthiede View Post


    Will resolution independence help on lower-resolution monitors as well as on really high-resolution monitors?



    I think that's the whole point of resolution independence: to negate, to a degree, the advantage that higher resolutions have over lower-res screens, by measuring a window or menu's size in inches instead of pixels.

    So if you have two 15-inch screens, one at 640x480 and the other at 1280x1024, the windows would look exactly the same size, instead of looking much larger on the 640x480 screen as they do with current screens.

    The only difference is that the desktop would be noticeably sharper on the 1280x1024 screen because of its higher PPI. Things would look to be the same size, just more or less detailed than on the opposing screen.

    Likewise, you could use a really high-resolution screen and, because windows are measured by physical distance, things would not be extremely small as they would be with the current method.

    So, to sum things up: you could "fit" just as much on a low-resolution monitor as on a higher-resolution one without needing to up the resolution itself or add a second monitor; it just wouldn't look quite as nice.
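
    A minimal sketch of the idea (screen numbers made up for illustration): lay out the UI in physical units and convert to pixels per display at draw time.

    ```python
    # Resolution independence in miniature: sizes are specified in inches,
    # and each display converts them to its own pixel count.
    import math

    class Display:
        def __init__(self, h_px, v_px, diagonal_in):
            self.ppi = math.hypot(h_px, v_px) / diagonal_in

    def to_pixels(inches, display):
        """Same physical size everywhere; more pixels where ppi is higher."""
        return round(inches * display.ppi)

    crt_15 = Display(640, 480, 15)     # ~53 ppi
    lcd_15 = Display(1280, 1024, 15)   # ~109 ppi

    window_width_in = 5.0              # a window meant to be 5" wide
    print(to_pixels(window_width_in, crt_15))  # ~267 px: coarser, same size
    print(to_pixels(window_width_in, lcd_15))  # ~546 px: sharper, same size
    ```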
  • Reply 14 of 15
    sthiede Posts: 307 member
    Quote:
    Originally Posted by smashbrosfan View Post


    I think that's the whole point of resolution independence: to negate, to a degree, the advantage that higher resolutions have over lower-res screens, by measuring a window or menu's size in inches instead of pixels.

    So if you have two 15-inch screens, one at 640x480 and the other at 1280x1024, the windows would look exactly the same size, instead of looking much larger on the 640x480 screen as they do with current screens.

    The only difference is that the desktop would be noticeably sharper on the 1280x1024 screen because of its higher PPI. Things would look to be the same size, just more or less detailed than on the opposing screen.

    Likewise, you could use a really high-resolution screen and, because windows are measured by physical distance, things would not be extremely small as they would be with the current method.

    So, to sum things up: you could "fit" just as much on a low-resolution monitor as on a higher-resolution one without needing to up the resolution itself or add a second monitor; it just wouldn't look quite as nice.



    Well, for me that's definitely good news, because right now I'm stuck with a 15" monitor with a not-too-good resolution (1024x768), and I would like to have extra room to work when I'm at my desk on my MacBook. Thanks for helping me understand!
  • Reply 15 of 15
    messiah Posts: 1,689 member
    Quote:
    Originally Posted by ThinkingDifferent View Post


    I am a little confused as to why resolution independence would have major implications for future hardware.



    More pixels to push around?
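
    In rough numbers (display sizes assumed, 32-bit color), that's exactly it: doubling the linear pixel density quadruples the framebuffer the graphics card has to fill on every frame.

    ```python
    # Framebuffer size in MB for a given resolution at 4 bytes per pixel.
    def framebuffer_mb(h_px, v_px, bytes_per_px=4):
        return h_px * v_px * bytes_per_px / 2**20

    print(framebuffer_mb(2560, 1600))  # ~15.6 MB per frame today
    print(framebuffer_mb(5120, 3200))  # ~62.5 MB at double the density
    ```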