Macs and pixel density

Posted:
in Future Apple Hardware edited January 2014
I would like to start a discussion on the topic of resolution on Apple displays. Personally I think that Apple's line of computers has screens with too low a pixel density. 1440x900 is, in my mind, not satisfactory on a 17-inch laptop. On the other hand, there is Dell selling 15-inch laptops with a 1600x1200 resolution. This is equally wrong the other way around. There must be a middle ground that allows for higher-resolution displays without making them almost unreadable (as Dell does). I think something must be done at Apple. What do you think?
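As a back-of-the-envelope check on the numbers in this thread, pixel density follows directly from the pixel grid and the diagonal size. A minimal sketch (the diagonal sizes are approximate, and the density values are rounded):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display, from its pixel grid and diagonal size."""
    return math.hypot(width_px, height_px) / diagonal_in

# The displays discussed in this thread (diagonals approximate):
print(round(ppi(1440, 900, 17.0)))   # 17" PowerBook, ~100 ppi
print(round(ppi(1600, 1200, 15.0)))  # 15" Dell, ~133 ppi
print(round(ppi(1024, 768, 12.1)))   # 12" iBook, ~106 ppi
```

This makes the complaint concrete: the Dell panel packs roughly a third more pixels per inch than the PowerBook's ~100 ppi.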

Comments

  • Reply 1 of 35
    amorphamorph Posts: 7,112member
    Whether a resolution is "too high" depends on how good your eyesight is. We have a bunch of those 1600x1200 15" Dells, and every single person using them runs them at 1024x768, many with large fonts enabled as well.



    I'd say, given the general state of eyesight at least in the US, that Apple is pushing it with the current PowerBooks (and they've long since abandoned WYSIWYG).



    One answer is for Apple to offer multiple screen options for each model, which complicates ordering, manufacturing and software design, but gives people the option of ordering a Squintronic(TM) display if they really want one.



    The other option is a resolution independent GUI, which is a good ways off, but which would allow Apple to crank the screen resolution as high as they pleased - and, for the first time, more would unambiguously be better.
  • Reply 2 of 35
    lucaluca Posts: 3,833member
    One time I used a Dell laptop with a 15.4" screen at 1920x1200. The sharpness was incredible, and it ran UT2003 perfectly at max settings and 1600x1200. But the text was tiny and hard to read. However, to make up for it, some of the UI elements were larger (in number of pixels), making close boxes, the taskbar, etc, all about the same physical size.



    Text on web pages was tough to read, and any text from an app that wasn't scaled up was also hard to read. But I sure wouldn't mind having a huge resolution on my screen. 1920x1200 might be a bit much, but 1680x1050 or something similar wouldn't be all that bad.



    EDIT: Just to be clearer... I think maybe the 17" should get a 1680x1050 resolution, and maybe the 15" could get 1440x900, while the 14" iBook goes to 1280x960 and the 12" iBook and PowerBook stay at 1024x768. Especially if you could do a systemwide UI and text zoom, so all text and UI elements on the screen are slightly larger. I mean, if MS can make a UI zoom feature, I'm sure Apple could make one 10x better.
  • Reply 3 of 35
    Quote:

    Originally posted by KANE

    I would like to start a discussion on the topic of resolution on Apple displays. Personally I think that Apple's line of computers has screens with too low a pixel density. 1440x900 is, in my mind, not satisfactory on a 17-inch laptop. On the other hand, there is Dell selling 15-inch laptops with a 1600x1200 resolution. This is equally wrong the other way around. There must be a middle ground that allows for higher-resolution displays without making them almost unreadable (as Dell does). I think something must be done at Apple. What do you think?



    Could not agree more: I'd like to get a PowerBook with the same pixel density as the iBook. This would make the 15-inch about 1400x1000 and the 17" about 1600x1000. I think Dell using 1900 pixels on a 17-incher is nuts, but Apple is way too low, I agree.



    Gimme more screen res and I'll replace my iBook ;-)
  • Reply 4 of 35
    shetlineshetline Posts: 4,695member
    The problem with pixel densities like you get with a 1600x1200 15" display is that the density isn't high enough to do a good job of treating the display as a resolution-independent device, yet the pixels are too tiny for when bitmapped graphics and fonts are mapped one-for-one, pixel by pixel.



    As far as I'm concerned, the whole range of pixel densities from about 120 dpi up through maybe 200 dpi or higher is pretty much useless. Get display resolutions up to 300 dpi, and users will be able to choose their own trade-off between information density and ease of readability, over a wide range of pleasingly crisp and sharp virtual resolutions.



    Since we don't have 300 dpi computer displays yet (well, at least not affordable ones), I'm much happier with Apple's choice of trying to stick with densities in the 90-100 dpi range. On the 12" PowerBook, I wouldn't mind seeing Apple push just a little further -- up to around 112 dpi -- just to give me some decent screen real estate without me having to auto-hide my Dock. (I really find auto-hide obnoxious!)
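shetline's 300 dpi argument can be made concrete: at that density, several useful virtual resolutions divide evenly into the physical grid, so the user can trade information density against readability without blur. A rough sketch (the virtual-dpi values here are illustrative, not anything Apple shipped):

```python
def scale_factor(physical_dpi, virtual_dpi):
    """How many physical pixels render each 'virtual' pixel."""
    return physical_dpi / virtual_dpi

# At 300 dpi, these virtual densities all land on integer scale
# factors, so each virtual pixel maps to a clean block of real pixels:
for virtual in (75, 100, 150):
    print(virtual, scale_factor(300, virtual))
```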
  • Reply 5 of 35
    thttht Posts: 3,062member
    Build-To-Order higher resolution LCD screens on notebooks and iMacs would satisfy everyone. The only question would be whether Apple's manufacturing prowess would be good enough to do it reasonably. Oh, I guess the real question would be why doesn't Apple offer BTO higher resolution screens?
  • Reply 6 of 35
    I agree about the BTO option. In the PC world I like the ability to spend $100 for a UXGA screen but go for the slower processor to balance it out.



    Apple's BTO options aren't that varied and maybe that's what helps them in their inventory. But we're talking about personalization. I would kill for an ultra hi-resolution on my laptop. I already have the fonts and icons at their smallest sizes and would just love more real estate without compromising physical size.
  • Reply 7 of 35
    yevgenyyevgeny Posts: 1,148member
    Quote:

    Originally posted by Amorph

    Whether a resolution is "too high" depends on how good your eyesight is. We have a bunch of those 1600x1200 15" Dells, and every single person using them runs them at 1024x768, many with large fonts enabled as well.



    I run my 15-inch 1600x1200 Dell at its max resolution. I do have glasses, but they are a very light prescription (I can view my Dell well even without glasses, I just don't do so for ten hours a day). I don't use the big fonts for anything because they look completely silly, and they would probably help me to see where my GUIs aren't fully compliant with the Win32 programming spec (I will let the QA people tell me where my GUI errors are - I despise MFC programming).



    I would agree that Apple's pixel densities are too low. I understand why they are doing it, but screen real estate seems to be in demand under OS X. I would rather Apple offer 1280x1024 and give the user the chance to lower the resolution in software. OS X does a better job than XP of making larger fonts look good.
  • Reply 8 of 35
    yevgenyyevgeny Posts: 1,148member
    Quote:

    Originally posted by Amorph

    The other option is a resolution independent GUI, which is a good ways off, but which would allow Apple to crank the screen resolution as high as they pleased - and, for the first time, more would unambiguously be better.



    OS X is much closer to this than XP is. Both OS X and XP do a good job of font smoothing.
  • Reply 9 of 35
    amorphamorph Posts: 7,112member
    Quote:

    Originally posted by Yevgeny

    OS X is much closer to this than XP is. Both OS X and XP do a good job of font smoothing.



    But Aqua is absurdly fond of bitmaps, and Interface Builder (along with the Classic/Carbon GUI building tools) assumes absolute size and positioning. You couldn't go fully resolution independent without breaking things all over the place, or simply mapping large portions of the GUI to virtual pixels - which looks like crap on most LCDs (it's the same as downsampling to a lower resolution).



    When the GUI is vector-drawn and the widget classes and layout look more like Tk than Toolbox, then we'll really be there.
  • Reply 10 of 35
    shetlineshetline Posts: 4,695member
    Quote:

    Originally posted by Yevgeny

    I would rather Apple offer 1280x1024 and give the user the chance to lower the resolution in software.



    The problem with this option is that a native 1280x1024 display pretending to be a 1024x768 display looks awful. Non-bitmapped fonts will do okay, but think about what happens to a nice, crisp bitmap that would have been displayed pixel-for-pixel over a width of 200 pixels when it's stretched over 250 pixels: every 4x4 block of pixels has to somehow become a 5x5 block.



    Sure, the extra screen real estate is nice. But for the user who finds a given native resolution too small, you're only offering two choices: sharp but too small, or bigger and blurry.
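The 4x4-into-5x5 problem shetline describes is just non-integer nearest-neighbour scaling. A one-dimensional sketch (illustrative only, not how Quartz actually resamples):

```python
def stretch_row(row, new_width):
    """Nearest-neighbour horizontal stretch: each target pixel samples
    the nearest source pixel, so some source pixels get duplicated."""
    old_width = len(row)
    return [row[i * old_width // new_width] for i in range(new_width)]

# A 4-pixel pattern stretched to 5 pixels: one pixel gets doubled while
# its neighbours don't, which is why a crisp bitmap turns lumpy when a
# 1024-wide image is shown on a 1280-wide panel.
print(stretch_row([10, 20, 30, 40], 5))  # [10, 10, 20, 30, 40]
```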
  • Reply 11 of 35
    airslufairsluf Posts: 1,861member
  • Reply 12 of 35
    programmerprogrammer Posts: 3,409member
    Quote:

    Originally posted by Amorph

    But Aqua is absurdly fond of bitmaps, and Interface Builder (along with the Classic/Carbon GUI building tools) assumes absolute size and positioning. You couldn't go fully resolution independent without breaking things all over the place, or simply mapping large portions of the GUI to virtual pixels - which looks like crap on most LCDs (it's the same as downsampling to a lower resolution).



    When the GUI is vector-drawn and the widget classes and layout look more like Tk than Toolbox, then we'll really be there.




    It shouldn't be that bad, actually -- if running Quartz Extreme and using the GPU's bilinear or anisotropic filtering then the bitmaps would scale reasonably well.
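For comparison with Programmer's suggestion, here is a rough one-dimensional version of the linear filtering a GPU applies when stretching a bitmap (illustrative only; real bilinear filtering blends along both axes):

```python
def linear_stretch_row(row, new_width):
    """1-D linear interpolation (one axis of bilinear filtering):
    each target pixel is a weighted blend of its two nearest
    source pixels."""
    old_width = len(row)
    out = []
    for i in range(new_width):
        # position of this target pixel in source coordinates
        x = i * (old_width - 1) / (new_width - 1)
        lo = int(x)
        hi = min(lo + 1, old_width - 1)
        frac = x - lo
        out.append(row[lo] * (1 - frac) + row[hi] * frac)
    return out

# A hard black/white edge picks up an intermediate grey value --
# smoother than the lumps of nearest-neighbour scaling, but visibly
# softer than a native-resolution edge.
print(linear_stretch_row([0, 0, 255, 255], 5))
```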
  • Reply 13 of 35
    ast3r3xast3r3x Posts: 5,012member
    Is there a reason that everything related to computers isn't vector? I mean, for images, sure, it's understandable, but everything else (including text) is probably made with a graphics application anyway. Why not just have everything in a vector style? Wouldn't things look and work better?



    (Probably a stupid question but it seems feasible to me)
  • Reply 14 of 35
    bartobarto Posts: 2,246member
    With images it is understandable. It would be a pain (design- and performance-wise) to make Mac OS X's window controls and buttons vector. OpenGL, on the other hand...



    It is doable; the problem is the massive effort associated with a massive leap from bitmap- and pixel-based interfaces to vector, OpenGL and display-size (cm) based interfaces. It's on the same scale as moving from ASCII to bitmap graphics. Maybe with 10.4.



    Barto
  • Reply 15 of 35
    ast3r3xast3r3x Posts: 5,012member
    Quote:

    Originally posted by Barto

    With images it is understandable. It would be a pain (design- and performance-wise) to make Mac OS X's window controls and buttons vector. OpenGL, on the other hand...



    It is doable; the problem is the massive effort associated with a massive leap from bitmap- and pixel-based interfaces to vector, OpenGL and display-size (cm) based interfaces. It's on the same scale as moving from ASCII to bitmap graphics. Maybe with 10.4.



    Barto




    I figured vector graphics wouldn't be too much for modern computers, but thinking about it, maybe I'm wrong, considering resizing in PS... not like that is any slower than some of Apple's apps.
  • Reply 16 of 35
    amorphamorph Posts: 7,112member
    Quote:

    Originally posted by Programmer

    It shouldn't be that bad, actually -- if running Quartz Extreme and using the GPU's bilinear or anisotropic filtering then the bitmaps would scale reasonably well.



    Yeah, but I don't like hacks, and Apple doesn't do "reasonably well" if they can help it. (viz. the hardware requirements for iChat AV's videoconferencing).
  • Reply 17 of 35
    smirclesmircle Posts: 1,035member
    Quote:

    Originally posted by ast3r3x

    Is there a reason that everything related to computers isn't vector?



    One problem is performance. As we all know, Apple already had a performance problem with their Quartz window layer because of transparencies, software compositing, superior font kerning control and aggressive antialiasing. So (at least I guess) they chose not to make the situation worse by moving to a full-vector UI. All the jellybeans would have to be recalculated from a set of vectors, lighting and shadows each time they are drawn to the screen. This would make Aqua impossible on any sub-GHz machine, I guess.



    The second problem is the two legacy APIs, Carbon and Cocoa. Carbon applications think in screen pixels, so they draw a line from (0,0) to (100,100), which obviously no longer works well if you do not know whether said line is going to be 1cm or 1mm. Apple botched this issue *years* ago when they went from fixed-resolution displays to multiscan CRTs. Back in the really old days, you knew that your display had a resolution of 72 dots per inch, so there was a strict correlation between line length in pixels and line length on your monitor.

    Strangely, Apple just gave up in the years between '94 and '98 and never advanced their Toolbox API (which is now, by and large, Carbon) to abstract away from screen pixels.



    As Cocoa is more modern when it comes to screen coordinates, Cocoa applications could more easily be adapted to resolution-independent displays. Nevertheless, AFAIK Cocoa apps have no notion of whether they draw in cm, inches, mm or points.



    There is - in the long run - only one solution: Apple must move from the classic display model to one where applications draw in a coordinate system with labeled axes. They would then draw from (0mm,0mm) to (1mm,1mm), and the OS would have to know whether this translates to a length of 141 pixels or 1410 pixels.

    They could salvage "old school" Carbon apps by moving them to "virtual pixels", and Cocoa apps to a virtual coordinate system.

    The interesting thing is, this is just what OpenGL - and therefore Quartz Extreme - would allow without rewriting all of Quartz. Beats me why this feature is not in 10.3 already; my guess is that they had their hands full moving to 64-bit.
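Smircle's proposal amounts to a physical-unit drawing API: applications specify lengths in millimetres, and the OS converts to whatever pixel grid the display has. A minimal sketch of the conversion involved (illustrative only, not an actual Apple API):

```python
MM_PER_INCH = 25.4

def mm_to_px(mm, dpi):
    """Convert a physical length in millimetres to device pixels
    for a display of the given density."""
    return mm * dpi / MM_PER_INCH

# The same 10 mm line drawn on a ~100 dpi panel and a 300 dpi panel:
print(round(mm_to_px(10, 100)))  # ~39 px
print(round(mm_to_px(10, 300)))  # ~118 px
```

The line occupies the same physical length on both panels; only the OS-side pixel count changes, which is the whole point of a labeled-axis coordinate system.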
  • Reply 18 of 35
    bartobarto Posts: 2,246member
    The last couple generations of graphics hardware would cope with it.



    "it" being the polygons, anti-aliasing, shading, lighting, textures and everything else. Quartz already does some basic work along these lines in Exposé.



    It's not so much the CPU with a next-gen interface as the graphics hardware. Which actually applies to most next-gen stuff in computing at the moment.



    Barto
  • Reply 19 of 35
    bartobarto Posts: 2,246member
    Quote:

    Originally posted by Smircle

    The second problem is the two legacy APIs, Carbon and Cocoa. Carbon applications think in screen pixels, so they draw a line from (0,0) to (100,100), which obviously no longer works well if you do not know whether said line is going to be 1cm or 1mm. Apple botched this issue *years* ago when they went from fixed-resolution displays to multiscan CRTs. Back in the really old days, you knew that your display had a resolution of 72 dots per inch, so there was a strict correlation between line length in pixels and line length on your monitor.

    Strangely, Apple just gave up in the years between '94 and '98 and never advanced their Toolbox API (which is now, by and large, Carbon) to abstract away from screen pixels.




    Apple Displays are now 100dpi (or 40px/cm) across the range. Apple would just have to assume that 1px == 0.25mm. Considering that Mac OS X, native applications and X11 applications are designed for 100dpi, it shouldn't look too weird. Just use the same co-ordinate system but change the units from 1px to 0.25mm; what the application doesn't know won't hurt it (in most cases). Graphics apps would have to lose their independent rendering system and just draw using the new system (otherwise it would look really fuzzy). Can you think of anything else?
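Barto's 1px == 0.25mm idea can be sketched as a two-step conversion -- legacy pixel coordinates become physical lengths, and those lengths are then mapped onto whatever grid the real panel has (illustrative only; the constant is Barto's assumption, roughly 100 dpi):

```python
MM_PER_INCH = 25.4
LEGACY_MM_PER_PX = 0.25  # Barto's assumption: one legacy pixel = 0.25 mm

def legacy_px_to_device_px(legacy_px, device_dpi):
    """Reinterpret an old pixel coordinate as a physical length,
    then map it onto the actual display's pixel grid."""
    mm = legacy_px * LEGACY_MM_PER_PX
    return round(mm * device_dpi / MM_PER_INCH)

# A 100-'pixel' legacy line stays ~25 mm long on any panel:
print(legacy_px_to_device_px(100, 100))  # ~98 device px at 100 dpi
print(legacy_px_to_device_px(100, 300))  # ~295 device px at 300 dpi
```

Note the slight mismatch even at 100 dpi (98 rather than 100 device pixels), because 0.25mm is only an approximation of a true 100 dpi pixel (0.254mm) -- one reason apps that render pixel-exact would "look really fuzzy" under such a scheme.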



    Barto
  • Reply 20 of 35
    smirclesmircle Posts: 1,035member
    Quote:

    Originally posted by Barto

    Apple Displays are now 100dpi (or 40px/cm) across the range.



    This can't be true if you take the 'Books into account. The 12" has 1024x768, and the 14" does too.





    Quote:

    Apple would just have to assume that 1px == 0.25mm. Considering that Mac OS X, native applications and X11 applications are designed for 100dpi, it shouldn't look too weird. Just use the same co-ordinates system but change the units from 1px to 0.25mm, what the application doesn't know won't hurt in (in most cases). Graphics apps would have to lose their independent rendering system and just draw using the new system (otherwise it would look really fuzzy), can you think of anything else?





    Yes. Modify Quartz Extreme to drop the notion of a screen pixel and treat all pixels as virtual pixels, and simultaneously offer a new, CoreGraphics-based API for those applications that know about cm, inches and so on. Only this way can Apple hope to use 300dpi displays when they get cheaper.



    The technology is there - Quartz Extreme uses OpenGL, which effectively uses virtual pixels, but at the same time QE ensures that there is a 1:1 relationship between virtual and screen pixels. Drop this constraint and you are there (provided the GPU knows how to do bicubic interpolation, else the result will look shitty).