Resolution independence in Leopard confirmed by Apple


Comments

  • Reply 81 of 184
    melgross Posts: 32,977 member
    Quote:
    Originally Posted by shetline


    I suppose it all depends what you mean by "appreciate". Maybe we both appreciate different things.



    By appreciate, I mean the normal thing, being able to gain a meaningful advantage from it. 200 ppi does not do that.



    Quote:

    I'm not just talking about mere legibility, I'm talking about visual quality and visual character.



    I know what you mean. But print type quality is determined by more things than rez, which in print is considered to be high at 1,000 dpi. Nothing higher gives any greater quality. This has been agreed upon.



    But to achieve that same quality, you would have us go to 1,000 ppi? You seem to want to equal the quality, but that can't happen. There are also issues of contrast, etc.



    Quote:

    Obviously 12-point fonts are quite easy to read on a computer screen. No issues of basic legibility there. But compare text displayed using a 12-point font on a typical computer display to the same text, in the same font, at exactly the same physical size, printed in a moderately decent quality magazine. They don't look anything alike. The screen font is either chunkier looking than the print font, or blurrier looking if anti-aliasing is switched on, or a bit of both.



    While I admire your interest in quality, it doesn't really matter. At a certain level, it's enough. Wait until you see a display at 150ppi. You will be surprised at how sharp it appears.



    Quote:

    Get resolutions up to 200 dpi and that difference will be greatly diminished. Of course, in the print world, even 200 dpi is crap. But print resolution specifications often relate to binary pixels, pixels without any individual shading, each pixel being a spot where, in on/off fashion, a splash of ink or toner will or will not be applied. When you've got 256 shades for each R, G and B component of every pixel, 200 dpi can look pretty damned good.



    I don't think 150 dpi is quite there. What you really want is for the individual pixels to be just below the threshold of perception, so that scaling artifacts essentially disappear.



    At least you are not demanding the near term impossible requirement of 300ppi. 200 is doable, but at very high cost. The IBM was a small display, 22" diagonal. A large monitor will cost a pretty penny even five years from now, probably even longer than that. 150 is quite doable, and will be fairly priced for a high quality display.
  • Reply 82 of 184
    melgross Posts: 32,977 member
    Quote:
    Originally Posted by shetline


    Huh!? What!?



    Current Photoshop... I zoom to 400%. Each image pixel now occupies 16 display pixels, a 4x4 square.



    Photoshop does not let me edit each of those 16 pixels. I can only edit the underlying single pixel that this big patch of pixels represents. They all change at once, no matter where inside the jumbo pixel I click. I suffer from no confusion or uncertainty about which pixel is being edited.



    Let's zoom to 450%. Now what do we do? We can either have a mix of 4x4, 4x5, 5x4, and 5x5 pixels on the display, or we can have a smoothed-out representation where every pixel's color is exactly represented by at least a 3x3 square of pixels, with edge display pixels being blended in varying proportions with the contributions of neighboring source-image pixels.



    So what? I'm not going to get confused by this and think I can edit those blended display pixels individually without changing the whole underlying source-image pixel, am I?



    Instead of talking in terms of percentages, however, let's imagine the OS is doing the scaling, not Photoshop, we have a 200 dpi display, and we tell the OS to display our image so that each pixel is 1/6" square. In Photoshop terms, this would be 3333.33% magnification, with each pixel mapping to a non-integer 33 1/3 pixels squared.



    Where's the problem? Each pixel's color gets exactly mapped to at least a 32x32 chunk of the display, with an essentially invisibly small single-pixel fringe of blended-color pixels. Where's the confusion?



    You are misunderstanding me.



    You are talking about a high rez monitor that has VERY small pixels, right?



    Ok, so when we view an image at 100% the image will appear to be very sharp. If it is a large file, it will take up a big part of even a big screen.



    But you can't work with those extremely small pixels. You can hardly see each one in isolation.



    So we magnify the image so that each pixel of the file takes up more than one screen pixel. That way we can see each image pixel well, and will have no problem selecting it, or doing whatever we have to. As you say, 4 x 4 pixels on the screen is actually one image pixel. Now we can work with it.



    Pretty simple.



    The confusion is that when YOU tell PS to magnify, it simply does the doubling up or more of exactly what is there.



    When the OS will do it, it will interpolate. That's the difference.
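    To make that distinction concrete, here's a rough pure-Python sketch (my own illustration, not how Photoshop or any OS actually implements scaling). One function replicates source pixels the way a Photoshop-style zoom does; the other averages by coverage, so at non-integer factors only the boundary pixels blend while interior pixels stay pure:

```python
# Nearest-neighbor replication (Photoshop-style zoom) vs. an
# interpolating, coverage-weighted resample (OS-style scaler).
# A 1-D row of grayscale values (0-255) stands in for an image.

def nearest_neighbor(row, factor):
    """Replicate each source pixel; at non-integer factors the
    resulting blocks come out in a mix of widths."""
    return [row[int(i / factor)] for i in range(round(len(row) * factor))]

def box_resample(row, factor):
    """Average the source pixels covered by each destination pixel,
    weighted by coverage -- only the boundary pixels blend."""
    out = []
    for i in range(round(len(row) * factor)):
        lo, hi = i / factor, (i + 1) / factor   # source span covered
        total = weight = 0.0
        j = int(lo)
        while j < hi and j < len(row):
            w = min(hi, j + 1) - max(lo, j)     # coverage of source pixel j
            total += row[j] * w
            weight += w
            j += 1
        out.append(round(total / weight))
    return out

row = [0, 255, 0, 255]            # 1-D "checkerboard"
print(nearest_neighbor(row, 4))   # each pixel becomes a 4-wide block
print(box_resample(row, 4.5))     # crisp interiors, blended edges only
```

    At a 4.5x factor the box resample leaves runs of pure 0s and 255s with single blended pixels at the seams, which is the "crisp jumbo pixels" behavior discussed further down the thread.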
  • Reply 83 of 184
    placebo Posts: 5,767 member
    Another thing about uber-hirez displays is that you'll be able to turn antialiasing off.
  • Reply 84 of 184
    shetline Posts: 4,695 member
    Quote:
    Originally Posted by melgross


    By appreciate, I mean the normal thing, being able to gain a meaningful advantage from it. 200 ppi does not do that.



    Here's a sample of Times New Roman, all 12-point italic, but rendered at 75 dpi (with and without anti-aliasing), and 150, 200, and 300 dpi, all anti-aliased. First I've shown them mapped pixel-to-pixel, then all scaled up to be the same physical size, using, respectively, 8x8, 4x4, 3x3, and 2x2 pixel blocks to represent each original pixel:







    It's pretty obvious 75 dpi sucks. I'll grant you that 150 is a great improvement. But the little extra 200 dpi gives is something I can appreciate, even when this stuff is scaled down to appropriate size. 300 dpi looks maybe even a little better, but there I'll grant you it's probably not worth the huge cost of nearly doubling video bandwidth and memory requirements, along with the more-than-doubled manufacturing costs.



    At any rate, here's the very same font sample image, but marked as 600 dpi instead of 72 dpi like the last one:







    Chances are no one's web browser is going to honor the dpi values and draw this image more than 8 times smaller than the first one, but I didn't want to post this second version as the only image just in case. If you print this image to a printer, honoring the 600 dpi resolution so that the printed image is at the proper physical size (which is tiny -- 0.833"x0.583"), you can judge for yourself if the jump from 150 to 200 is worthwhile.



    Please remember, however, that an important goal of resolution independence isn't just high-quality screen fonts. There's a goal to essentially make pixels disappear, to be able to treat your video display as a seamless projection screen. 150 dpi is good, but I think it falls noticeably shorter of attaining that goal than 200 dpi does.



    If you sit with your face approximately 18" from your screen (I think that's pretty typical), a 200 dpi pixel falls just under the magic size of one arc minute (1/60 of one degree), which is generally regarded as the limit of visual acuity for most humans. A 150 dpi pixel, while still damned small, doesn't quite reach that standard. Of course, just move your face back another 6", to 24" away from your display, and you'll get pretty much the same effect.
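    The arc-minute claim is easy to check numerically. A back-of-the-envelope sketch (one arc minute is the conventional acuity figure, not a hard limit):

```python
import math

def arcmin_subtended(ppi, viewing_inches):
    """Visual angle (in arc minutes) subtended by one pixel of a
    display with the given pixel density, at the given distance."""
    pixel_size = 1.0 / ppi                              # inches per pixel
    angle_rad = 2 * math.atan(pixel_size / (2 * viewing_inches))
    return math.degrees(angle_rad) * 60

for ppi in (150, 200):
    print(f'{ppi} ppi at 18": {arcmin_subtended(ppi, 18):.2f} arcmin')
# 200 ppi comes out just under 1 arcmin; 150 ppi is about 1.27 arcmin
```

    Doubling the viewing distance halves the angle, which is why backing off to 36" makes a 150 dpi pixel subtend well under an arc minute too.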



    Quote:
    Originally Posted by melgross


    I know what you mean. But print type quality is determined by more things than rez, which in print is considered to be high at 1,000 dpi. Nothing higher gives any greater quality. This has been agreed upon.



    But to achieve that same quality, you would have us go to 1,000 ppi? You seem to want to equal the quality, but that can't happen. There are also issues of contrast, etc.



    I brought up the issue of binary pixels vs. shaded pixels for this very reason. The only reason anything as high as 1000 dpi is ever needed is because you need all of those pixels to achieve shading that wouldn't otherwise be there. With 24-bit color, 256 discrete R/G/B levels built right into each pixel itself (no reliance on a large collection of pixels to achieve shading), and good anti-aliasing to take advantage of that pixel shading, 200 dpi is an excellent resolution, even by print standards.
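    The tradeoff behind that point can be sketched with standard halftoning arithmetic (generic math, not specific to any particular printer): a binary device buys gray levels by grouping dots into halftone cells, sacrificing effective resolution, while an 8-bit display pixel carries 256 levels per channel on its own.

```python
# A binary (on/off) device gets shading only by grouping dots into
# halftone cells: an n x n cell yields n*n + 1 gray levels but divides
# the effective resolution by n.

def halftone_tradeoff(device_dpi, cell_size):
    """Effective resolution (lines per inch) and gray-level count for a
    binary device using square halftone cells of cell_size x cell_size."""
    levels = cell_size * cell_size + 1    # 0..n^2 dots turned on
    lpi = device_dpi / cell_size
    return lpi, levels

print(halftone_tradeoff(1000, 4))    # 250 lpi effective, only 17 levels
print(halftone_tradeoff(1000, 16))   # 257 levels, but just 62.5 lpi
```

    So a 1000 dpi binary device needs 16x16 cells to match 8-bit shading, and at that point its effective resolution has dropped well below a 200 ppi shaded display.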



    Quote:
    Originally Posted by melgross


    While I admire your interest in quality, it doesn't really matter. At a certain level, it's enough. Wait until you see a display at 150ppi. You will be surprised at how sharp it appears.



    I'm not disagreeing on the idea that there's a practical limit, I'm just disagreeing by about 50 dpi on where that limit is. I have no doubt I'd find a 150 ppi display impressive, but I suspect I'd be able to detect aliasing artifacts which would practically disappear at 200 ppi.



    Quote:

    At least you are not demanding the near term impossible requirement of 300ppi. 200 is doable, but at very high cost.



    As with all things tech, prices will likely come down over time. Hell, someday 300 dpi could probably be dirt cheap too, but that's a way off, and a "why bother" thing anyway, like some of the insane audio resolutions you can do with the DVD Audio spec. There are encoding modes there which are certainly well beyond any human's perceptual capability -- although you'll never convince the guys with the $24,000 tube amps and the $1000/meter rhodium-plated, yak-fur insulated, oxygen-free copper speaker wires of that.
  • Reply 85 of 184
    shetline Posts: 4,695 member
    Quote:
    Originally Posted by melgross


    You are misunderstanding me.



    I think I've figured out kinda sorta where you're coming from this time. We'll see.



    Quote:

    The confusion is that when YOU tell PS to magnify, it simply does the doubling up or more of exactly what is there.



    When the OS will do it, it will interpolate. That's the difference.



    I believe I see where you're missing what I'm saying. When I say Photoshop tells the OS to draw each pixel as a 1/6" square, I'm not saying that Photoshop hands off an entire image and tells the OS to scale it like it's a 6 dpi image. I'm saying that for each pixel, Photoshop literally instructs the OS to draw a square which is 1/6" on each side. Photoshop does not need to tell the OS to scale that pixel to an integer display pixel width and height in order to accomplish that, however.



    Suppose you have a 4x4 checkerboard of black and white pixels, and you've got a resolution independent display system. I'm guessing that you're imagining if Photoshop does anything other than directly address screen pixels, and dares go through the OS's resolution-independent API to do magnification, you'll get a mess like the one I show below on the left:







    I'm talking about something more like what I show above on the right, where there's no need to force scaling into some nice integer multiple of screen pixels to get a very usable result. If those pixels are really 150-200 dpi pixels, and you scale the way I'm showing on the right, Photoshop wouldn't need to know a damned thing about the display res (other than that it's a "good" res), wouldn't need to address screen pixels directly, yet you'd have a result virtually indistinguishable from what you get with Photoshop today, except that the edges of your jumbo magnified pixels would look exceptionally crisp.
  • Reply 86 of 184
    melgross Posts: 32,977 member
    Quote:
    Originally Posted by shetline


    Here's a sample of Times New Roman, all 12-point italic, but rendered at 75 dpi (with and without anti-aliasing), and 150, 200, and 300 dpi, all anti-aliased. First I've shown them mapped pixel-to-pixel, then all scaled up to be the same physical size, using, respectively, 8x8, 4x4, 3x3, and 2x2 pixel blocks to represent each original pixel:







    It's pretty obvious 75 dpi sucks. I'll grant you that 150 is a great improvement. But the little extra 200 dpi gives is something I can appreciate, even when this stuff is scaled down to appropriate size. 300 dpi looks maybe even a little better, but there I'll grant you it's probably not worth the huge cost of nearly doubling video bandwidth and memory requirements, along with the more-than-doubled manufacturing costs.



    At any rate, here's the very same font sample image, but marked as 600 dpi instead of 72 dpi like the last one:







    Chances are no one's web browser is going to honor the dpi values and draw this image more than 8 times smaller than the first one, but I didn't want to post this second version as the only image just in case. If you print this image to a printer, honoring the 600 dpi resolution so that the printed image is at the proper physical size (which is tiny -- 0.833"x0.583"), you can judge for yourself if the jump from 150 to 200 is worthwhile.



    Please remember, however, that an important goal of resolution independence isn't just high-quality screen fonts. There's a goal to essentially make pixels disappear, to be able to treat your video display as a seamless projection screen. 150 dpi is good, but I think it falls noticeably shorter of attaining that goal than 200 dpi does.



    If you sit with your face approximately 18" from your screen (I think that's pretty typical), a 200 dpi pixel falls just under the magic size of one arc minute (1/60 of one degree), which is generally regarded as the limit of visual acuity for most humans. A 150 dpi pixel, while still damned small, doesn't quite reach that standard. Of course, just move your face back another 6", to 24" away from your display, and you'll get pretty much the same effect.



    First off, I'd like to applaud your work and effort that went into this.



    Secondly, I have agreed that 200ppi is "sharper" than 150. No argument there.



    It's unfortunate that one can't judge what an actual monitor at those rez's would look like from that work. You also can't judge what it will look like on a monitor by printing it out.



    That's the truth, sad though it may be.



    I'd like to know what experience you've had with both 150ppi and 200ppi monitors yourself. Can you personally attest that you can see the difference, and in a way that you would say makes a real difference, or are you going by the numbers alone?



    In the short time I had the IBM monitor in my company, we could not see any advantage to it, and the $9,000 price, plus the special boards needed at the time, just weren't worth it.



    Quote:

    I brought up the issue of binary pixels vs. shaded pixels for this very reason. The only reason anything as high as 1000 dpi is ever needed is because you need all of those pixels to achieve shading that wouldn't otherwise be there. With 24-bit color, 256 discrete R/G/B levels built right into each pixel itself (no reliance on a large collection of pixels to achieve shading), and good anti-aliasing to take advantage of that pixel shading, 200 dpi is an excellent resolution, even by print standards.



    You do realize that type in print is rarely printed out as anything other than a direct image? That is, they aren't printed out as dots at all, but rather as a continuous image? Normally, only raster images are printed out as halftones.



    Quote:

    I'm not disagreeing on the idea that there's a practical limit, I'm just disagreeing by about 50 dpi on where that limit is. I have no doubt I'd find a 150 ppi display impressive, but I suspect I'd be able to detect aliasing artifacts which would practically disappear at 200 ppi.



    You know that 6 point type is, if using the entire 6 points, which each letter does not, only a 12th of an inch high? That lower-case letters are more like a 24th of an inch high?



    The likelihood that you would care about just how accurate it is, is almost nil. That's why we use greeking in the industry. Because no one really cares to look at such fine type. And even bigger letters look fine enough at 110ppi. At 150 they would be more than enough.



    And what you don't know is that the industry has been clamoring for 144dpi monitors for years. Why? Because an inch is 72 points, so at 144 ppi each pixel would be exactly one half point.



    As most type is sized in one or one half point increments, normally, this would be WYSIWYG. The ideal.
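    The arithmetic behind that sweet spot is trivial to verify (72 points per inch is the PostScript convention):

```python
# Pixels per typographic point at a given pixel density.
# At 144 ppi, a half-point step maps to exactly one whole pixel,
# which is the WYSIWYG property described above; 150 ppi gives an
# awkward non-integer mapping instead.

def pixels_per_point(ppi, points_per_inch=72):
    return ppi / points_per_inch

for ppi in (72, 144, 150):
    print(ppi, pixels_per_point(ppi))   # 150 ppi -> 2.083... px per point
```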



    Quote:

    As with all things tech, prices will likely come down over time. Hell, someday 300 dpi could probably be dirt cheap too, but that's a way off, and a "why bother" thing anyway, like some of the insane audio resolutions you can do with the DVD Audio spec. There are encoding modes there which are certainly well beyond any human's perceptual capability -- although you'll never convince the guys with the $24,000 tube amps and the $1000/meter rhodium-plated, yak-fur insulated, oxygen-free copper speaker wires of that.



    I agree with this on all counts. I used to manufacture professional audio equipment, and some of what the industry is doing is preposterous.
  • Reply 87 of 184
    jeffdm Posts: 12,949 member
    I do agree that 150ppi is the next obvious step for desktop displays; at least it's a lot smarter than jumping to 200. The 22" IBM & Viewsonic were simply too expensive and were discontinued pretty quickly. 150 is a lot nicer than 100, and by area has a little over twice the pixels. Even 125 would be a good boost and probably a lot more sensible in terms of cost. I had 100ppi and 125ppi versions of the same notebook and there was a big difference between the two.
  • Reply 88 of 184
    shetline Posts: 4,695 member
    Quote:
    Originally Posted by JeffDM


    150 is a lot nicer than 100, and by area, has a little over twice the pixels. Even 125 would be a good boost and probably a lot more sensible in terms of cost.



    I've seen 133 ppi displays, like a 15" Dell laptop with a 1600x1200 display. My sister had one of those. She found the native res made everything too small and squinty to use. So she ran the thing at 1024x768 (talk about coming down in the world!).



    When you do that, however, everything looks kind of soft-edged and out of focus. I realize that Windows doesn't do resolution-independent scaling of text or anything else, so I'm not seeing anything like the best possible (while still less squinty) use of such a display. Windows was rendering text as if rendering it to a 1024x768 native display, and then the video hardware was scaling pixels from a 1024x768 buffer up by a messy factor of 1.5625 so that they'd fill the native resolution of the display.



    Here's the thing, however: If 133 ppi was good enough to use in a resolution independent way, the scaled-up 1024x768 pixel data would have looked better and sharper than it did. The scaling artifacts weren't small enough, however, to lose that soft-edged fuzziness I was seeing.



    I'm not saying that 133 ppi isn't "nice" for some uses, or that 150 can't or won't "look good". I'm saying that those resolutions fall short of the point where you get the kind of seamless free-form scaling of text and images, to any virtual resolution you'd like, that makes resolution independence really shine.



    150 ppi (or, even better, the 144 ppi res melgross mentioned earlier) does have a nice scaling factor "sweet spot", however, because you can take material intended for a 72-dpi world and, by using an exact x2 scaling factor, get really nice results.



    133 ppi is kind of awkward that way. Since the pixels are still a bit too big at that res to really disappear from view as discrete entities, the "sweet spot" is again a factor of 2, but the results of that scaling are something that only someone who doesn't want to bother putting on reading glasses is going to love -- a virtual 67 ppi "easy reader" mode that makes poor use of screen real estate. You can obviously use other scaling factors (like 1.5625), but the results are less than stunning.
  • Reply 89 of 184
    chucker Posts: 5,089 member
    Quote:
    Originally Posted by shetline


    I realize that Windows doesn't do resolution-independent scaling of text or anything else,



    Windows has had resolution-independent text scaling since 95. It's not implemented well (a lot of apps don't work properly with it), but it's there.
  • Reply 90 of 184
    jeffdm Posts: 12,949 member
    Quote:
    Originally Posted by Chucker


    Windows has had resolution-independent text scaling since 95. It's not implemented well (a lot of apps don't work properly with it), but it's there.



    True. As far as I can tell from trying to use it for a month, it really only changes the fonts and a few related things, like the size of drop-down menus. It's a huge problem when even Microsoft's own apps and applets don't all properly scale up the window size to keep the text from disappearing off the edge of windows and dialogue boxes.
  • Reply 91 of 184
    shetline Posts: 4,695 member
    Quote:
    Originally Posted by melgross


    It's unfortunate that one can't judge what an actual monitor at those rez's would look like from that work. You also can't judge what it will look like on a monitor by printing it out.



    A printout isn't a great means for visually testing this stuff, but it does help give you an idea of where your perceptual limits for small details are.



    Quote:
    Originally Posted by melgross


    I'd like to know what experience you've had with both 150ppi and 200ppi monitors yourself. Can you personally attest that you can see the difference, and in a way that you would say makes a real difference, or are you going by the numbers alone?



    Beyond 133 ppi, I'm going by numbers alone. But 133 ppi looks far enough from being a good basis for really good resolution independence that it's hard to imagine the short hop up to 150 would make me happy.



    Quote:

    You do realize that type in print is rarely printed out as anything other than a direct image? That is, they aren't printed out as dots at all, but rather as a continuous image? Normally, only raster images are printed out as halftones.



    Yes, I'm aware of this. In digital typography, however, there are still dots -- individual pixels -- at play, even if all of those dots are lumped together into contiguous large patches, rather than doing something like half-toning around the edges of character strokes.



    The thing is, however, with an idealized 1000-ppi output medium (everyone else seems to prefer "ppi" over "dpi", so I guess I'll switch -- no real difference except for cases where you can control the size of the individual dots) you could, if for some perverse reason you wanted to, randomize the positions of the pixels in any given 4x4 grid of pixels, so long as the pixel count in that 4x4 area remained the same, and the visual results for a human looking at text printed this way would essentially be the same (when viewed from 15" or greater away). We can't see details as small as 1000 ppi, so the overall shading created by the number of dots is all that really matters (unless you pull out a magnifying glass or try to read something less than 4" from your face).



    Quote:
    Originally Posted by melgross


    You know that 6 point type is, if using the entire 6 points, which each letter does not, only a 12th of an inch high? That lower-case letters are more like a 24th of an inch high?



    Don't get too hung up on just text rendering... nice screen fonts are only part of what I'd want out of a resolution-independent display. An important thing to remember is that a lot of low-res (72 ppi and thereabouts) graphics are out there in the world now and will be with us for a long time. You obviously are going to need to scale those up for most uses on hi-res displays. If you depend on simply doubling the size of such graphics to get good results, you aren't in a true resolution-independent world. Those graphics need to look as good as they possibly can look whether you're scaling them up by 105%, 190%, 224% or down to 88.6%, or whatever else you'd like.



    Extrapolating a bit from what I've seen with 133 ppi displays, I don't think 150 ppi is going to get you quite to that point. Going by the numbers, I believe that the magic number should be more like 200 ppi.
  • Reply 92 of 184
    jeffdm Posts: 12,949 member
    Quote:
    Originally Posted by shetline


    Don't get too hung up on just text rendering... nice screen fonts are only part of what I'd want out of a resolution-independent display.



    For me, the UI elements are an issue. The Apple scroll bars and some other buttons, like the tab close button in Safari, are in my opinion too small to be useful. Others may want them smaller. Currently, there is no way to adjust this that I am aware of.



    Quote:

    Extrapolating a bit from what I've seen with 133 ppi displays, I don't think 150 ppi is going to get you quite to that point. Going by the numbers, I believe that the magic number should be more like 200 ppi.



    The problem is cost effectiveness. 200 is too far ahead of the curve at the moment: for one, quad-link DVI cards are hard to find and expensive at best, and the displays are pricey too, currently only readily available second-hand. Manufacturers will eventually make those displays cheaply enough, but for the moment they're too expensive.
  • Reply 93 of 184
    hiro Posts: 2,663 member
    Quote:
    Originally Posted by Placebo


    Another thing about uber-hirez displays is that you'll be able to turn antialiasing off.



    Not really; the problem is still there, and edges still swim/jitter. Because everything else is sharper, the reduced size of the swimming/jittering pixels isn't enough to make the problem go away.
  • Reply 94 of 184
    shetline Posts: 4,695 member
    Quote:
    Originally Posted by JeffDM


    The problem is cost effectiveness. 200 is too far ahead of the curve at the moment: for one, quad-link DVI cards are hard to find and expensive at best, and the displays are pricey too, currently only readily available second-hand.



    Hey, I'm just talking about what I think it will take for resolution independence to really work well. I can't help it if the only affordable solutions right now would do a half-assed job of it.
  • Reply 95 of 184
    hiro Posts: 2,663 member
    Quote:
    Originally Posted by Chucker


    Windows has had resolution-independent text scaling since 95. It's not implemented well (a lot of apps don't work properly with it), but it's there.



    No. Windows has had the ability for the user to select an alternative point size for system fonts. OS 7 - OS 9 had that same ability. Oh, the screaming on the switch from Chicago to Charcoal for the system font. But you could open a control panel and change it.



    Resolution-independent fonts will always render a 12 point font at the same physical size, regardless of how many pixels it takes to do that. The old OS 7 and Windows solution was/is to allow a user to substitute a larger font to approximate the physical size on the screen that they wanted. I have a Dell laptop sitting to the left here with a 15" 1920 x 1200 widescreen display and have to manually set the system fonts to "Extra Large Fonts" to make them comfortably legible (and I have 20/15 vision!). That display is in the neighborhood of 140dpi and it is ridiculous without true resolution independence.
  • Reply 96 of 184
    shetline Posts: 4,695 member
    Quote:
    Originally Posted by Placebo


    Another thing about uber-hirez displays is that you'll be able to turn antialiasing off.



    Why would you want to turn off anti-aliasing on a very hi-res display?



    At current common display resolutions, anti-aliasing is a bit of a mixed bag. It gives you a better sense of real font shapes and visual character, but at the expense of a terribly obvious and often annoying soft-focus effect. Get up to 200 ppi, or even 150 ppi, and I think you'll find that soft-focus effect goes away, and that anti-aliasing actually helps text look sharper and cleaner. To get the same results without any anti-aliasing, you'd need insane resolutions like 600 ppi or higher.
  • Reply 97 of 184
    chucker Posts: 5,089 member
    Quote:
    Originally Posted by Hiro


    No. Windows has had the ability for the user to select an alternative point size for system fonts. OS 7 - OS 9 had that same ability.



    Er, no, Mac OS Classic had no ability in any way comparable to what Windows 95 has.
  • Reply 98 of 184
    shetline Posts: 4,695 member
    Quote:
    Originally Posted by Hiro


    I have a Dell laptop sitting to the left here with a 15" 1920 x 1200 widescreen display and have to manually set the system fonts to "Extra Large Fonts" to make them comfortably legible (and I have 20/15 vision!). That display is in the neighborhood of 140dpi and it is ridiculous without true resolution independence.



    Can I take it that this is, more precisely, a 15.4" display? If so, the resolution is 147 ppi.
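    That 147 ppi figure follows directly from the pixel dimensions and the diagonal. A quick check (assuming square pixels):

```python
import math

def ppi_from_diagonal(width_px, height_px, diagonal_inches):
    """Pixel density implied by a display's pixel dimensions and
    diagonal size, assuming square pixels."""
    diagonal_px = math.hypot(width_px, height_px)   # pixels along the diagonal
    return diagonal_px / diagonal_inches

print(round(ppi_from_diagonal(1920, 1200, 15.4)))  # the 15.4" Dell: ~147
print(round(ppi_from_diagonal(1600, 1200, 15.0)))  # the 15" 1600x1200: ~133
```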



    How acceptable do you find the results if you abandon trying to use the full 1920x1200 resolution directly, and set the display to function at a lower virtual resolution like 1600x1000 or 1280x800?
  • Reply 99 of 184
    jasenj1 Posts: 922 member
    All this discussion of uber-high quality is nice. But what about people with just plain bad eyesight? The iMac and Mac mini are great for grandmas and grandpas - video iChat, iPhoto sharing, etc. Rez independence is great for those that subscribe to the large print edition of Reader's Digest.



    And then there's the "browse from the couch" crowd - 1920x1080 55" display on the wall attached to a Mac mini & iTV. Rez independence will be good for them, too.



    - Jasen.
  • Reply 100 of 184
    melgross Posts: 32,977 member
    Quote:
    Originally Posted by shetline


    A printout isn't a great means for visually testing this stuff, but it does help give you an idea of where your perceptual limits for small details are.





    Beyond 133 ppi, I'm going by numbers alone. But 133 ppi looks far enough from being a good basis for really good resolution independence that it's hard to imagine the short hop up to 150 would make me happy.





    Yes, I'm aware of this. In digital typography, however, there are still dots -- individual pixels -- at play, even if all of those dots are lumped together into contiguous large patches, rather than doing something like half-toning around the edges of character strokes.



    The thing is, however, with an idealized 1000-ppi output medium (everyone else seems to prefer "ppi" over "dpi", so I guess I'll switch -- no real difference except for cases where you can control the size of the individual dots) you could, if for some perverse reason you wanted to, randomize the positions of the pixels in any given 4x4 grid of pixels, so long as the pixel count in that 4x4 area remained the same, and the visual results for a human looking at text printed this way would essentially be the same (when viewed from 15" or greater away). We can't see details as small as 1000 ppi, so the overall shading created by the number of dots is all that really matters (unless you pull out a magnifying glass or try to read something less than 4" from your face).



    ppi is for monitors, dpi is for print, and spi is for scanners. There are more, but those are the ones mostly used.



    Quote:

    Don't get too hung up on just text rendering... nice screen fonts are only part of what I'd want out of a resolution-independent display. An important thing to remember is that a lot of low-res (72 ppi and thereabouts) graphics are out there in the world now and will be with us for a long time. You obviously are going to need to scale those up for most uses on hi-res displays. If you depend on simply doubling the size of such graphics to get good results, you aren't in a true resolution-independent world. Those graphics need to look as good as they possibly can look whether you're scaling them up by 105%, 190%, 224% or down to 88.6%, or whatever else you'd like.



    Graphics look really good on my Sony. The higher the rez, the less of an improvement you will see.



    I'm sorry, but past 144, or 150, you just don't get the value that you seem to think you will. On a monitor, rather than a conceptualized ideal as you have been using, the difference is almost nonexistent.



    You really have to see that for yourself. It seems to be impossible to explain it here.



    I'm very familiar with graphics. I've been doing this in one way or the other since 1969.



    Quote:

    Extrapolating a bit from what I've seen with 133 ppi displays, I don't think 150 ppi is going to get you quite to that point. Going by the numbers, I believe that the magic number should be more like 200 ppi.



    Try to find a higher rez monitor, and then come back and tell us what you think.