Comments
Originally posted by kittylitterdesign
new PowerBook 17": 1440x900 resolution
old PowerBook 15": 1280x854 resolution
So that's only 46 pixels more vertically and 160 horizontally.
That seems very poor for sharpness with the extra 2 inches taken into account. I saw the new 17 inch and thought the res was nowhere near as sharp as the 15 inch.
I guess seeing will be believing in this case. May sound shitty on paper, but seeing it may be otherwise!
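The sharpness question above comes down to pixel density, which we can actually compute. A small Python sketch (not from the original thread; the 15.2" diagonal for the older PowerBook is an assumption on my part):

```python
import math

def ppi(width_px, height_px, diagonal_in):
    """Pixels per inch of a display: diagonal pixel count over diagonal size."""
    diagonal_px = math.hypot(width_px, height_px)
    return diagonal_px / diagonal_in

# Assumed diagonals: 17.0" for the new PowerBook, 15.2" for the old one.
new_17 = ppi(1440, 900, 17.0)
old_15 = ppi(1280, 854, 15.2)
print(f'17": {new_17:.1f} ppi, 15": {old_15:.1f} ppi')
```

Under those assumptions the older 15" panel is actually slightly denser (~101 ppi vs ~100 ppi), which would explain the impression that the 17" looks no sharper despite its higher pixel count.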
Originally posted by kittylitterdesign
Let's cut to the chase: the new 17-inch PowerBook screen is rubbish and does not live up to the hype... end of thread, thanks for calling!
May I?
SHUT UP!
O.M.G., somebody who actually understands the difference between DPI and PPI?
That's a first! I'm sick and tired of correcting people; DPI has become one of those casual "that'll do" terms.
I guess they like 72 for traditional reasons then, even though things have changed.
Interesting. Knowing relatively little about the Mac's history besides the endless string of business-side debacles, it's neat to note the tradition of logical, ergonomic choices for the print industry, like matching the printer's (periph, not person) dpi to the screen's ppi to make life easier for everyone.
Now, whatever the display may be doing, the printers we use (the peripherals, not the people) do manage to make 12 pt fonts the same size whether we print them on a 300 dpi or a 1200 dpi machine. They must be able to do something that displays generally don't, no? Except with the odd graphic, which sometimes comes out cockeyed from one printer to the next.
Originally posted by Matsu
PS, when I said printers, I should have said Printers -- the people not the peripherals.
I guess they like 72 for traditional reasons then, even though things have changed.
Interesting. Knowing relatively little about the Mac's history besides the endless string of business-side debacles, it's neat to note the tradition of logical, ergonomic choices for the print industry, like matching the printer's (periph, not person) dpi to the screen's ppi to make life easier for everyone.
And also matching both pixels per inch and dots per inch to points, the unit of measure for typesetting and printing. Points are a traditional measurement, yes, but a quick glance at any word processor reveals that nothing has changed; points are still the standard for sizing fonts. What has changed (for the worse) is how accurately screens display point sizes.
Remember, 1 point is (almost) equal to 1/72 inch. So the Mac's screen resolution was chosen to make 12 point text look like 12 point text to someone working in publication, right there on the screen. If you took out a ruler and measured the text, it would come out (about) right.
Things have not changed for the better from this perspective. We have rarely, if ever, seen true WYSIWYG from Apple in a decade or so, and we won't again until screen resolution gets high enough that Apple can break the old 1 pixel = 1 point assumption without turning the UI into a giant, sluggish blur. When this happens, I imagine the print folk will be happy indeed.
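The arithmetic behind that "1 pixel = 1 point" assumption is simple enough to sketch. A minimal Python illustration (mine, not from the thread; the 100 ppi figure is just a representative modern-panel density):

```python
POINTS_PER_INCH = 72  # 1 point is (almost exactly) 1/72 inch

def on_screen_inches(point_size, screen_ppi):
    """Physical size of type rendered under the classic
    1 pixel = 1 point assumption."""
    pixels = point_size           # one pixel per point
    return pixels / screen_ppi    # physical size depends on pixel density

# On the original 72 ppi Mac display, 12 pt type really measured 1/6 inch:
print(on_screen_inches(12, 72))   # matches a ruler: 12/72 = 1/6 inch
# On a denser ~100 ppi panel, the same 12 pixels come out physically smaller:
print(on_screen_inches(12, 100))  # 0.12 inch, no longer a true 12 pt
```

The denser the screen, the further the displayed size drifts from the true point size, which is exactly the WYSIWYG complaint above.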
Quote:
Now, whatever the display may be doing, the printers we use (the peripherals, not the people) do manage to make 12 pt fonts the same size whether we print them on a 300 dpi or a 1200 dpi machine. They must be able to do something that displays generally don't, no? Except with the odd graphic, which sometimes comes out cockeyed from one printer to the next.
Well, they take advantage of the fact that the resolutions of those devices are high enough to reproduce increments of 1 point (~1/72 inch): if not perfectly, then close enough as makes no difference (especially at 1200 dpi). What's a thousandth of an inch between friends, right?
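How close "close enough" is can be quantified: the printer snaps each length to its dot grid, so the worst-case error is half a dot. A small sketch of my own to put numbers on it:

```python
POINTS_PER_INCH = 72  # 1 PostScript point = 1/72 inch

def dots_for_points(points, dpi):
    """Nearest whole-dot rendering of a length given in points,
    plus the resulting error in inches."""
    exact_inches = points / POINTS_PER_INCH
    dots = round(exact_inches * dpi)        # snap to the printer's dot grid
    error = abs(dots / dpi - exact_inches)
    return dots, error

for dpi in (300, 1200):
    dots, err = dots_for_points(1, dpi)
    print(f"1 pt at {dpi} dpi: {dots} dots, off by {err:.5f} in")
```

At 300 dpi a one-point step lands within about half a thousandth of an inch of true; at 1200 dpi it is closer still, hence "what's a thousandth of an inch between friends".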
Graphics can get screwed up for any number of reasons that have nothing to do with resolution.
The trick here is that, as a performance compromise, a pixel has been defined to represent an absolute size, even though actual pixel sizes now vary from screen to screen. Compare printers: a 1200 dpi printer doesn't render text 4 times smaller than a 300 dpi printer, but the way things currently stand, a 1200 ppi monitor would render text 4 times smaller than a 300 ppi monitor.
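The two models can be put side by side in a few lines of Python (my illustration; the 300 and 1200 ppi "monitors" are hypothetical, chosen to mirror the printer figures):

```python
POINTS_PER_INCH = 72

def screen_text_height_in(point_size, ppi):
    """Current screen model: text occupies a fixed number of pixels
    (1 px = 1 pt), so its physical height shrinks as density rises."""
    return point_size / ppi

def printed_text_height_in(point_size, dpi):
    """Printer model: text has a fixed physical height; dpi only
    changes how many dots are used to draw it."""
    return point_size / POINTS_PER_INCH  # note: dpi deliberately unused

ratio = screen_text_height_in(12, 300) / screen_text_height_in(12, 1200)
print(ratio)  # ~4: linear size scales with the density ratio, not its square
print(printed_text_height_in(12, 300) == printed_text_height_in(12, 1200))
```

The ratio is 4x because linear size scales with the density ratio (1200/300); the factor of 16 applies only to area.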
In other words, screen resolution, in the sense of how accurately the screen represents what it's displaying, has never changed or improved. An 'a' in 12-point Times is rendered with the same number of pixels on a 12" iBook's screen as on a 15" iMac's screen, so the displayed size is divorced from the actual point size: if you print that 'a' from your little iBook, the printer will produce a larger letter than you see on screen (the 15" iMac is much closer to WYSIWYG).

For displays with a high pixel density to be meaningful, the old optimization that makes a screen pixel correspond to some absolute size has to be abandoned. Screens have to become like printers, where a letter is a certain size (or a line a certain length and thickness) in absolute, measure-with-a-ruler terms, and the device produces a facsimile as close to that absolute size as its resolution allows. The higher the resolution, the clearer and more accurate the facsimile; a screen inch would correspond to a ruler inch as closely as the screen's resolution allowed. That's what "resolution" really means.

To return to the iBook and the iMac: under this redefinition the iBook could display a 12-point 'a' more accurately, because of its higher pixel density (resolution), while the iMac could display more of the page, because of its greater size in absolute terms.
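The proposed redefinition is easy to sketch: instead of mapping points to pixels one-to-one, each screen converts an absolute point size into however many of its own pixels are needed. A Python illustration of mine; the ~106 ppi and ~86 ppi densities are assumed, era-plausible figures for those two panels, not specs from the thread:

```python
POINTS_PER_INCH = 72

def pixels_for_points(point_size, screen_ppi):
    """Resolution-independent rendering: convert an absolute size in
    points into however many pixels this particular screen needs."""
    return round(point_size * screen_ppi / POINTS_PER_INCH)

# Assumed densities: ~106 ppi for a 12" iBook panel, ~86 ppi for a 15" iMac.
for name, density in (('iBook 12"', 106), ('iMac 15"', 86)):
    px = pixels_for_points(12, density)
    print(f"{name}: 12 pt -> {px} px, measuring {px / density:.3f} in on screen")
```

Both screens now land near the true 12/72 ≈ 0.167 inch, and the denser iBook lands closer, which is exactly the claim above: higher density buys accuracy, larger absolute size buys more visible page.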