Comments
the ppi (people, please stop saying dpi for monitors!)
Why? Isn't the distinction only relevant for unrelated fields?
Quote:
Why? Isn't the distinction only relevant for unrelated fields?
There are distinctions.
Dpi is for print. Ppi is for displays, and spi is for scanning.
There's also lpi.
That's true. I forgot that one.
Wow, this has turned into a real pissing contest....
No it's not. It's an interesting discussion.
Quote:
Wow, this has turned into a real pissing contest....
I don't think much arguing is going on.
When you write a custom print page, then use your pt/cm layouts.
It's not hard to define fonts with em/ex relationships and let the browser scale accordingly.
Quote:
When you write a custom print page, then use your pt/cm layouts.
Yes, that could be done. It's a better idea than some I've heard.
Chuck, Placebo, melgross and others: few Mac people seem to be aware of the DDC standard, which, among other things, requires the monitor to return the physical dimensions of the display area to the graphics card (this info is transmitted in the interval between successive screen refreshes). You should also distinguish the display resolution from the screen resolution (the maximum display resolution the screen can handle); and do not forget that the horizontal PPI is not necessarily equal to the vertical PPI.
In X11, some of this DDC info can be queried with "xdpyinfo". Curiously enough, X11 on my Mac returns made-up values for the screen dimensions, resulting in 75 PPI both horizontally and vertically (instead of the real 114 PPI on my MacBook, for instance). Perhaps this is done to preserve consistency with other Mac applications, which assume 72 PPI.
To my knowledge, X11 has always had scaling factors that could be set by the user or derived automatically from xdpyinfo's output. The X11 font server is one service that implements resolution independence. At the application level, one example I can think of is an earlier version of the Mozilla browser, which allowed the PPI parameter to be set by the user to something other than 96.
And one last thing: resolution independence or UI scaling involves much more than just scaling graphics and letters: hinting, subpixel positioning, correct handling of rounding errors, bitmap caching. It is a non-trivial task to implement this while maintaining the high typographic/visual quality that we are used to seeing from Apple.
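The PPI arithmetic above is easy to sketch. The 1280 px and 285 mm figures below are illustrative round numbers for a 13.3" MacBook panel, not values from any specific EDID dump:

```python
MM_PER_INCH = 25.4

def ppi(pixels: int, size_mm: float) -> float:
    """Pixels per inch from a pixel count and the matching physical size in mm."""
    return pixels / (size_mm / MM_PER_INCH)

# Roughly a 13.3" MacBook panel: 1280 px across about 285 mm of glass
horizontal_ppi = ppi(1280, 285)    # ~114, matching the figure quoted above
scale_vs_96 = horizontal_ppi / 96  # the factor an app assuming 96 PPI would need
```

This is also where a user-set scaling factor like Mozilla's comes from: the ratio of the real PPI to the assumed one.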
Quote:
Chuck, Placebo, melgross and others: few Mac people seem to be aware of the DDC standard, which, among other things, requires the monitor to return the physical dimensions of the display area to the graphics card (this info is transmitted in the interval between successive screen refreshes).
I did mention that. (EDID is what you mean.)
Quote:
In X11, some of this DDC info can be queried with "xdpyinfo". Curiously enough, X11 on my Mac returns made-up values for the screen dimensions, resulting in 75 PPI both horizontally and vertically (instead of the real 114 PPI on my MacBook, for instance). Perhaps this is done to preserve consistency with other Mac applications, which assume 72 PPI.
On Mac OS X, X11 runs on top of Quartz, which is why your xdpyinfo isn't actually getting real hardware data.
Quote:
And one last thing: resolution independence or UI scaling involves much more than just scaling graphics and letters: hinting, subpixel positioning, correct handling of rounding errors, bitmap caching.
All of which, AFAICT, are already implemented.
Quote:
I did mention that. (EDID is what you mean.) [... "Actually, judging from Wikipedia's EDID page, the maximum viewable size is indeed given. Of course, it's a question of how many screens really support that."]
Actually, all monitors that I have come across in recent years.
Quote:
On Mac OS X, X11 runs on top of Quartz, which is why your xdpyinfo isn't actually getting real hardware data.
So why is it 75 PPI then, instead of 72?
Quote:
All of which, AFAICT, are already implemented.
Sure; I just wanted to stress that resolution independence is a BIG feature of Leopard. The idea is simple, but the implementation is difficult, especially if one has to provide a comfortable upgrade path for all those existing pixel-perfect applications.
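One of the rounding pitfalls mentioned above can be shown in a made-up sketch (this is not how Quartz actually does it). Scaling and rounding each width of a pixel-perfect layout independently lets error accumulate; rounding the accumulated edge positions instead keeps neighbors flush:

```python
SCALE = 1.25                # a non-integer UI scale factor
widths = [10, 10, 10, 10]   # four abutting 10 px columns in a pixel-perfect layout

# Naive: scale and round each width on its own
per_item = [round(w * SCALE) for w in widths]
drift = sum(per_item) - round(sum(widths) * SCALE)
# round(12.5) == 12 in Python (banker's rounding), so every column comes out
# 12 px and the row ends up 2 px short of the scaled layout: drift == -2

# Safer: round accumulated edge positions, so adjacent columns share an edge
edges, pos = [0], 0.0
for w in widths:
    pos += w * SCALE
    edges.append(round(pos))
snapped = [b - a for a, b in zip(edges, edges[1:])]
# Individual widths vary by 1 px, but the total is exactly the scaled 50 px,
# with no gaps or overlaps between columns.
```

The uneven 12/13 px widths are unavoidable at a 1.25x scale; the point is that the error must be distributed deliberately rather than allowed to drift.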
Quote:
I don't think much arguing is going on.
I guess "pissing" may not be the right word... but for me, it is a very educational pissing contest.
Quote:
Chuck, Placebo, melgross and others: few Mac people seem to be aware of the DDC standard, which, among other things, requires the monitor to return the physical dimensions of the display area to the graphics card (this info is transmitted in the interval between successive screen refreshes). You should also distinguish the display resolution from the screen resolution (the maximum display resolution the screen can handle); and do not forget that the horizontal PPI is not necessarily equal to the vertical PPI.
In X11, some of this DDC info can be queried with "xdpyinfo". Curiously enough, X11 on my Mac returns made-up values for the screen dimensions, resulting in 75 PPI both horizontally and vertically (instead of the real 114 PPI on my MacBook, for instance). Perhaps this is done to preserve consistency with other Mac applications, which assume 72 PPI.
To my knowledge, X11 has always had scaling factors that could be set by the user or derived automatically from xdpyinfo's output. The X11 font server is one service that implements resolution independence. At the application level, one example I can think of is an earlier version of the Mozilla browser, which allowed the PPI parameter to be set by the user to something other than 96.
And one last thing: resolution independence or UI scaling involves much more than just scaling graphics and letters: hinting, subpixel positioning, correct handling of rounding errors, bitmap caching. It is a non-trivial task to implement this while maintaining the high typographic/visual quality that we are used to seeing from Apple.
As I said earlier, few devices are capable of using such information. It is a standard that is rarely implemented. When it is not, incorrect results are returned, as you found out.
Yes, you are correct about the difficulties that ensue when rez independence is attempted. This is why we have not seen it before. Too many things have to fall into place.
Quote:
Actually, all monitors that I have come across in recent years.
Quote:
So why is it 75 PPI then, instead of 72?
It is not being implemented. Half of the system isn't sufficient.
Quote:
Sure; I just wanted to stress that resolution independence is a BIG feature of Leopard. The idea is simple, but the implementation is difficult, especially if one has to provide a comfortable upgrade path for all those existing pixel-perfect applications.
We don't yet know if it is a feature that will be implemented for the user.
Heh heh. Don't forget that mass and weight are two different things.
If we really want to be accurate, the only place where the gram weighs a gram, accurate to just so many decimal places, is where it was determined in the first place. Move it a few miles, and the gravity is different. It will seem the same, because everything else will change along with it, but it really won't be. Strain-gauge scales will be able to tell, but most conventional scales won't.
That's why using atomic mass is more accurate. Eliminates the gravity field.
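A rough sketch of the distinction (the g values below are textbook approximations): a spring or strain-gauge scale reads force, which tracks the local g, while a beam balance compares two masses under the same g, so gravity cancels out.

```python
# Approximate surface gravity in m/s^2 (textbook values)
g_paris = 9.809
g_equator = 9.780

mass_kg = 1.0  # the same artifact weighed in both places

# A force-reading scale sees m*g, so its reading shifts with location:
force_ratio = (mass_kg * g_equator) / (mass_kg * g_paris)  # ~0.997, i.e. ~0.3% "lighter"

# A beam balance compares m1*g against m2*g at the same spot, so g divides out:
def balances(m1: float, m2: float, g: float) -> bool:
    return abs(m1 * g - m2 * g) < 1e-9

assert balances(1.0, 1.0, g_paris) and balances(1.0, 1.0, g_equator)
```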
Definitely. I kind of figured they (BIPM, NIST, others) weren't weighing the artifacts (as these kilogram prototypes are called) due to just the effects (plus a WHOLE bunch of others!) you mentioned. I did find several very insightful PDFs by BIPM/NIST, and you may be surprised at how they measure the mass while avoiding these effects.
They use a beam balance, which in principle is just like the ones we all used in high school!
Of course these beam balances are precision electro-mechanical-optical devices able to achieve accuracies of up to 1 part in a trillion (10^12), which to my surprise is still about 2 orders of magnitude higher than other current methods (atomic, etcetera). But if all goes according to plan, by 2011 or so they will be able to ditch these artifacts for other methods based on physically known constants.
To use the beam balance they take the primary (plus several copies) and do weigh-offs with newly fabricated ones (i.e., when the balance is level, the copy is good to go). Supersets of 1 kg are fairly straightforward (in fact NIST makes them as large as 28,000 kg!). Subsets are where they get a little more clever; for instance, they step down to 1000+500+200+200+100 g unknowns and two 1000 g knowns using a six-step permutation sequence. A similar procedure is used to obtain even smaller sizes, until you end up with a set of weights like the set of SS weights we all used in high school!
BTW, a metric tonne is just 1000 kg of pure water, which fits into a 1 meter cube; hence the original relationship between the kilogram and the meter, and why rho = 1 g/cc.
PS - I know this is off topic, but I just couldn't resist, it's a subject area I've dealt with for over 30 years now (geometric, kinematic, and dynamic similitude AKA Froude scaling). I hope you'll excuse the interruption!
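The subdivision scheme described above can be sketched as a set of nominally balancing comparisons. This is a simplified, hypothetical design (the piece labels like "200a" are just names for this sketch); the real NIST/BIPM sequences use more steps and solve the measured imbalances, e.g. by least squares, for corrections to each piece:

```python
# Nominal values of the pieces in the subdivision set, in grams
nominal = {"1000": 1000, "500": 500, "200a": 200, "200b": 200, "100": 100}

# Each (left, right) pair is placed on opposite pans and nominally balances;
# the small measured imbalances are the data the corrections are solved from.
comparisons = [
    (["1000"], ["500", "200a", "200b", "100"]),  # 1 kg vs the full subdivision
    (["500"], ["200a", "200b", "100"]),          # 500 g vs its own subdivision
    (["200a"], ["200b"]),                        # like-valued pieces against each other
]

# Sanity check: every comparison balances at nominal value
for left, right in comparisons:
    assert sum(nominal[p] for p in left) == sum(nominal[p] for p in right)
```

The redundancy is the point: with more comparisons than unknowns, each piece's deviation from nominal is over-determined and can be estimated along with its uncertainty.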
Quote:
BTW, a metric tonne is just 1000 kg of pure water, which fits into a 1 meter cube; hence the original relationship between the kilogram and the meter, and why rho = 1 g/cc.
Don't forget the temperature compensation.
There are a couple of screen shots showing more Leopard WWDC preview features on the Aero Experience forum; one of the images shows a semi-complete vector UI!
Check it out!
http://www.aeroxp.org/board/index.php?showtopic=5209
Some cool shots on that site.