Quote:
Right, the lead browser engineer who works on this stuff daily has less understanding of browser measurements and how they work than a guy who once worked in publishing.
That's a wiseguy remark, and it shows that you know nothing. I worked in publishing for far more years than he has been doing his work. I owned a commercial photo lab for 28 years that did a great deal of publishing work for major publications, as well as broadcast stations. We also did more than a bit of web design. I've worked with Adobe since 1992 as well.
Perhaps you should learn more about people.
Quote:
Absolute measurements work great when the dpi stays the same. Unfortunately it doesn't, and Hyatt addresses this by stating that they assume 96 dpi when making browsers. Assuming 96 dpi when it is actually 72 or 120 throws your nice "a cm is a cm" right out the window.
The problem with this is that there has rarely been a monitor that has actually BEEN 96 dpi. Even when there has been, people haven't always run them at the only rez that would work out to 96 dpi. That is why he, and other people doing this work, have had problems. He can't assume anything. That's what I'm saying.
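To make the mismatch concrete, here's a minimal back-of-the-envelope sketch. It assumes only the convention the quote describes (the browser maps 1 in to 96 px); the screen densities are illustrative values, not measurements of any particular monitor:

```python
# What a requested "1 cm" becomes when the browser assumes 96 dpi
# but the glass actually has some other pixel density.

ASSUMED_DPI = 96.0      # the browser's assumption, per the quote above
CM_PER_INCH = 2.54

def physical_cm(requested_cm: float, actual_ppi: float) -> float:
    """Physical size on the glass of a length rendered at ASSUMED_DPI."""
    pixels_drawn = requested_cm / CM_PER_INCH * ASSUMED_DPI  # px the browser paints
    return pixels_drawn / actual_ppi * CM_PER_INCH           # cm those px cover

for ppi in (72.0, 96.0, 110.0, 120.0):
    print(f"on a {ppi:3.0f} ppi screen, a requested 1 cm measures {physical_cm(1.0, ppi):.2f} cm")
```

Only the 96 ppi row comes out right; the old 72 ppi assumption misses by a third, which is the "a cm is a cm" problem in a nutshell.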
Quote:
Now with Resolution Independence and your dpi value, you sure could make a cm a cm. The problem is: how does your system figure out the dpi of your screen? Currently, all your system knows is the number of pixels in your display. It may change in the future that your system can query your display for its dpi, but as a browser maker working with current hardware, Hyatt knows that he cannot ensure a cm actually displays as a cm.
WHEN we get rez independence, things might change. But until then all of the problems remain.
Quote:
Where is the evidence that pixel pitch is going to get much denser in the future?
Very little.
No! Not at all.
Each monitor, at any given rez, has pixels of a given size. But change the monitor, or the rez, and the pixel size changes.
Why are we discussing this at all?
There is not one person on this board who does not understand this.
You understand it as well, intuitively, but you don't seem to realize it.
Fully understood.
But changing the monitor resolution is just another SCALE. Halve the resolution and each logical pixel covers twice the number of underlying physical pixels (in each direction) it covered previously! Obviously the OS knows the monitor resolution, and once the OS knows about the physical space those pixels occupy, there's no need to ASSUME PPI (or DPI), no need to "Super Size" your fonts, and no real need to change the monitor resolution from its maximum (although do it if it makes you feel good), ASSUMING resolution independence works 100% of the time (and I think it will).
Getting into arcane discussions about ems, print points, and "Super Sized" fonts just complicates what seems to me to be a simple, straightforward issue. It's just a multiplier with a known dimension and a known length scale.
It's just such a simple 2D geometry problem ONCE your OS knows the physical length scale (there's a sketch of the arithmetic just after this post).
Why are we discussing this at all? Exactly!
That was my first thought when this discussion started to focus on resolution independence, and someone wanted to negate the physical length scale, and question IF a commonly understood length scale was absolute (in our frame of reference, i.e. flat space) or not!
Fathom these questions. What is the only unit of measurement still used today that has a physical analog that needs to be periodically cleaned? And why is it a necessary physical analog? And how was this unit of measurement originally derived? Hint to the first question: go to the BIPM website. Hint to the second question: if you can't figure this one out, I give up! Hint to the third question: metric tonne.
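Here's that sketch: a minimal, hypothetical illustration of the multiplier argument. The 331 mm viewable width is an assumed value for a 15.4-inch widescreen panel, not a measurement:

```python
# The "simple 2D geometry" point: once the OS knows the glass width,
# true-size drawing at ANY resolution is a single multiplier.

MM_PER_INCH = 25.4
VIEWABLE_H_MM = 331.0   # assumed viewable width of a hypothetical panel

def px_for_mm(length_mm: float, horizontal_pixels: int) -> float:
    """Device pixels that span length_mm of actual glass."""
    return length_mm / VIEWABLE_H_MM * horizontal_pixels

for res_h in (1440, 1152, 720):   # native mode and two scaled-down modes
    ppi = res_h / (VIEWABLE_H_MM / MM_PER_INCH)
    print(f"{res_h:4d} px wide: {ppi:5.1f} effective ppi, a true 1 cm = {px_for_mm(10.0, res_h):4.1f} px")
```

Note that halving the resolution exactly halves both numbers; changing modes really is just another scale, and no assumptions are needed once the 331 mm is known.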
That's right. It's why one can't do that for electronic publications.
PDF is already most of the way there. PDFs can hold a lot of detail in raster and vector form and they generally look good at any scaling level. All the reader software would need is the screen's pixel density. For LCDs, that would be just one more number to track in an OS database.
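For what it's worth, the arithmetic a reader would need is tiny. A minimal sketch, relying only on the fact that PDF user space defaults to 72 units per inch (the 99.6 ppi figure is one quoted later in this thread):

```python
# True-size PDF display: one ratio between the screen's pixel density
# and PDF's 72-units-per-inch default user space.

PDF_UNITS_PER_INCH = 72.0

def true_size_zoom(screen_ppi: float) -> float:
    """Zoom factor at which one PDF inch covers one physical inch of glass."""
    return screen_ppi / PDF_UNITS_PER_INCH

screen_ppi = 99.6                       # example density from this thread
zoom = true_size_zoom(screen_ppi)
print(f"zoom = {zoom:.3f}")                          # ~1.383
print(f"an 8.5 in page spans {612 * zoom:.0f} px")   # 612 PDF units = 8.5 in
```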
Quote:
But changing the monitor resolution is just another SCALE. Halve the resolution and each logical pixel covers twice the number of underlying physical pixels (in each direction) it covered previously! Obviously the OS knows the monitor resolution, and once the OS knows about the physical space those pixels occupy, there's no need to ASSUME PPI (or DPI), no need to "Super Size" your fonts, and no real need to change the monitor resolution from its maximum (although do it if it makes you feel good), ASSUMING resolution independence works 100% of the time (and I think it will).
Getting into arcane discussions about ems, print points, and "Super Sized" fonts just complicates what seems to me to be a simple, straightforward issue. It's just a multiplier with a known dimension and a known length scale.
It's just such a simple 2D geometry problem ONCE your OS knows the physical length scale.
Why are we discussing this at all? Exactly!
That was my first thought when this discussion started to focus on resolution independence, and someone wanted to negate the physical length scale, and question IF a commonly understood length scale was absolute (in our frame of reference, i.e. flat space) or not!
Fathom these questions. What is the only unit of measurement still used today that has a physical analog that needs to be periodically cleaned? And why is it a necessary physical analog? And how was this unit of measurement originally derived? Hint to the first question: go to the BIPM website. Hint to the second question: if you can't figure this one out, I give up! Hint to the third question: metric tonne.
Unfortunately, we seem to be discussing this because of flawed reasoning.
The OS knows nothing about the size of those pixels. Nothing at all. It doesn't know the size of the monitor. A long time ago, when Apple's own monitors had a sensing line going to them, the OS could assume that a given monitor was a given size. But that hasn't been true for quite some time. It stopped being true way back when multi-sync monitors were invented by NEC.
I know quite a few people who don't run their monitor at its nominal rez.
No assumptions are possible.
Even if rez independence comes about, the settings will have to be done manually.
You are wrong about the physical analog. Those are obsolete. Have been for a while. Length is now spec'd in terms of atomic wavelengths, the distance light travels in a spec'd fraction of a second, the number of vibrations of certain atoms in a spec'd length of time, etc. All very absolute. At least as far as the human race is concerned.
All metric measurements are related to each other. They are fixed.
Quote:
PDF is already most of the way there. PDFs can hold a lot of detail in raster and vector form and they generally look good at any scaling level. All the reader software would need is the screen's pixel density. For LCDs, that would be just one more number to track in an OS database.
Yes, but the actual size of any feature of a PDF only holds true at the correct viewing size for that document. If the page is 8.5 x 11, then it must be viewed at that size. That's the entire, and only, problem we're having here.
Quote:
Where is the evidence that pixel pitch is going to get much denser in the future?
There are several notebooks that run at about 150 ppi, just over twice the linear density of the original assumptions about display pixel density.
But it's also been written by numerous experts in the field that 150 is about as high as is required, as we can't even see finer screen detail. Even now, at 110 or so, it can be difficult to see all the detail without moving very close to the screen and squinting.
The smaller the pixels, the more numerous will be the dead ones, or the ones that are always turned on. That raises the cost of the panels as well.
There might always be a market for very high rez panels, but that market will be very small, and the panels will always cost too much for virtually everyone except military and scientific users. Not our concern at all.
Quote:
How about Mac OS X detects your display resolution, and then asks you for the viewable size of the display, and recommends a UI scale?
That would be great. Likely, the only way it could be done.
The only other way is one in which the manufacturers would all agree on standards. Each size monitor would have only certain resolutions. The problem would be that laptops have higher rez's than desktop models, and who believes that companies will agree to this anyway?
With 17, 18, 19, 20, 21, 22, 23, AND 24 inch models, it's going to be impossible. Then there are standard 4:3 displays and 16:10 displays. Go to TV monitors, and one has 16:9.
Try to make sense out of all this.
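Mechanically, though, the suggestion quoted above is simple. A sketch of what "recommend a UI scale" could look like, assuming a 96 ppi baseline for 1.0x (the baseline is my assumption for illustration, not anything Apple has published):

```python
# Recommend a UI scale from the known resolution plus a user-entered
# diagonal. The 96 ppi "1.0x" baseline is an assumption.

import math

BASELINE_PPI = 96.0

def recommend_scale(res_h: int, res_v: int, diagonal_in: float) -> float:
    aspect = res_h / res_v
    width_in = diagonal_in * aspect / math.hypot(aspect, 1.0)  # width from diagonal
    return round(res_h / width_in / BASELINE_PPI, 2)

print(recommend_scale(1600, 1200, 20.1))  # 4:3 20.1" panel  -> ~1.04x
print(recommend_scale(1920, 1200, 17.0))  # 16:10 17" laptop -> ~1.39x
```

Note that the zoo of sizes and aspect ratios doesn't actually matter to the math; it only matters if you try to guess the size instead of asking for it or reading it from the display.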
It's not very good user experience for the computer to ask the user what size the monitor is, though.
How about if Apple maintains a list of display models and their sizes? The model can probably be found out automatically.
Actually, judging from Wikipedia's EDID page, the maximum viewable size is indeed given. Of course, it's a question of how many screens really support that.
That's nice, but I don't know of any package that really supports that. I don't think most (or any?) cards do either.
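For reference, here's roughly what the EDID route would look like, going off the publicly documented EDID 1.3 layout (the same one the Wikipedia page describes): bytes 21 and 22 of the 128-byte base block carry the maximum horizontal and vertical image size in centimetres, and a screen that doesn't support the field reports zero. Getting the raw EDID blob out of the OS is platform-specific and not shown:

```python
# Pull the physical size out of a raw 128-byte EDID base block and
# derive ppi. Per EDID 1.3, bytes 21/22 are max image size in cm;
# zeros mean the display didn't report a size (the catch noted above).

MM_PER_INCH = 25.4
EDID_HEADER = b"\x00\xff\xff\xff\xff\xff\xff\x00"

def viewable_size_cm(edid: bytes) -> tuple[int, int]:
    """Return (horizontal_cm, vertical_cm) from an EDID base block."""
    if len(edid) < 128 or edid[:8] != EDID_HEADER:
        raise ValueError("not an EDID base block")
    h_cm, v_cm = edid[21], edid[22]
    if h_cm == 0 or v_cm == 0:
        raise ValueError("display did not report its physical size")
    return h_cm, v_cm

def ppi_from_edid(edid: bytes, res_h: int) -> float:
    h_cm, _ = viewable_size_cm(edid)
    return res_h / (h_cm * 10.0 / MM_PER_INCH)   # cm -> mm -> inches
```

The field is only centimetre-granular, so it's good to a percent or two at desktop sizes, which is probably fine for a recommended default the user can still override.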
Quote:
Unfortunately, we seem to be discussing this because of flawed reasoning.
The OS knows nothing about the size of those pixels. Nothing at all. It doesn't know the size of the monitor. A long time ago, when Apple's own monitors had a sensing line going to them, the OS could assume that a given monitor was a given size. But that hasn't been true for quite some time. It stopped being true way back when multi-sync monitors were invented by NEC.
I know quite a few people who don't run their monitor at its nominal rez.
No assumptions are possible.
Even if rez independence comes about, the settings will have to be done manually.
You are wrong about the physical analog. Those are obsolete. Have been for a while. Length is now spec'd in terms of atomic wavelengths, the distance light travels in a spec'd fraction of a second, the number of vibrations of certain atoms in a spec'd length of time, etc. All very absolute. At least as far as the human race is concerned.
All metric measurements are related to each other. They are fixed.
I am quite sure that the linear equation is correct. I am quite sure that my empirical measurements of my Planar PL2010M LCD are correct (within my measurement error estimate of 0.3%) and agree with the manufacturer's data (for 3V:4H and 4V:5H aspect ratios). It works no matter what resolution you set your display at: you always end up with two PPI values (H and V) and two lengths (H and V) that define its effective pixel density and the viewable display area, respectively. For example, on my LCD it's 404 mm H and 303 mm V, with a native pixel density of 99.6 PPI (at 1600H by 1200V). From this data alone I can calculate all possible PPIs for all resolutions with just simple linear math. (If the physical size of your display area changes, like in the good old CRT days, this also has to be incorporated into the effective PPI calculation, but it didn't for my LCD.) Theory + empirical data = calibrated model; model validation requires at least one additional empirical data set, but this equation is so simple that I don't see how it's possible that it's wrong.
How Apple (or anyone else) would implement this is up to them. And if it works like I believe it could, why change resolution from the maximum native? You're now doing in SW what is currently done through HW (via a SW interface) for a finite, fixed set of HW resolutions, and I believe you will preserve the most detail at the maximum resolution. I've already suggested either: 1) a simple manual method (which I'm sure Apple could simplify to a simple on-screen square and plastic template, a la the manual ColorSync approach), or 2) a display database (which I'm sure Apple will do, at least for their displays). In the end, and being an engineer, I don't much care how Apple does it, just give me a knob to turn! 8)
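The arithmetic above is easy to sanity-check in code. Using the quoted 404 x 303 mm and 1600 x 1200 figures, both axes come out at about 100.6 ppi (square pixels, and within about a percent of the quoted 99.6, i.e. inside plausible measurement error):

```python
# Sanity check of the linear model: ppi from measured glass size, then
# effective ppi at every selectable resolution by pure rescaling.
# 404 x 303 mm and 1600 x 1200 are the figures quoted above.

MM_PER_INCH = 25.4

def ppi(pixels: int, mm: float) -> float:
    return pixels / (mm / MM_PER_INCH)

print(f"H: {ppi(1600, 404.0):.1f} ppi, V: {ppi(1200, 303.0):.1f} ppi")  # ~100.6 each

for res_h in (1600, 1280, 1024, 800):   # effective density at other modes
    print(f"{res_h:4d} px wide -> {ppi(res_h, 404.0):6.1f} effective ppi")
```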
WRT my questions, please see the following link;
Kilogram
It addresses the first question; you need a little reverse-engineering logic and Einstein's E=mc^2 to realize the second, and some very basic math solves the third.
I did say that it would have to be done manually. I've done it myself many times. But, as you noted, the ppi (people, please stop saying dpi for monitors!) is 99.6. An odd size.
You're correct about the gram, but scientific establishments that can do so have already abandoned the physical unit.
Yes, using Einstein's E=mc^2 and solving for m, you learn something new every day. But I think this only applies to atomic particles and the empirical methods (particle accelerators) used to determine the energy of those particles (yes, theoretical models exist that predict the mass of these particles, but without observational data they are virtually useless, and all theories come from someone observing something unexplained). It still has the units of mass (i.e. the kilogram, or amu, or whatever multiplier they choose to use). I don't know how this would apply to aggregates of particles, so you've pretty much answered the second question (i.e. you'd destroy the very thing you're trying to measure; I don't think I'll stand next to that scale)!
Quote:
With 17, 18, 19, 20, 21, 22, 23, AND 24 inch models, it's going to be impossible. Then there are standard 4:3 displays and 16:10 displays. Go to TV monitors, and one has 16:9.
Try to make sense out of all this.
This is the stuff that Apple seems to excel at. Making sense of something and making it simple for us mere mortals.
I think some of you guys are so deep into the forest that you can't see the trees.
Quote:
Yes, using Einstein's E=mc^2 and solving for m, you learn something new every day. But I think this only applies to atomic particles and the empirical methods (particle accelerators) used to determine the energy of those particles (yes, theoretical models exist that predict the mass of these particles, but without observational data they are virtually useless, and all theories come from someone observing something unexplained). It still has the units of mass (i.e. the kilogram, or amu, or whatever multiplier they choose to use). I don't know how this would apply to aggregates of particles, so you've pretty much answered the second question (i.e. you'd destroy the very thing you're trying to measure; I don't think I'll stand next to that scale)!
Heh heh. Don't forget that mass and weight are two different things.
If we really want to be accurate, the only place where the gram weighs a gram, accurate to just so many decimal places, is where it was determined in the first place. Move it a few miles and the gravity is different. It will seem the same, because everything else will change along with it, but it really won't be. Strain-gauge scales will be able to tell, but most conventional scales won't.
That's why using atomic mass is more accurate. Eliminates the gravity field.
Quote:
This is the stuff that Apple seems to excel at. Making sense of something and making it simple for us mere mortals.
I think some of you guys are so deep into the forest that you can't see the trees.
No. We're so deep into the work we do that we're tired of the confusion.
Apple hasn't helped in the slightest.
I also assume that you got the expression mixed up on purpose.