
Resolution independence in Leopard confirmed by Apple - Page 4

post #121 of 185
Quote:
Originally Posted by shetline

200 ppi takes 4 times the video memory of 100 ppi and four times pretty much everything else like bandwidth and rendering overhead.

Let's not exaggerate here...200ppi takes 2 times the video memory and two times the bandwidth compared to 100ppi.

edit: same with 300ppi...3 times...not 9. Basic math here. Also, the fact that the pixels are smaller may mean that a higher number of dead pixels may be tolerable. At 300ppi, it's starting to get difficult to really notice dead pixels.
post #122 of 185
Quote:
Originally Posted by kim kap sol

Let's not exaggerate here...200ppi takes 2 times the video memory and two times the bandwidth compared to 100ppi.

edit: same with 300ppi...3 times...not 9. Basic math here.

The basic math you used is utterly wrong, unless your display is a single line. Did you fail geometry somewhere? ppi is a linear dimension, but that scales by the square for area. That's where the four and nine come in.

A 100ppi display has 10000 pixels per square inch. A 200ppi display has 40000 pixels per square inch. A 300ppi display would have 90000 pixels per square inch. That also means that the video memory and DVI bandwidth quadruple at 200ppi; 300ppi would probably require a new display standard, or a major scaling-up of DVI so that its bandwidth is 9x or more the current standard. That is why the IBM T221 needed quad-link DVI to get an adequate refresh rate.
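To make that square-law arithmetic concrete, here is a minimal Python sketch; the 20-inch 4:3 panel (16" x 12") and the 24-bit framebuffer are assumed purely for illustration:

# Rough sketch: framebuffer cost scales with the square of the ppi.
# The 20" 4:3 panel (16" x 12") and 24-bit colour are assumptions.
def framebuffer_bytes(ppi, width_in=16.0, height_in=12.0, bytes_per_pixel=3):
    pixels = (ppi * width_in) * (ppi * height_in)
    return pixels * bytes_per_pixel

base = framebuffer_bytes(100)
for ppi in (100, 200, 300):
    fb = framebuffer_bytes(ppi)
    print(f"{ppi} ppi: {fb / 2**20:5.1f} MiB ({fb / base:.0f}x the 100 ppi case)")

# Prints roughly 5.5 MiB, 22.0 MiB (4x) and 49.4 MiB (9x).

The same 4x and 9x factors apply to anything that has to touch every pixel, which is the bandwidth and rendering point being argued above.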
post #123 of 185
Quote:
Originally Posted by kim kap sol

Let's not exaggerate here...200ppi takes 2 times the video memory and two times the bandwidth compared to 100ppi.

edit: same with 300ppi...3 times...not 9. Basic math here. Also, the fact that the pixels are smaller may mean that a higher number of dead pixels may be tolerable. At 300ppi, it's starting to get difficult to really notice dead pixels.

Actually it is 4 times. 200 ppi vs 100 ppi is 4 times the number of pixels. Take a 1"x1" image for example. At 100 ppi that's 100 x 100 = 10,000 pixels. At 200 ppi that's 200 x 200 = 40,000 pixels. At 300 ppi that's 300 x 300 = 90,000 pixels. Basic math.
"Slow vehicle speeds with frequent stops would signal traffic congestion, for instance."

uh... it could also signal that my Mom is at the wheel...
Reply
"Slow vehicle speeds with frequent stops would signal traffic congestion, for instance."

uh... it could also signal that my Mom is at the wheel...
Reply
post #124 of 185
Quote:
Originally Posted by melgross

I simply don't agree with what you are saying.

People sitting a normal 24 inches from the screen are not going to resolve a 200ppi display. If you have to get to 12 inches or less to do so, it isn't useful.

Displays are not prints. You can resolve greater detail in prints than you can in a display.

There is a profound difference between dealing with reflective media and emissive media. What is not noticeable in a reflective print is still noticeable in an emissive display. Basic retinal physics: the minimal resolvable dot is not only a function of focus but also a function of activation potential in the cones. More light equals a smaller dot being visually viable.

To this point displays have sucked resolution-wise compared to print media. Doing subjective image comparisons with sucky displays will make print media look stellar. Duh! Carefully controlled testing on minimum resolvability has used distance and high res (for the day) LCD displays to present physically smaller dots to the eye, and they are resolvable. You can argue with that all you want, but you aren't going to change physics and several decades of human factors research.
post #125 of 185
Quote:
Originally Posted by Hiro

There is a profound difference between dealing with reflective media and emissive media. What is not noticeable in a reflective print is still noticeable in an emissive display. Basic retinal physics: the minimal resolvable dot is not only a function of focus but also a function of activation potential in the cones. More light equals a smaller dot being visually viable.

Frankly, I think that display brightnesses should be toned down anyway, so this point is moot. Except in daylight, I prefer to turn the brightness all the way down because displays are too bright. Good paper returns 90% or more of the light that falls on it, and I don't want a display that's a lot brighter than the ambient light, or a screen that looks brighter than a normal piece of paper would in normal lighting.

Quote:
To this point displays have sucked resolution-wise compared to print media. Doing subjective image comparisons with sucky displays will make print media look stellar. Duh! Carefully controlled testing on minimum resolvability has used distance and high res (for the day) LCD displays to present physically smaller dots to the eye, and they are resolvable.

I don't know; in my experience, images that looked sharp on a screen look like crap when printed. Then there is the relative difference between how far away people hold a print and how far away most people set their monitor, which is about 2:1, so the needed resolution differs by about 2:1 as well.

Quote:
You can argue with that all you want, but you aren't going to change physics and several decades of human factors research.

Do you have specific names and such?
post #126 of 185
Sharp on screen but crap when printed makes sense because today's displays are lower res than printed material. Granted, there are certainly differences between reflected and emitted images. But we're still a long way from surpassing the resolving power of the human eye/brain.

I've had the opportunity to look at experimental, high resolution LCDs. They are mind blowing to say the least. With some of these displays and normal viewing distances, it was hard to know if I was looking at printed material or not.

(Then the demo ended and convention goers were treated to a microscopic representation of WinXP. The start button looked like a speck of sand. I laughed my ass off at the IBM booth people trying to operate the machine at that point. )
post #127 of 185
So you've seen them too! Seeing is believing, and those were 300ppi displays, the same ones that Los Alamos contracted out for scientific data visualization.
post #128 of 185
Quote:
Originally Posted by JeffDM

I don't know; in my experience, images that looked sharp on a screen look like crap when printed. Then there is the relative difference between how far away people hold a print and how far away most people set their monitor, which is about 2:1, so the needed resolution differs by about 2:1 as well.

A lot depends on the resolution of the image. A low res image on a low res monitor can look fine, just like regular NTSC TV looks fine -- until you print the image out at several times the density or put the NTSC image on an HDTV.


Quote:
Do you have specific names and such?

I did some work in simulation visuals requirements that dealt with the physiology and cost tradeoffs 5-6 years ago. I'm not currently doing visuals research and my old junk is packed for a building move. Sorry, but I'm not going to spend time chasing my old references down.
post #129 of 185
Quote:
Originally Posted by Lemon Bon Bon.

"Do you have the $10 to $15 thousand to spare when one does come out?"

Have you ever been doused in petrol?

I can't see how monitors are any different from cameras, mobile phones, hard drives of a certain size, CPUs...Blu-ray drives...they all command a premium at the outset. And in a few years cost peanuts. Monitors seem to have progressed at a glacial pace by comparison.

I guess when the hi-def 'revolution' pans out over the next 5-10 years...we might be pushing 200-300 dpi mainstream in computer monitors by then. I wish I could see an example to compare in all honesty.

I'd like to be able to see a 300 dpi picture at actual A4 size on a computer screen. I can only guess it means more pixels for a given area...does it mean more or less eye strain, and does RI help with that? A bit?

As is? Standard monitors. For art...pencil...drawing type stuff...it's hard going. Hurts my eyes for detail stuff. You have to 'zoom' in for details. With paper you just stick your face closer for detail stuff.

Shrugs. Can't beat a pencil and paper at the moment.

Lemon Bon Bon

That's right. You can't see.

When you know enough about why these monitors cost what they do, then come back and comment.

Right now, LCD monitors aren't even good enough for serious color work as it is.

The only monitor that is good enough today is the Samsung, which uses an LED backlight system. This unit is about 21" and 1600 x 1200 rez. It costs about $8,000.

Sure, it, and others, will come down in price. But it isn't only those very expensive panels that will keep very high rez monitors from being affordable. It is also the associated electronics, which are also better, as well as these new backlighting systems.

Will there be cheaper ones? Sure. There always are.

But, do you want to buy a cheap unit? Do you want a $200 21" (CRT) monitor today for critical work? No. You buy a $900 model.

For true critical color work today, what has a company bought?

A Barco. The cheap, personal version was about $4,000. The good ones were up to $10,000.

Notice I said "were": Barco is now out of the prepress industry. No longer worthwhile for them. Now you have to look to the other companies still making them.

http://www.barco.com/
post #130 of 185
Hmmm, and anyone that doesn't have a $10000 display is doing "fake" color work? 99.99% of designers were probably mildly insulted by the last post.

Your point would probably be better made without the expensive hardware name-dropping.
post #131 of 185
Quote:
Originally Posted by dfiler

Hmmm, and anyone that doesn't have a $10000 display is doing "fake" color work? 99.99% of designers were probably mildly insulted by the last post.

Your point would probably be better made without the expensive hardware name-dropping.

Still, I didn't think that Lemon Bon Bon's post deserved the dignity of a reply.

When one wants advanced technology, one should at least be mindful of how long it will take to get down to affordable pricing. In this case, I think it would be several years at best. Not only does there have to be good market demand to get the volume up, there has to be an ability to handle the graphics. The display model I mentioned only had something like three video cards that could drive it adequately, and it was limited to about 42 Hz anyway, even with a quad-link DVI connection. The market is barely moving to put dual link into consumer systems; Apple's Pro line and a few high-end gamer cards offer it right now. Quad link is hardly elegant as it is: it basically uses two dual-link DVI connectors or four single-link ones.
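For a rough sense of the numbers, here is a small Python sketch. The 165 MHz single-link DVI pixel clock is the standard ceiling, the 3840x2400 panel matches the T221 mentioned earlier, and the ~5% blanking overhead is an assumption for illustration only:

# Why a 3840x2400 panel outruns a single DVI link.
SINGLE_LINK_CLOCK_HZ = 165e6   # single-link DVI pixel-clock ceiling
BLANKING_OVERHEAD = 1.05       # assumed blanking overhead

def single_links_needed(width, height, refresh_hz):
    pixel_clock = width * height * refresh_hz * BLANKING_OVERHEAD
    return pixel_clock / SINGLE_LINK_CLOCK_HZ

for hz in (25, 41, 60):
    links = single_links_needed(3840, 2400, hz)
    print(f"3840x2400 @ {hz} Hz needs ~{links:.1f} single DVI links")

# Roughly 1.5 links at 25 Hz, 2.4 at 41 Hz and 3.5 at 60 Hz, hence the
# dual- and quad-link plumbing just to reach a usable refresh rate.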
post #132 of 185
Quote:
Originally Posted by dfiler

Hmmm, and anyone that doesn't have a $10000 display is doing "fake" color work? 99.99% of designers were probably mildly insulted by the last post.

Your point would probably be better made without the expensive hardware name-dropping.


If you go to the places where critical color work is produced, you will see those expensive monitors, from about $4,000 to $10,000. They are not high rez by the standards that we are talking about here. Usually 1600 x 1200.

Of course, not everyone can afford such displays, and my post was not an insult, unless you are inclined to be easily insulted.

Don't forget that Apple's first 22" LCD display was $4,000 when it first came out and was very popular at that price. And it was considered cheap for what it offered. That was years ago. In today's dollars, it would be closer to $5,000.

Many photographers bought one. But, it was not acceptable for critical color.

For example, it couldn't be used for soft proofing. I'm assuming you know what that is.

The point I was making was that with extremely expensive displays, there will be a very small market of users in fields that can afford them. Those fields don't extend down to many computer users.

It will take a few years for those displays to reach a wider audience.

My point, therefore, was that we shouldn't expect these displays to solve any possible problems of rez independence soon.

Ok?
post #133 of 185
"Holy crap, LBB, where have you been??"

Lurking. I was going to say Masturbating. But hey, I would have had to have been doing that for a loooong time.

I had to create a new account because I lost the password to my old account. (Yahoo shut down my old email account that had the password on it, if you want the full explanation.) I had 2k plus posts on my old LBB account. Most of them whining about the absence of a low end Mac. Or cheaper Macs. Bigger screened iMacs. Or better GL. Or the idea of a move to Intel. Most of which was torched by posters. But hey, it's all come to pass. I wonder if a mod can get it back to me. I just need the password...

Lemon Bon Bon

post #134 of 185
Quote:
Originally Posted by Lemon Bon Bon.

"Holy crap, LBB, where have you been??"

Lurking. I was going to say Masturbating. But hey, I would have had to have been doing that for a loooong time.

I had to create a new account because I lost the password to my old account. (Yahoo shut down my old email account that had the password on it, if you want the full explanation.) I had 2k plus posts on my old LBB account. Most of them whining about the absence of a low end Mac. Or cheaper Macs. Bigger screened iMacs. Or better GL. Or the idea of a move to Intel. Most of which was torched by posters. But hey, it's all come to pass. I wonder if a mod can get it back to me. I just need the password...

Lemon Bon Bon

I was wondering if we would be seeing a new you.
post #135 of 185
"Originally Posted by shetline
200 ppi takes 4 times the video memory of 100 ppi and four times pretty much everything else like bandwidth and rendering overhead.

Let's not exaggerate here...200ppi takes 2 times the video memory and two times the bandwidth compared to 100ppi.

edit: same with 300ppi...3 times...not 9. Basic math here. Also, the fact that the pixels are smaller may mean that a higher number of dead pixels may be tolerable. At 300ppi, it's starting to get difficult to really notice dead pixels."

I'm beginning to see how things are stacking up now. It looks like a bit of a leap when you put things that way.

Still, with the advances in GPU power, I can't see things being such an issue in terms of GPU power to drive such high-DPI display beasts. GPU power is still forging ahead with SLI, faster express lanes pending, and maybe multi-GPU stuff. Who knows. RI is at least coming with Leopard.

In plain English, a better explanation of what we're up against than the Melgross stuff. (Clearly blinded by his own enlightenment.)

"When you know enough about why these monitors cost what they do, then come back and comment."

Rubs Melgross on the head: 'It's alright, sonny, I'll stay here and incite a little debate.' And I may even learn something in the process.

I'm a consumer. And I'd like better, crisper, more detailed displays that don't burn my eyes. That's their (monitor companies...) problem to navigate. Not mine. I still want my 300 dpi display that looks as good as a printed image. And I want a 16-core CPU Monster Mac beast to go with it. And an octo-GPU monster to power it all. With 32 gigs of RAM. And a 10 terabyte hard drive.

Dee-da-dee-dee.

Lemon Bon Bon

post #136 of 185
"I was wondering if we would be seeing a new you."

Cryptic..? Ey? *Pauses...

(I still want to know where Amorph disappeared to. I'm convinced Melgross is fronting as him...'it'.)

Well....I did consider...for a moment.

I do have 'Strawberry Avenger' as a back up. I'm not afraid to use it either. But still a squirt of lemon in the eye is as good as orange juice in the eye. Consider this proverb as we muse Leopard's new dawn of Resolution Independence. (Hey, it must be good because Apple are including it in Leopard...we just have to wait for everybody else to catch up...)

Lemon Bon Bon

post #137 of 185
Quote:
Originally Posted by Lemon Bon Bon.

"Originally Posted by shetline
200 ppi takes 4 times the video memory of 100 ppi and four times pretty much everything else like bandwidth and rendering overhead.

Let's not exaggerate here...200ppi takes 2 times the video memory and two times the bandwidth compared to 100ppi.

edit: same with 300ppi...3 times...not 9. Basic math here. Also, the fact that the pixels are smaller may mean that a higher number of dead pixels may be tolerable. At 300ppi, it's starting to get difficult to really notice dead pixels."

I'm beginning to see how things are stacking up now. It looks like a bit of a leap when you put things that way.

Still, with the advances in GPU power, I can't see things being such an issue in terms of GPU power to drive such high-DPI display beasts. GPU power is still forging ahead with SLI, faster express lanes pending, and maybe multi-GPU stuff. Who knows. RI is at least coming with Leopard.

In plain English, a better explanation of what we're up against than the Melgross stuff. (Clearly blinded by his own enlightenment.)

"When you know enough about why these monitors cost what they do, then come back and comment."

Rubs Melgross on the head: 'It's alright, sonny, I'll stay here and incite a little debate.' And I may even learn something in the process.

I'm a consumer. And I'd like better, crisper, more detailed displays that don't burn my eyes. That's their (monitor companies...) problem to navigate. Not mine. I still want my 300 dpi display that looks as good as a printed image. And I want a 16-core CPU Monster Mac beast to go with it. And an octo-GPU monster to power it all. With 32 gigs of RAM. And a 10 terabyte hard drive.

Dee-da-dee-dee.

Lemon Bon Bon

It's always the ones who know the least who are the most adept at making fun of others.

Your post proves that.

The math is wrong, as are the assumptions.

No one is arguing that you can't wish for something.
post #138 of 185
"Frankly, I think that display brightnesses should be toned down anyway, so this point is moot. Except in daylight, I prefer to turn down the brightness all the way because they are too bright. Good paper returns 90% or more of the light that falls on it, and I don't want a display that's a lot brighter than the ambient light, and I don't want a screen that looks brighter than a normal piece of paper would in normal lighting."

Interesting comment. I'd like to see a display that is as easy to look at as a piece of paper, rather than retina scorch. I suppose there are the contrast and brightness buttons...but... Monitors like print: the holy grail?

Lemon Bon Bon

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

Reply

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

Reply
post #139 of 185
"It's always the ones who know the least who are the most adept at making fun of others."

And the ones who 'know the most' are so poor at imparting their wisdom to the 'least'.

Even the stone tablets turned to dust in time. After no doubt dazzling the folks (who knew the least...) with their blinding light in the first instance.

Hmm. Er. Time to leave R. Independence for one evening I think...

Lemon Bon Bon

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

Reply

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

Reply
post #140 of 185
Quote:
Originally Posted by JeffDM

The basic math you used is utterly wrong, unless your display is a single line. Did you fail geometry somewhere? ppi is a linear dimension, but that scales by the square for area. That's where the four and nine come in.

A 100ppi display has 10000 pixels per square inch. A 200ppi display has 40000 pixels per square inch. A 300ppi display would have 90000 pixels per square inch. That also means that the video memory and DVI bandwidth quadruple at 200ppi; 300ppi would probably require a new display standard, or a major scaling-up of DVI so that its bandwidth is 9x or more the current standard. That is why the IBM T221 needed quad-link DVI to get an adequate refresh rate.

Oh, yeah, sure...in a perfect world where pixels are perfectly square. :P
post #141 of 185
Lemon, your math is way off.

At 100 dpi, a square inch is 100 x 100 = 10,000 pixels.

At 200 dpi, a square inch is 200 x 200 = 40,000 pixels.

It's 4 times the bandwidth and memory.

300, likewise, is 9x.

Re: display that looks like paper... eventually electronic ink may get us there (I doubt it), but the problem has to do with the blacks. Currently, they're more of a gray... to get any kind of dynamic range, we need the brighter monitors (which in turn, of course, makes the grays a little bit brighter, too).
post #142 of 185
"Lemon, your math is way off."

Blush.

Eh. Those numbers were quoted from above. Not mine, ahem.

I haven't sat down and worked out whether they're right or wrong.

But I think the numbers are academic anyhow.

When you look at the outrageous bandwidth numbers of GPUs compared with where they've come from...given another 5 years of that kind of progress, I can't see GPUs being underpowered for the job. So the other factors probably come into play.

300 dpi monitors. To me, it's just a series of technical hurdles to get around. We'll get there one day. But with Hi-Def guaranteeing a glut of sales for LCD up to 2010, where's the incentive?

Is the print and post-production market on its own enough to drive us there in the short term? Given the above numbers and arguments, I guess we won't be seeing them anytime soon. (But at least Apple is building for the future.)

But one day...

Lemon Bon Bon

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

Reply

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

Reply
post #143 of 185
Quote:
Originally Posted by gregmightdothat

Re: display that looks like paper... eventually electronic ink may get us there (I doubt it), but the problem has to do with the blacks. Currently, they're more of a gray... to get any kind of dynamic range, we need the brighter monitors (which in turn, of course, makes the grays a little bit brighter, too).

A brighter backlight doesn't change the contrast ratio in a positive way that I have ever seen. Make the whites brighter and you'll make the blacks brighter too. In fact, if you get to something like video projectors, you'll often find that the brighter ones have worse contrast ratios; the ones that have a bright mode and an economy or cinema mode also have different contrast ratios depending on the mode.

I can't say anything about electronic ink because I have not seen it. Frankly, I'm pretty happy with the contrast ratio of current LCD displays. At least, the blacks on my 30" ADC are so close to black that I think that it would take a pretty fussy person to complain about it. I might have been able to get darker on my CRTs but not without a lot of adjustment trouble and significant annoyance getting there.
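A toy model in Python makes this point easy to see: if a fixed fraction of the backlight leaks through a "black" pixel, turning the backlight up scales white and black together and the ratio never moves. The leakage figure below is an invented illustration, not a measurement of any real panel.

# Toy LCD model: contrast ratio is set by panel leakage, not backlight level.
LEAKAGE = 1 / 700.0   # assumed fraction of backlight escaping a black pixel

def levels(backlight_nits):
    white = backlight_nits
    black = backlight_nits * LEAKAGE
    return white, black, white / black

for nits in (150, 300, 600):
    white, black, contrast = levels(nits)
    print(f"backlight {nits:3d} nits -> white {white:5.1f}, black {black:4.2f}, contrast {contrast:.0f}:1")

# The contrast stays 700:1 at every backlight setting; only the absolute
# black level rises, which is the projector behaviour described above.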
post #144 of 185
Quote:
Originally Posted by JeffDM

A brighter backlight doesn't change the contrast ratio in a positive way that I have ever seen. Make the whites brighter and you'll make the blacks brighter too. In fact, if you get to something like video projectors, you'll often find that the brighter ones have worse contrast ratios; the ones that have a bright mode and an economy or cinema mode also have different contrast ratios depending on the mode.

I can't say anything about electronic ink because I have not seen it. Frankly, I'm pretty happy with the contrast ratio of current LCD displays. At least, the blacks on my 30" ADC are so close to black that I think that it would take a pretty fussy person to complain about it. I might have been able to get darker on my CRTs but not without a lot of adjustment trouble and significant annoyance getting there.

Brighter displays are better when the ambient light level is high.

The other advantage is that if the display is a really good one, it can actually be calibrated to 5,000K, which is impossible with most displays.
post #145 of 185
Photoshop aside, I just want to get some clarification.

This is going to be a user-adjustable setting like it is in Windows? Where, for example, if you up things to 120 DPI (or "Large Fonts" as Windows calls it) everything gets bigger? Fonts, titlebars, etc.?

Except unlike Windows, OS X will do a pretty good job with it?

I run Windows at 192 DPI. I'm visually impaired, so this allows everything to be readable and huge, but still lets me run my LCDs at their native resolution of 1600x1200 (I tried 800x600 @ 96dpi, but everything was fuzzy [as to be expected]).

This is the *one* setting OS X doesn't have over Windows (yet), so if this is what Apple Insider is talking about, Apple has another customer

- D
post #146 of 185
Quote:
Originally Posted by Daphoid View Post

Photoshop aside, I just want to get some clarification.

This is going to be a user-adjustable setting like it is in Windows? Where, for example, if you up things to 120 DPI (or "Large Fonts" as Windows calls it) everything gets bigger? Fonts, titlebars, etc.?

Except unlike Windows, OS X will do a pretty good job with it?

I run Windows at 192 DPI. I'm visually impaired, so this allows everything to be readable and huge, but still lets me run my LCDs at their native resolution of 1600x1200 (I tried 800x600 @ 96dpi, but everything was fuzzy [as to be expected]).

This is the *one* setting OS X doesn't have over Windows (yet), so if this is what Apple Insider is talking about, Apple has another customer

- D

RI is very different. You can raise the size of the fonts in OS X as well, but that doesn't equate to RI.

RI is the ability to change the apparent rez of the entire screen. In other words, everything will change in size equally.

But it also changes the size of everything by interpolating it into a higher rez object. This is done by having most elements composed of vector calculations, rather than bitmaps.

Fonts are done this way now, which is why they look better, the larger they are. Icons on the Mac are done the same way (well, almost). There is interpolation going on. But, there is more than one size for the icon. Interpolation can average out the detail when changing sizes. Try to change the size of the screen icons by going to prefs. You'll see that they remain sharp. Possibly seeming to gain in detail.

RI will do that to most all elements of the desktop, or programs within, if they are rewritten to take advantage of it.

Pixel-mapped objects, such as photos, are different. They can be interpolated as PS does it, but past a certain size, they lose sharpness. The memory required to store larger versions becomes immense.

Going the other way, that is, from "normal" to a smaller apparent screen rez, is the opposite: less memory would be required for bitmapped graphics.

Apple has modes that can give an idea of how this works, in an imperfect way.

You can go to Sys Prefs/Universal Access, and turn Zoom on. Follow the options. As I say, it isn't the same, but you will get some idea.
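The vector-versus-bitmap distinction melgross is drawing can be sketched in a few lines of Python. The ASCII circle and the sizes are made up purely for illustration: a shape described by geometry can be re-rasterised cleanly at any density, while a bitmap can only be resampled from the pixels it already has.

# A "vector" circle: rendered from its geometric description at any size.
def rasterise_circle(diameter_px):
    r = diameter_px / 2.0
    return [
        "".join(
            "#" if (x - r + 0.5) ** 2 + (y - r + 0.5) ** 2 <= r * r else "."
            for x in range(diameter_px)
        )
        for y in range(diameter_px)
    ]

# A bitmap can only be blown up from existing pixels (nearest neighbour here).
def upscale_bitmap(rows, factor):
    out = []
    for row in rows:
        wide = "".join(ch * factor for ch in row)
        out.extend([wide] * factor)
    return out

small = rasterise_circle(8)                  # rendered for a low-rez screen
print("\n".join(upscale_bitmap(small, 2)))   # scaled bitmap: blocky, no new detail
print()
print("\n".join(rasterise_circle(16)))       # re-rasterised vector: stays smooth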
post #147 of 185
Resolution independence is a bit different than that. It has little to do with a user actively scaling something. Resolution independence is simply breaking the bit-mapped, pixel-level dependence of graphical objects.

For example, a 120x120 pixel graphic (icon?) displayed as a bitmap @ 100% will always be 120 pixels by 120 pixels. Display that on an older 72dpi monitor and the graphic will be approximately 1.7 inches square [yes, that's a bit old, but the math is easy]. Now display the same graphic on a 160dpi iPhone screen and it shrinks to 3/4 of an inch square. A resolution-independent version would always stay at the same physical size regardless of the screen's dpi.

The big win is that with the ever-increasing dpi counts in monitors, we won't need magnifying glasses or need to tell our OS to use some mongo-sized font to maintain a consistent appearance across all monitors. All that scaling and adjustment is done "behind the Green Curtain", no user intervention necessary.
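Hiro's 72 dpi / 160 dpi example works out like this in a quick Python sketch; the densities are the ones from the post, and keeping the element at its 72 dpi size is just an assumed design choice for illustration:

# Fixed bitmap vs. resolution-independent element across screen densities.
BITMAP_PX = 120
TARGET_INCHES = 120 / 72.0   # keep the element the size it had at 72 dpi

for dpi in (72, 100, 160):
    bitmap_inches = BITMAP_PX / dpi            # fixed pixels -> shrinking size
    ri_pixels = round(TARGET_INCHES * dpi)     # fixed size -> more pixels
    print(f"{dpi:3d} dpi: 120px bitmap = {bitmap_inches:.2f} in wide, "
          f"RI element = {ri_pixels} px wide (~{TARGET_INCHES:.2f} in)")

# The bitmap drops from ~1.67 in to 0.75 in; the RI element stays ~1.67 in
# by being drawn with 120, 167 or 267 pixels as the density demands.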
post #148 of 185
Put simply, you will need to start thinking of your windows as being X cm wide, rather than N pixels wide.
post #149 of 185
I won't understand a bit of this until I see it in action.
post #150 of 185
I understand the pixel mapped stuff, the same goes for Windows. When I enlarge the DPI, a 32x32 icon is very small on the screen, so I up the icon to 72x72 to fit the scale of everything else.

Basically, what I'm trying to get at, in its simplest form: will this be a user-adjustable (one-time, in-Sys-Prefs type thing) setting that will make everything appear bigger or smaller? I realize certain things won't scale, but as long as I can make the majority of the fonts and whatnot bigger, I should be OK.

- D
post #151 of 185
Quote:
Originally Posted by Daphoid View Post

I understand the pixel mapped stuff, the same goes for Windows. When I enlarge the DPI, a 32x32 icon is very small on the screen, so I up the icon to 72x72 to fit the scale of everything else.

Basically, what I'm trying to get at, in its simplest form: will this be a user-adjustable (one-time, in-Sys-Prefs type thing) setting that will make everything appear bigger or smaller? I realize certain things won't scale, but as long as I can make the majority of the fonts and whatnot bigger, I should be OK.

- D

That's the idea.
post #152 of 185
I just thought I'd throw in the fact that Apple has said all developers need to make their icons 512 x 512 pixels.

Like this;



Is that big enough for ya?
post #153 of 185
Quote:
Originally Posted by Daphoid View Post

I understand the pixel mapped stuff, the same goes for Windows. When I enlarge the DPI, a 32x32 icon is very small on the screen, so I up the icon to 72x72 to fit the scale of everything else.

Basically, what I'm trying to get at, in its simplest form: will this be a user-adjustable (one-time, in-Sys-Prefs type thing) setting that will make everything appear bigger or smaller? I realize certain things won't scale, but as long as I can make the majority of the fonts and whatnot bigger, I should be OK.

I think it will be user adjustable. I've seen sample images of super-scaled OS X windows. It looked nice and smooth, not like using a low resolution setting on a high resolution screen.

The old Windows system was such that you could increase all the font sizes, but that's it. The problem was that it broke a lot of dialogue boxes because the text ran off the edge, and picture sizes did not change, only font sizes. I think Vista and Leopard both offer a user-adjustable way to scale everything up equally.
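The dialogue-box breakage is easy to picture with a toy layout model in Python; the glyph and box widths below are invented for illustration. Scaling only the font overflows a fixed-pixel box, while a uniform scale factor grows the box along with the text:

# Font-only scaling vs. a uniform UI scale factor, with made-up metrics.
LABEL = "Do you want to save your changes before quitting?"
CHAR_WIDTH_PX = 7      # assumed glyph width at the base font size
BOX_WIDTH_PX = 380     # dialog box laid out in fixed pixels

def text_width(scale):
    return len(LABEL) * CHAR_WIDTH_PX * scale

for scale in (1.0, 1.25, 1.5):
    font_only_overflow = text_width(scale) > BOX_WIDTH_PX          # box stays put
    uniform_overflow = text_width(scale) > BOX_WIDTH_PX * scale    # box scales too
    print(f"scale {scale}: font-only overflow={font_only_overflow}, "
          f"uniform overflow={uniform_overflow}")

# At 1.25x and 1.5x the font-only case overflows; the uniform case never does.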
post #154 of 185
Quote:
Originally Posted by JeffDM View Post

I think it will be user adjustable. I've seen sample images of super-scaled OS X windows. It looked nice and smooth, not like using a low resolution setting on a high resolution screen.

Yeah.

Straight from the horse's mouth:



post #155 of 185
Quote:
Originally Posted by JeffDM View Post

I think it will be user adjustable. I've seen sample images of super-scaled OS X windows. It looked nice and smooth, not like using a low resolution setting on a high resolution screen.

The old Windows system was such that you could increase all the font sizes, but that's it. The problem was that it broke a lot of dialogue boxes because the text ran off the edge, and picture sizes did not change, only font sizes. I think Vista and Leopard both offer a user-adjustable way to scale everything up equally.

Big and doesn't mess up the dialogs is exactly what I'm after.

I suppose I'll have to wait and see what Leopard brings.

- D
post #156 of 185
Quote:
Originally Posted by dfiler View Post

Sharp on screen but crap when printed makes sense because today's displays are lower res than printed material. Granted, there are certainly differences between reflected and emitted images. But we're still a long way from surpassing the resolving power of the human eye/brain.

I've had the opportunity to look at experimental, high resolution LCDs. They are mind blowing to say the least. With some of these displays and normal viewing distances, it was hard to know if I was looking at printed material or not.

(Then the demo ended and convention goers were treated to a microscopic representation of WinXP. The start button looked like a speck of sand. I laughed my ass off at the IBM booth people trying to operate the machine at that point. )

OW

I wish I could time travel back to '06 and say it still hasn't happened yet in '09.

Great read.
post #157 of 185
Quote:
Originally Posted by Hiro View Post

Resolution independence is a bit different than that. It has little to do with a user actively scaling something. Resolution independence is simply breaking the bit-mapped, pixel-level dependence of graphical objects.

For example, a 120x120 pixel graphic (icon?) displayed as a bitmap @ 100% will always be 120 pixels by 120 pixels. Display that on an older 72dpi monitor and the graphic will be approximately 1.7 inches square [yes, that's a bit old, but the math is easy]. Now display the same graphic on a 160dpi iPhone screen and it shrinks to 3/4 of an inch square. A resolution-independent version would always stay at the same physical size regardless of the screen's dpi.

The big win is that with the ever-increasing dpi counts in monitors, we won't need magnifying glasses or need to tell our OS to use some mongo-sized font to maintain a consistent appearance across all monitors. All that scaling and adjustment is done "behind the Green Curtain", no user intervention necessary.

This post explains it best.

Does Hiro still post here?

post #158 of 185
Quote:
Originally Posted by brucep View Post

Does Hiro still post here?

If you mean AI in general, yes. With over 700 posts here you didn't know how to check that?
post #159 of 185


I thought the thread title was "Resolution Independence in Snow Leopard confirmed by Apple"...you read what you want to read, I guess.

Here's a link on spatial acuity and ppi:

http://aeronautics.arc.nasa.gov/asse...DEAC04_5-1.pdf

See the intensity section for the impact of intensity on visibility.
post #160 of 185
Quote:
Originally Posted by Hiro View Post

Resolution independence is a bit different than that. It has little to do with a user actively scaling something. Resolution independence is simply breaking the bit-mapped, pixel-level dependence of graphical objects.

For example, a 120x120 pixel graphic (icon?) displayed as a bitmap @ 100% will always be 120 pixels by 120 pixels. Display that on an older 72dpi monitor and the graphic will be approximately 1.7 inches square [yes, that's a bit old, but the math is easy]. Now display the same graphic on a 160dpi iPhone screen and it shrinks to 3/4 of an inch square. A resolution-independent version would always stay at the same physical size regardless of the screen's dpi.

The big win is that with the ever-increasing dpi counts in monitors, we won't need magnifying glasses or need to tell our OS to use some mongo-sized font to maintain a consistent appearance across all monitors. All that scaling and adjustment is done "behind the Green Curtain", no user intervention necessary.

That's essentially what I said. But with RI on a Mac, the actual rez of the monitor remains the same; it's the use of that rez that changes, not like going from a large screen to an iPhone-sized screen.

On a 1600 x 1200 screen, then, if you double the size (linearly) of the desktop elements, you end up with a desktop with fewer elements, which is what you would get if that same size screen were 800 x 600 rez, but much sharper, as the full 1600 x 1200 is being used with vector graphics. Or at least, that's the concept. Not everything will be fully vector, so it won't be perfect.
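As a closing sketch of that last example, here is the arithmetic in Python; the 1600 x 1200 panel and the scale factors are just the ones from the post:

# Effective desktop area under a user-interface scale factor.
PANEL = (1600, 1200)   # physical pixels stay fixed

def effective_desktop(scale):
    return PANEL[0] / scale, PANEL[1] / scale

for scale in (1.0, 1.5, 2.0):
    w, h = effective_desktop(scale)
    print(f"scale {scale}: room for a {w:.0f}x{h:.0f} layout, "
          f"each element drawn with {scale * scale:.2g}x the pixels")

# At 2x you get the element count of an 800x600 desktop, but every element
# is rendered with four times the pixels, which is where the sharpness comes from.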