Resolution independence in Leopard confirmed by Apple
Apple Computer's next-generation Mac OS X Leopard operating system will indeed make use of resolution independence, the company confirmed in a posting to its developer website.
The technology, which has been lingering beneath the surface of Mac OS X since early betas of Mac OS X Tiger, essentially breaks the software assumption that all display output is to be rendered at 72 dots per inch (DPI).
"The old assumption that displays are 72dpi has been rendered obsolete by advances in display technology," Apple said in an overview of Leopard posted to its Developer Connection website. "Macs now ship with displays that sport displays with native resolutions of 100dpi or better."
The Cupertino, Calif.-based company said the number of pixels per inch will continue to increase dramatically over the next few years, making displays crisper and smoother.
"But it also means that interfaces that are pixel-based will shrink to the point of being unusable," Apple said. "The solution is to remove the 72dpi assumption that has been the norm. In Leopard, the system, including the Carbon and Cocoa frameworks, will be able to draw user interface elements using a scale factor."
The technology will allow the Mac OS X user interface to maintain the same physical size while gaining resolution and crispness from high dpi displays.
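As a rough illustration of what drawing against a scale factor looks like, here is a minimal Cocoa sketch, written in Swift for readability (Leopard-era code would have been Objective-C, and the view class here is hypothetical). It uses the per-window user-space scale factor instead of assuming one point equals one pixel:

```swift
import AppKit

// Hypothetical view that draws correctly under a UI scale factor.
class ScaledView: NSView {
    override func draw(_ dirtyRect: NSRect) {
        // 1.0 on a traditional 72dpi setup; larger (e.g. 1.25 or 2.0)
        // when the interface is scaled up for a high-dpi display.
        let scale = window?.userSpaceScaleFactor ?? 1.0

        // Geometry is expressed in points; Quartz maps points to pixels
        // using the scale factor, so this square keeps its physical size.
        let square = NSRect(x: 20, y: 20, width: 100, height: 100)
        NSColor.blue.setFill()
        square.fill()

        // A one-pixel hairline, by contrast, must be derived explicitly,
        // since "one pixel" is no longer the same as "one point."
        let border = NSBezierPath(rect: square)
        border.lineWidth = 1.0 / scale
        NSColor.black.setStroke()
        border.stroke()
    }
}
```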
"The introduction of resolution independence may mean that there is work that you?ll need to do in order to make your application look as good as possible," the Mac maker told its developers. "For modern Cocoa and Carbon applications, most of the work will center around raster-based resources. For older applications that use QuickDraw, more work will be required to replace QuickDraw-based calls with Quartz ones."
Apple's intention to support resolution independence in Mac OS X was first detailed in an August 2004 AppleInsider report. Following the release of the Leopard Preview in August of this year, bloggers discovered that the feature would be made more accessible to developers working with Leopard.
The Quartz-driven technology will allow developers to author applications that offer users the choice of viewing more detail (more pixels per point, but fewer points on the screen) or a larger user interface (fewer pixels per point but more points on the screen) without altering the resolution of the computer's display.
Comments
When someone showed me what resolution independence in Windows XP looked like, I was not impressed at all. It's fine for Apple to wait for the technology to mature and actually become usable.
MS can often claim they've done something first and that others copied...but MS puts out so many inept implementations that it would be stupid to say MS put it out first.
It's always the same old story on Mac/PC boards such as Ars Technica. "MS had Exposé first". This last example is someone talking about the Windows feature that allows one to tile windows on screen without any overlaps. Can you imagine? There are countless examples where people compare shitty implementations to OS X implementations and claim they were the first to do it.
Apple will get it right...it won't bring us some dastardly interface that offers a few different sizes with pixelated icons and window content that doesn't even scale with the rest of the interface widgets.
"Leopard also provides a dramatic increase in OpenGL performance by offloading CPU-based processing onto another thread which can then run on a separate CPU core feeding the GPU. This can increase, or in some cases, even double the performance of OpenGL-based applications."
Will the new Creative Suite use this tech (resolution independence; OpenGL), or will Adobe maintain the single codebase and try to recreate it all themselves?
How or why would Creative Suite leverage this?
Just thinking out loud here, but Adobe has been prone to almost ignoring advances in MacOS for several generations due to their preference for a single codebase.
I think higher-resolution displays would definitely be of interest to publishers, and using multiple CPUs more efficiently will be a prime issue for the Photoshop market once CS3 is out and Power Macs have four cores.
I guess my real question is whether Leopard's functions and features will be fully supported with CS3 or CS4.
Just thinking out loud here, but Adobe has been prone to almost ignoring advances in MacOS for several generations due to their preference for a single codebase.
I think higher-resolution displays would definitely be of interest to publishers, and using multiple CPUs more efficiently will be a prime issue for the Photoshop market once CS3 is out and Power Macs have four cores.
I guess my real question is whether Leopard's functions and features will be fully supported with CS3 or CS4.
Right, I get that, but:
1) the scale factor (for resolution independence) is implemented in a way that CS3 should take advantage of automatically, whether Adobe wants it to or not.
2) OpenGL likely won't be used by CS3 at all, since Adobe has been reluctant so far to use GPU acceleration.
the scale factor (for resolution independence) is implemented in a way that CS3 should take advantage of automatically, whether Adobe wants it to or not.
Cool. Very cool.
Uh... can someone actually tell me what the heck this means? I suppose it's a good thing, but I have no idea what it actually does or what it's talking about.
Yup: currently everything assumes 72dpi (dots, aka pixels, per inch). This means that if your window defines its width as 144 pixels, then on a screen with a resolution of 72dpi your window will be 2 inches wide. If you increase the DPI, you reduce the visual size of your window. For example, if your screen has 144 pixels per inch, then your window is now only 1 inch wide, when it was intended to be 2 inches wide.
Resolution independence would define lengths differently. Instead of indicating that your window takes up 144 pixels, you would say it is 2 inches (or centimetres, if you want) wide. This means that no matter the resolution of your screen, your window would stay the same size: you could hold a ruler up to it and find that your window is two inches wide, whether your screen is 72dpi or 144dpi.
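A quick sketch of that arithmetic (any language works; Swift for consistency with the sketches above):

```swift
// The same 144-pixel-wide window measured on screens of
// different pixel density.
let windowWidthPixels = 144.0

for dpi in [72.0, 100.0, 144.0] {
    print("At \(dpi) dpi: \(windowWidthPixels / dpi) inches wide")
}
// At 72dpi:  2.0 inches  -- the size the author intended
// At 100dpi: 1.44 inches
// At 144dpi: 1.0 inch    -- half size, which is the problem
```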
It's always the same old story on Mac/PC boards such as Ars Technica. "MS had Exposé first". This last example is someone talking about the Windows feature that allows one to tile windows on screen without any overlaps. Can you imagine? There are countless examples where people compare shitty implementations to OS X implementations and claim they were the first to do it.
There is a difference between being first to do it and being first to do it well. I wouldn't be surprised if Tile was in BeOS, X11, or something like that before Windows picked it up. The problem with Windows' Tile is that it doesn't preserve the window arrangement, though you can undo Tile. There is still no reasonably obvious way to put that on a hot key or hot corner, which is what makes it useful to me. I did manage to assign a hot key to the "Show Desktop" icon.
What are you doing in here? New MacBook Pros just hit at Apple, and they all have FireWire 800!!!!!!!
I'm already on a MBP 17".
For example, the More Info button in Tiger's Column View is vector-based, and remains a crisp, shaded oval even when zoomed to higher DPIs with Quartz Debug. (The new iTunes, sadly, shows no sign of this.)
Even though not easily enabled in Tiger, it IS pretty cool, and fun to play with as a glimpse of things to come. (If you don't want to install all of the dev tools, use Pacifist to install just Quartz Debug. And Ars had some screenshots a while back.)
More details are in this year-old page from Apple:
http://developer.apple.com/releaseno...pendentUI.html
What are you doing in here? New MacBook Pros just hit at Apple, and they all have FireWire 800!!!!!!!
I didn't notice that; that's very nice. There was a sense that it might come back. All I really want is an EC/34 interface card so I can slap a FW800 port onto an existing machine.