Inside Mac OS X Snow Leopard: GPU Optimization


Comments

  • Reply 61 of 102
    Apple definitely has been in the lead on advanced, graphics-accelerated user interfaces and desktop compositing, but this article neglects to mention some significant issues in Apple's history with graphics hardware.



    1) Although it has definitely improved in the past 18-24 months, historically Apple's computers have offered a very poor choice of graphics hardware, and even more significantly, the GPUs that were available were almost always weak, lower-performing cards (with a significant price premium).



    2) The author brings up budget 'netbooks' and their slow-as-molasses Intel GMA950 graphics chipsets, but neglects to mention that the $1000+ MacBook from just a year or two ago was using this very chipset!



    3) GPUs have offered hardware-accelerated video decoding for many years, with full support for this in multiple versions of Windows. Only now, in mid-2009 with the release of OS X 10.6, has the Mac finally caught up, with the introduction of QuickTime X.
  • Reply 62 of 102
    Quote:
    Originally Posted by stuffe View Post


    There's not a lot of meat in this article, compared to the others in the series. I don't care what Windows did in the past, or what Windows gamers still use; I want to know about SL, and despite the strapline, this article only talks about OpenGL, not OpenCL or GCD.



    Exactly what I thought. I appreciate the writing efforts but the iPhone stuff was a bit tangential and the "subtle use of GPU something" part towards the end was quite vague.



    I would like to know how the 9400M in my aluminum MacBook is being used in a tangible way.



    I do know that when decoding H.264 I see CPU use of only 20-30%, so is that the GPU at work?



    A bit lost here...
  • Reply 63 of 102
    I think you are mistaken about the 9400M's power. For gaming, of course, it is considered weak, partly because gaming really does benefit from dedicated VRAM: as I understand it, that is where all the textures, which are essentially bitmaps, have to be stored, and a game needs to store a ton of them.



    But in terms of GPGPU, for example Folding@Home, we're talking about crunching numbers several times faster than something like a 2.5 GHz Intel Core 2 CPU.



    In GPGPU work, I am not sure how, or whether, VRAM causes significant bottlenecks.



    I think even the 9400M has some real computing power that is untapped. Folding@Home and Badaboom are just a couple of small examples of what is really possible. When the GPU and CPU are both in full force, utilised well with multithreaded code, things can really move forward. One of the pertinent points of the article is that the Windows paradigm, with everyone stuck on 32-bit XP for the most part, really hides the true computing power today's hardware is capable of.



    The 9400M has the potential to be 5x faster at H.264 encoding (at a very rough estimate, depending on the software etc., details of which I unfortunately cannot quote right now) than the above-mentioned 2.5 GHz Core 2 CPU. That is nothing to sneeze at, I would say.
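
    To make the GPGPU point concrete, here is a minimal sketch of the kind of data-parallel OpenCL C kernel that workloads like Folding@Home or an H.264 encoder are built from. This is only an illustration; the kernel name and arguments are invented, and real encoders are far more involved.

        /* Hypothetical OpenCL C kernel: each work-item handles one element,
         * and the GPU runs many work-items at once. That massive parallelism
         * is where the speed-up over a dual-core CPU comes from. */
        __kernel void scale_and_add(__global const float *in,
                                    __global float *out,
                                    const float gain,
                                    const unsigned int n)
        {
            size_t i = get_global_id(0);   /* index of this work-item */
            if (i < n)
                out[i] = in[i] * gain + out[i];
        }

    Whether the 9400M's shared memory becomes the bottleneck depends on how much arithmetic a kernel like this does per element versus how often it has to reach out to main RAM.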



    Quote:
    Originally Posted by cg0def View Post


    First of all, let me say I don't really care about the 9400M's performance. The GPU is weak and I believe Apple puts way too much attention on it. Yes, it replaces a GPU that is several times weaker, but still, if you want real computing power you would be looking at dedicated GPUs with dedicated high-speed VRAM. Anyway, here are answers to some of your questions. OpenCL does not make Flash any faster, since there isn't a single OpenCL call in Flash. I would even go so far as to say that GPU acceleration in the Mac version of Flash is only present in the configuration GUI, but that's not what Adobe has been saying ... Anyway, to me the CPU usage is proof enough.

    With plugins such as ClickToFlash, however, you can see how much better things can be. I have a 2.4 GHz early-2008 MBP (8600GT, 256MB). The Flash version of YouTube videos uses about 50-60% of one of the cores, and that's on a good day. At the same time, the H.264 version (using the OpenGL version of QT) uses about 15-16%. Not that I notice either one of them (other than in fan noise), but the lower CPU usage results in longer battery life.



  • Reply 64 of 102
    Quote:
    Originally Posted by azcodemonkey View Post


    @Snafu - I don't recall DirectX ever having facilities to use the GPU in the same manner as OpenCL/CUDA.



    Sorry: what I meant to say is that Microsoft declared its intention to expose some way to exploit the GPU in a GPGPU manner through DirectX some time ago, well before Apple began to talk about OpenCL. I didn't mean that it is already implemented.



    Quote:
    Originally Posted by anonymouse View Post


    OpenCL is "baked in" in any meaningful sense of the phrase. It's a standard and official piece of technology and set of APIs that anyone programming for SL can depend on being present. And, as others have pointed out, other APIs, such as Core Image, now depend on it, so "removing it", were that possible, would disable large amounts of other system functionality. You can't really get much more "baked in" than that.



    Given how Apple is implementing such "core" libraries (having a JIT compiler decide which of the available resources, CPU or GPU, is better for a given job), and that OpenCL, for now, is useful on only a limited set of GPUs, my guess is Core Image can get by without OpenCL if needed (that, or SL ships OpenCL drivers for the CPUs too, as the design intends).
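
    For what it's worth, the OpenCL API itself makes that CPU fallback visible: the runtime simply reports whatever devices it has implementations for, CPU included. Below is a rough sketch of such a query (my own illustration, not anything from the article; error handling is omitted, the 8-device cap is arbitrary, and on Snow Leopard it would be built with -framework OpenCL):

        /* Sketch: list the devices the OpenCL runtime exposes. If a CPU
         * implementation ships alongside the GPU drivers, it shows up here,
         * which is what would let a client fall back when no supported GPU
         * is present. */
        #include <stdio.h>
        #include <OpenCL/opencl.h>   /* <CL/cl.h> on non-Apple platforms */

        int main(void)
        {
            cl_platform_id platform;
            cl_device_id devices[8];
            cl_uint count = 0;
            char name[128];

            clGetPlatformIDs(1, &platform, NULL);
            clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 8, devices, &count);

            for (cl_uint i = 0; i < count; i++) {
                cl_device_type type;
                clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof(name), name, NULL);
                clGetDeviceInfo(devices[i], CL_DEVICE_TYPE, sizeof(type), &type, NULL);
                printf("%s device: %s\n",
                       (type & CL_DEVICE_TYPE_GPU) ? "GPU" : "CPU", name);
            }
            return 0;
        }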
  • Reply 65 of 102
    Quote:
    Originally Posted by Poppleganger View Post


    what the fuck is a jingle pundit



    Yeah. I've lived in Australia and the USA and spent a few months in the UK and I have never heard that.
  • Reply 66 of 102
    Quote:
    Originally Posted by brucep View Post


    Apple's support for the gaming world is moving ahead at light speed. We will overtake the PC realm very soon.



    So all you Xbox types are wallowing in old, mud-like systems with little advancement. Apple will be the gaming platform leader by 2015. I await all the Halos on the Mac one day.



    Judging by your location, I think you have inhaled too much methane.
  • Reply 67 of 102
    Quote:
    Originally Posted by ascii View Post


    I have noticed one good thing about Snow Leopard this evening. On Leopard, if I tried to watch a video in iTunes while encoding with HandBrake in the background, iTunes would stutter. On Snow Leopard it doesn't. It seems better at balancing resources amongst programs.



    VMware Fusion 2.0.5 running XP seems smoother when switching between it and Adobe CS4 apps, Safari, Mail, etc. Color me impressed.
  • Reply 68 of 102
    Quote:
    Originally Posted by Snafu View Post


    Given how Apple is implementing such "core" libraries (having a JIT compiler decide which of the available resources, CPU or GPU, is better for a given job), and that OpenCL, for now, is useful on only a limited set of GPUs, my guess is Core Image can get by without OpenCL if needed (that, or SL ships OpenCL drivers for the CPUs too, as the design intends).



    OpenCL's use of JIT technology and Core Image's dependency on OpenCL are separate issues. The whole point of OpenCL is that it will use whatever supported hardware it finds, and that its clients, like Core Image, don't have to worry about that. I doubt very much that Apple implemented the use of OpenCL in Core Image so that Core Image checks whether it will make a difference and only uses OpenCL if it will. Clearly, the correct way to do this is to just use OpenCL and let it figure it out.
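
    In API terms, that "let it figure it out" flow is only a few calls: the client hands the runtime some kernel source and asks for the default device type, and the runtime JIT-compiles the source for whatever device it picked. Here is a rough sketch (my own illustration, not Apple's Core Image code; error handling is omitted and the kernel is a do-nothing placeholder):

        #include <OpenCL/opencl.h>

        /* Placeholder kernel source, compiled at run time by the OpenCL JIT. */
        static const char *src =
            "__kernel void identity(__global float *buf) {"
            "    size_t i = get_global_id(0);"
            "    buf[i] = buf[i];"
            "}";

        int build_for_default_device(void)
        {
            cl_int err;
            /* Let the runtime choose the device: a GPU if one is supported,
             * otherwise whatever else it offers (e.g. the CPU). */
            cl_context ctx = clCreateContextFromType(NULL, CL_DEVICE_TYPE_DEFAULT,
                                                     NULL, NULL, &err);
            cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, &err);

            /* The JIT step: the source is compiled now, for the device(s)
             * actually present in the context. */
            err = clBuildProgram(prog, 0, NULL, NULL, NULL, NULL);

            clReleaseProgram(prog);
            clReleaseContext(ctx);
            return err == CL_SUCCESS;
        }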
  • Reply 69 of 102
    mario Posts: 348
    My quick testing of Snow Leopard (with the 64-bit kernel) shows that Quartz performance in Snow Leopard is about 90-95% of Leopard's score, and OpenGL performance is about half of Leopard's.



    Apparently other reviewers are noticing the same, and Apple is aware of the issue.



    On the other hand, everything else is faster: CPU, memory, threading performance, compression, etc.



    RAW image processing in Nikon Capture NX is noticeably faster. Compressing and creating a disk image of my Leopard install (which is 70 GB in size) took only 13 minutes in Snow Leopard with the 64-bit kernel.
  • Reply 70 of 102
    Quote:
    Originally Posted by Mario View Post


    My quick testing of Snow Leopard (with the 64-bit kernel) shows that Quartz performance in Snow Leopard is about 90-95% of Leopard's score, and OpenGL performance is about half of Leopard's.



    What test did you use? I'd like to try it as well.
  • Reply 71 of 102
    According to the monthly hardware survey published by Valve's Steam, covering roughly a million PC users, even among members of the gaming community, most of whom have 512MB or more of VRAM and a better GPU than most Mac users, the majority are still running Windows XP, an operating system which offers no GPU-accelerated user interface. Similarly, only 17% of this high-end gamer demographic is running a 64-bit edition of Windows, despite being the population that benefits most from 64-bit addressing.
    This supports my longstanding argument that gamers' embrace of the PC platform has never been about the quality of the experience, but rather about cheap hardware. Gamers are mostly kids and dysfunctional adults, both on limited budgets. If they could afford high-end graphics workstations, they'd be using high-end graphics workstations. Apple is right to ignore this market; there is little serious money to be made from "gamers."



    Oh, and before all you casual gamers start spamming me: I'm not talking about you, and you know it. If you are on this site, reading this, you clearly have a life, even if you do occasionally enjoy a game. It is the obsessive gamers, the fanatics, who are the most radical about advocating for the Wintel platform.
  • Reply 72 of 102
    Quote:
    Originally Posted by ascii View Post


    Yes, good point. Windows 7 is being treated like the second coming, but if you stop and think, how many new features does it have really? This is one rare instance when MS marketing has beaten Apple. Very similar product but completely different perceptions.



    "Very similar?"



    Not even close...



    http://en.wikipedia.org/wiki/Features_new_to_Windows_7
  • Reply 73 of 102
    jeffdm Posts: 12,951
    Quote:
    Originally Posted by TheSnarkmeister View Post


    This supports my longstanding argument that gamers' embrace of the PC platform has never been about the quality of the experience, but rather about cheap hardware. Gamers are mostly kids and dysfunctional adults, both on limited budgets. If they could afford high-end graphics workstations, they'd be using high-end graphics workstations. Apple is right to ignore this market; there is little serious money to be made from "gamers."



    Oh, and before all you casual gamers start spamming me: I'm not talking about you, and you know it. If you are on this site, reading this, you clearly have a life, even if you do occasionally enjoy a game. It is the obsessive gamers, the fanatics, who are the most radical about advocating for the Wintel platform.



    Prejudiced much?
  • Reply 74 of 102
    Quote:
    Originally Posted by Panu View Post


    Steve Ballmer said that Windows 7 is Vista done right. So if Snow Leopard is a service pack, then Windows 7 is an apology. And you have to pay for that?



    Vista works perfectly fine for me. And to anyone who thinks Snow Leopard is a service pack: you are wrong. A service pack is a bunch of bug fixes and security updates in one package. Just because Snow Leopard doesn't have a brand-new interface does not mean it is not a new operating system.
  • Reply 75 of 102
    Quote:
    Originally Posted by BrandonLive View Post


    "Very similar?"



    Not even close...



    http://en.wikipedia.org/wiki/Features_new_to_Windows_7



    You sure about that?



    http://arstechnica.com/apple/reviews...os-x-10-6.ars/



    Well, you're right, they're not the same. One is improving upon an already solid OS that set the bar, while the other is "improving", or rather fixing, a failure (one that even Billy G. tried his best to distance himself from).



    In any case, it's pretty clear that your employers want everyone to just forget Vista ever existed and treat Windows 7 like it's The Second Coming, even though Windows users haven't had a decent OS since 2001. And XP was really nothing to be proud of. In this case I certainly hope Windows 7 will be dramatic. Your company needs it to be. Badly. A few more awkward and uncomfortable press conferences like the one just past (room full of Macs and all), and your boss Ballmer's head will explode, never mind flying chairs.
  • Reply 76 of 102
    jeffdm Posts: 12,951
    Quote:
    Originally Posted by bobertoq View Post


    Vista works perfectly fine for me.



    I guess it works well for someone.



    I bought a computer with Vista last Sunday; I haven't gotten around to using it very much yet. Right now, my biggest gripes are that some icons are hard to distinguish from each other, and that the updater doesn't really try to catch all the necessary updates in the first pass. I've run Windows Update about six times (not hyperbole), and every time it shows me something new that it couldn't be bothered to give me the previous five times. Before the service pack, every activation of UAC meant that the screen turned off for a second and then came back with the dialog asking whether I wanted to run a program. UAC asks me if I really want to run Windows Update, and then asks me if I really want to install the updates. Goofy.
  • Reply 77 of 102
    Quote:
    Originally Posted by JeffDM View Post


    UAC



    UAC is a complete nightmare, in all ways, including the nag you get periodically telling you that it's off: of course it's off, how could you not turn it off.
  • Reply 78 of 102
    Quote:
    Originally Posted by anonymouse View Post


    UAC is a complete nightmare, in all ways, including the nag you get periodically telling you that it's off: of course it's off, how could you not turn it off.



    Linux and OS X have had permissions implemented for years - long before Vista. Except that they're implemented properly on those operating systems.



    The situation has improved with Windows 7, apparently.
  • Reply 79 of 102
    Quote:
    Originally Posted by TheSnarkmeister View Post


    This supports my longstanding argument that gamers' embrace of the PC platform has never been about the quality of the experience, but rather about cheap hardware. Gamers are mostly kids and dysfunctional adults, both on limited budgets. If they could afford high-end graphics workstations, they'd be using high-end graphics workstations. Apple is right to ignore this market, there is little serious money to be made from "gamers."



    Oh, and before all you casual gamers, start spamming me. I'm not talking about you, and you know it. If you are on this site, reading this, you clearly have a life, even if you do occasionally enjoy a game. It is the obsessive gamers, the fanatics that are most radical and fanatical about advocating for the Wintel platform.



    I would be highly offended if not for your last paragraph. You're wrong about most gamers going for the cheap hardware. Look, in PC gaming, a "gaming rig" costs the same as, or just a bit more than, an off-the-shelf midrange Dell. The difference is you choose the right parts so you get bang for your buck. For example, a quad-core is overkill; take that money and put it towards the GPU. A midrange Dell usually has an absolutely sh*t GPU, because they bamboozle you with "the latest and greatest Intel CPU OMFG AWESOME".



    As someone pointed out before, a 9600GT is probably midrange or low-end now in the mainstream PC gaming world... This GPU is distinctly ahead of most offerings in the latest and greatest iMacs and the 13" MacBook "Pro" from Apple.



    While fanatics for the Wintel platform exist, I think the real obsessive gamers go to the Xbox 360. That shows a far more intense gaming culture than Wintel. Most PC gamers now complain that the PC games we get are usually console ports. Can you believe that if you have an ATI card, the latest Batman game DOESN'T EVEN HAVE ANTI-ALIASING? Just like Dead Space, another console port with the PC given only a second thought... Great game, though, still.



    Quote:
    Originally Posted by JeffDM View Post


    I guess it works well for someone.



    I bought a computer with Vista last Sunday; I haven't gotten around to using it very much yet. Right now, my biggest gripes are that some icons are hard to distinguish from each other, and that the updater doesn't really try to catch all the necessary updates in the first pass. I've run Windows Update about six times (not hyperbole), and every time it shows me something new that it couldn't be bothered to give me the previous five times. Before the service pack, every activation of UAC meant that the screen turned off for a second and then came back with the dialog asking whether I wanted to run a program. UAC asks me if I really want to run Windows Update, and then asks me if I really want to install the updates. Goofy.





    Vista with the Service Pack helps, but yeah, it can still be goofy sometimes. My Dawn of War: Soulstorm crashes every 2 hours. Sure, my CPU and GPU are overclocked, but Left 4 Dead, Dead Space, and Trine all handle it fine.



    Running Vista Ultimate 64-bit (64-bit kernel, though most apps are 32-bit).
  • Reply 80 of 102
    Quote:
    Originally Posted by anonymouse View Post


    UAC is a complete nightmare, in all ways, including the nag you get periodically telling you that it's off: of course it's off, how could you not turn it off.



    Quote:
    Originally Posted by Quadra 610 View Post


    Linux and OS X have had permissions implemented for years - long before Vista. Except that they're implemented properly on those operating systems.



    The situation has improved with Windows 7, apparently.



    UAC in Windows 7 is improved in the sense that you can more easily tell it to shut the f*** up. That's mostly it, AFAIK.