Comments
Ummm, Apple is an innovator, not a copier.
Plus we all know that anything they do is automatically correct.
Hmmmm...
The thought plickens.
that just blew my mind
That is what I asked myself too after seeing this chipset diagram last night. They put all the stuff in there that isn't necessary for Optimus, but still use the slow Nvidia GPUs.
My guess: it's about time-to-market. The ATI 5xxx series uses a different driver from previous ATI graphics chips, whereas the nVidia 330 uses the same basic driver as the nVidia 320 did. Tweaking an existing driver takes far less time than writing a new one from scratch (because the existing ATI driver won't work with the new ATI chips). Apple also undoubtedly got a really good deal on the chips as part of the already-existing 13" MacBook nVidia chipset buys.
Will I upgrade to the new MBP? Doubtful. My year-old MBP works great and is plenty fast for what I do. Though I must admit that the Core i7 version is lust-worthy... $2200 worth of lust-worthy. Uhm, no, I think not...
Maybe NVIDIA should sue Apple for copying their tech.
Apple created its own solution to avoid putting Intel and Nvidia in conflict with its existing professional relationships.
I think it's more that Apple develop and support their own drivers so they can optimize the switching in the best way possible based on what the OS does. I doubt Optimus would improve on the 10 hours Apple have managed.
I think Apple and NVidia have a good partnership going and that shows with NVidia building a custom integrated chip for their 13" MBP. ATI do have better performing GPUs currently but NVidia GPUs get better software support.
They've made a good choice with the 320M and 330M - PC manufacturers are going with the 330M too on the higher end laptops. If we get 320M across the whole low-end, the lineup will be much stronger for developers to start using OpenCL, especially with 2 GPUs available.
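To illustrate why two GPUs make OpenCL more attractive for developers, a dispatcher could route small jobs to the integrated chip (saving power) and big ones to the discrete chip (for throughput). This is a hypothetical Python sketch — the device names and the threshold are made up for illustration, and these are not real OpenCL API calls:

```python
# Hypothetical sketch of routing GPU compute work between an integrated
# and a discrete GPU by workload size. Names/threshold are illustrative.

from dataclasses import dataclass

@dataclass
class Device:
    name: str
    compute_units: int
    integrated: bool

DEVICES = [
    Device("GeForce 320M (integrated)", 6, True),
    Device("GeForce GT 330M (discrete)", 8, False),
]

def pick_device(work_items: int, threshold: int = 1_000_000) -> Device:
    """Small jobs stay on the integrated GPU to save power;
    large jobs go to the discrete GPU for throughput."""
    if work_items < threshold:
        return next(d for d in DEVICES if d.integrated)
    return next(d for d in DEVICES if not d.integrated)
```

So `pick_device(10_000)` keeps light work on the integrated chip, while a multi-million-item kernel lands on the discrete one — the kind of policy a dual-GPU machine makes possible.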
Intel's OpenCL is abysmal.
Nvidia is 3rd behind Apple and AMD on OpenCL compliance and performance.
clever
Why, thank you.
I agree that nVidia is generally known for better drivers, particularly OpenGL drivers, but their recent efforts on the Mac haven't been too spectacular. For example, the 8800GT was weaker at Core Image acceleration than the much lower-end HD 2600 XT in the Mac Pro, and it took several OS updates to fix. nVidia GPUs run Call of Duty 4 noticeably worse in OS X than in Windows, whereas ATI GPUs show similar performance across both OSes, which suggests nVidia's OS X drivers are less optimized. nVidia's drivers in 10.6.0 and 10.6.1 had a bug that caused stuttering in Bioshock which wasn't present with ATI cards, and it took until 10.6.2 to fix. I believe nVidia is also behind ATI in OpenGL 3.0 support in OS X: nVidia DX10 GPUs support 21 of 23 OpenGL 3.0 extensions whereas ATI GPUs support 22 of 23, meaning ATI is only waiting on Apple to implement GLSL 1.30 in its OpenGL front-end and software renderer.
I think Apple developing their own dynamic GPU switching implementation is a sign that they don't want to get too closely tied to nVidia. Although not confirmed, I think Apple's efforts make the most sense if they built a vendor-agnostic way to do dynamic switching rather than relying on Optimus, which is nVidia-only. Either Apple made switching that truly doesn't care which discrete GPU is used (nVidia Optimus-capable GPUs, nVidia non-Optimus GPUs, and any ATI GPU would all work), in which case ATI GPUs are already a viable option; or Apple made a layer that abstracts how a vendor's specific switching implementation interacts with the OS, with some degree of hardware support for dynamic switching required on the GPU itself, in which case Apple won't support ATI mobile GPUs until ATI has a native hardware answer to Optimus. Wanting a vendor-agnostic solution makes sense so Apple doesn't eventually have to deal with the driver/OS peculiarities of two different switching schemes from ATI and nVidia. I believe this is how Apple handles OpenGL too: on Windows, Intel, ATI, and nVidia each write the whole OpenGL stack, whereas on the Mac, Apple writes a common front-end (giving greater feature parity and abstracting the hardware implementation of the OpenGL spec) while Intel, ATI, and nVidia write back-ends that translate those instructions for their hardware.
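The vendor-agnostic idea described above — a common OS-side front-end dispatching to interchangeable per-vendor back-ends — can be sketched roughly like this. All class and method names here are hypothetical; Apple's actual driver interfaces are not public:

```python
# Hypothetical sketch of a vendor-agnostic graphics-switching front-end.
# The OS talks only to GraphicsSwitcher; each vendor supplies a back-end
# implementing the same small interface. Names are illustrative only.

from abc import ABC, abstractmethod

class GPUBackend(ABC):
    """The 'how': each vendor implements power control for its hardware."""
    def __init__(self):
        self.active = False
    @abstractmethod
    def power_up(self) -> None: ...
    @abstractmethod
    def power_down(self) -> None: ...

class NvidiaBackend(GPUBackend):
    def power_up(self): self.active = True
    def power_down(self): self.active = False

class ATIBackend(GPUBackend):
    def power_up(self): self.active = True
    def power_down(self): self.active = False

class GraphicsSwitcher:
    """The 'when': the OS-side front-end decides to switch based on what
    running apps need, without caring which vendor's GPU is installed."""
    def __init__(self, integrated: GPUBackend, discrete: GPUBackend):
        self.integrated, self.discrete = integrated, discrete
        self.integrated.power_up()  # IGP drives the display by default

    def app_needs_gpu(self, needs: bool) -> None:
        if needs:
            self.discrete.power_up()
        else:
            self.discrete.power_down()
```

The point of the split is that swapping `NvidiaBackend` for `ATIBackend` changes nothing in `GraphicsSwitcher` — which mirrors the OpenGL front-end/back-end arrangement described in the comment above.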
Apple choosing nVidia this round is probably largely separate from the dynamic GPU switching issue. The GT 330M was probably chosen, as others mentioned, due to driver maturity, availability, price, and bundling with the 320M.
If the graphics switching is a function of OS X and the nVidia card, what can we expect with Windows using either Parallels or Boot Camp? Will graphics switching work in Windows?
It should be built into the driver, so I'd expect a Windows driver to use Optimus. If it doesn't do the switching, you'll just get the dedicated chip all the time on the 15", and the integrated chip will probably just consume power on the 13".
This was a good question; I was wondering the same thing, since I got the 15" MBP. So I can use Optimus in Windows 7 when I'm in Boot Camp, because NVIDIA's drivers enable it? If so, that's sweet.
Mac OS X already had a power-efficiency advantage over Windows when comparing similar tasks dual-booting on a Mac. I wonder how much Apple's graphics switching solution will help power draw when the discrete GPU is active, considering that the IGP won't be active while the GPU is running in Mac OS X, and there's none of the additional overhead Optimus has, since there is no Switchable Graphics applet and no profiles to maintain.
Also, it looks like Optimus' activation is based on apps. To me that means that if I've launched WoW but am not actually playing — instead using Firefox — the GPU would be running despite not yet being needed. Am I understanding that correctly?
Here is the whitepaper on it...