
Nvidia says new MacBook Pro graphics switching isn't Optimus

post #1 of 37
Thread Starter 
The automatic graphics switching capability in the new 15- and 17-inch MacBook Pros is accomplished with a solution created entirely by Apple, and does not rely on Nvidia's established Optimus technology.

An Nvidia spokesperson confirmed to AppleInsider on Tuesday that Apple's new high-end MacBook Pros include an automatic graphics switching solution that is the Mac maker's own creation. Nvidia had no input on Apple's solution and would not comment on the technology Apple used.

Earlier this year, Nvidia introduced a new technology called Optimus, which is designed to work alongside Nehalem notebook designs -- like the Core i5 and Core i7 -- that include Intel's integrated graphics processor, as well as a discrete Nvidia graphics chip. The feature chooses the better of the two processors for running a given application. While the end result is similar, Apple's automatic graphics switching solution is not Optimus, Nvidia said.

This switching is accomplished on the fly with no input from users. Apple also offers users the option to switch solely to discrete graphics and turn off the automatic switching.

The top-tier MacBook Pros include the Nvidia GeForce GT 330M graphics processor, which is more than twice as fast as the low-end 320M found exclusively in the new 13-inch model. The 330M, however, is not exclusive to Apple; it has already appeared in competing PCs on the market.

Apple on Tuesday introduced its new line of MacBook Pros, with the 15- and 17-inch models sporting the proprietary graphics switching capability. The feature dynamically switches between the 330M for peak performance and the integrated Intel HD Graphics for more energy-efficient operation. Apple said the tightly integrated software and hardware solution allows battery life of 8 to 9 hours on the new MacBook Pros.
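To make the idea concrete, here is a minimal conceptual sketch, in Python, of the kind of policy such an OS-level switcher could apply. It is purely illustrative and not Apple's actual implementation; the framework names, the decision rule, and the data structures are assumptions.

```python
# Illustrative only: a toy OS-level switching policy, not Apple's implementation.
# Assumption: the switcher can see which frameworks each running app is using and
# treats a short list of them as "needs the discrete GPU".

GPU_HEAVY_FRAMEWORKS = {"OpenGL", "Core Animation", "Quartz Composer"}  # assumed list

def preferred_gpu(running_apps, automatic_switching=True):
    """running_apps: list of (app_name, set_of_frameworks). Returns 'discrete' or 'integrated'."""
    # Mirrors the user-facing option described above: with automatic switching
    # turned off, the machine stays on the discrete GPU.
    if not automatic_switching:
        return "discrete"
    needs_discrete = any(frameworks & GPU_HEAVY_FRAMEWORKS
                         for _name, frameworks in running_apps)
    return "discrete" if needs_discrete else "integrated"

print(preferred_gpu([("Mail", {"AppKit"})]))                            # integrated
print(preferred_gpu([("Mail", {"AppKit"}), ("Game", {"OpenGL"})]))      # discrete
print(preferred_gpu([("Mail", {"AppKit"})], automatic_switching=False)) # discrete
```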



It was in October 2008 that Apple jettisoned Intel's supporting chipsets from its MacBook line, and opted instead for the better performance offered by Nvidia's GPUs. MacBook Pros received similar treatment, with the addition of a secondary, more powerful Nvidia GeForce 9600M GT discrete graphics processor for higher performance operations.

Apple had to rely on graphics switching in the notebooks introduced Tuesday because the new Arrandale processors build the major northbridge functions, including the memory controller, into the processor package. Those architectural changes, along with an ongoing lawsuit that has forced Nvidia to halt development of future chipsets, have required PC manufacturers like Apple to rely on Intel's own chipsets and their integrated graphics processors. Apple's automated switching solution lets the new notebooks use both Intel's integrated graphics and Nvidia's 330M GPU.

Apple has touted that the 330M is the "fastest graphics ever" found in a Mac notebook. The high-end discrete graphics processor is available only in the 15- and 17-inch MacBook Pro models.

"With 48 processing cores and up to 512MB of dedicated video memory, this graphics processor delivers even more horsepower than the previous generation," the company said. "And you dont have to sacrifice efficiency for speed: The NVIDIA GeForce GT 330M is up to 30 percent more energy efficient than its predecessor. For even greater power savings, MacBook Pro also includes integrated Intel HD Graphics."
post #2 of 37
Maybe NVIDIA should sue Apple for copying their tech
post #3 of 37
Quote:
Originally Posted by Mazda 3s View Post

Maybe NVIDIA should sue Apple for copying their tech

ummm, apple is an innovator, not a copier.

iPad2 16 GB
iPhone 5 32 GB

post #4 of 37
I'll guess it's wholly Apple's design because Apple has the rights from Intel to do it, whereas NVIDIA does not.
post #5 of 37
sweet, you can turn off the switching if you want. I figured they wouldn't give us the option :P
post #6 of 37
Quote:
Originally Posted by Vertical View Post

sweet, you can turn off the switching if you want. I figured they wouldn't give us the option :P

Yeah that's pretty sweet. I also didn't expect that.
post #7 of 37
So why did they go with Nvidia, assuming they could have done the same thing with the ATi 5XXX series? Perhaps better drivers (damn, answered my own question). But will you be able to dynamically switch without logging off or restarting when running Windows in Boot Camp?
post #8 of 37
Quote:
Originally Posted by AppleInsider View Post

Apple also offers users the option to switch solely to discrete graphics and turn off the automatic switching.

Can we get a confirmation that you can ONLY switch solely to discrete graphics and not switch solely to integrated graphics?

Seems like this should be the other way around since I see no reason for the system to drop back to the IGP if it can handle it.

Plus, are there settings for this based on the power source in use?
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"
post #9 of 37
Hmmmm...

The thought plickens.
Pity the agnostic dyslectic. They spend all their time contemplating the existence of dog.
post #10 of 37
And yet Mac OS X 10.6.3 still doesn't have full OpenGL 3.0 drivers. What's the point of having a shit-hot graphics card with OpenGL 4.0 capabilities if it's gimped by drivers, as the 9600M is?
post #11 of 37
Any word yet whether Optimus technology will be available when running Windows via Boot Camp on the MBP?
post #12 of 37
Quote:
Originally Posted by Ginja View Post

So why did they go with Nvidia, assuming they could have done the same thing with the ATi 5XXX series? Perhaps better drivers (damn, answered my own question). But will you be able to dynamically switch without logging off or restarting when running Windows in Boot Camp?

Exactly. If Apple managed to figure out dynamic GPU switching on their own and it isn't nVidia Optimus, I wonder what the reason is to stick with nVidia's slower, less efficient GPUs? The GT330M has a TDP of 23W, while the Mobility HD5650 is faster yet uses less power and produces less heat with a TDP of 15-19W. The Mobility HD5750 is faster still, with significantly more bandwidth thanks to GDDR5, and has a TDP of 25W. The GT330M may be the fastest GPU in Mac notebooks, but clearly Apple could have done a lot better if they wanted to.
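For a rough sense of what those TDP gaps could mean in practice, here's a back-of-the-envelope calculation in Python. The battery capacity and the non-GPU system draw are assumed figures for illustration only; the TDPs are the ones quoted above, with 17W used as a midpoint of the HD5650's 15-19W range.

```python
# Back-of-envelope only: assumed numbers, not measurements.
battery_wh = 77.5          # assumed capacity, roughly a 2010 15" MacBook Pro pack
base_system_draw_w = 12.0  # assumed draw of everything other than the discrete GPU

for gpu, tdp_w in [("GeForce GT 330M", 23.0),
                   ("Mobility HD 5650", 17.0),   # midpoint of the 15-19W quoted above
                   ("Mobility HD 5750", 25.0)]:
    hours = battery_wh / (base_system_draw_w + tdp_w)
    print(f"{gpu}: ~{hours:.1f} h if the GPU ran at its full TDP the whole time")
```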
post #13 of 37
Quote:
Originally Posted by solofest View Post

Any word yet whether Optimus technology will be available when running Windows via Boot Camp on the MBP?

That is a good question.

This might have been, in part, to hobble Windows on Macs, but that move has pros and cons.
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"
post #14 of 37
That's because it's actually . . . Megatron.

RUN!!
post #15 of 37
Quote:
Originally Posted by justflybob View Post

Hmmmm...

The thought plickens.



clever
post #16 of 37
Quote:
Originally Posted by Ginja View Post

So why did they go with Nvidia, assuming they could have done the same thing with the ATi 5XXX series? Perhaps better drivers (damn, answered my own question). But will you be able to dynamically switch without logging off or restarting when running Windows in Boot Camp?


Maybe Nvidia just gave Apple the code or some implementation details to call its own, to avoid any further licensing issues between Intel and Nvidia? And also to seal the GPU deal.
post #17 of 37
Quote:
Originally Posted by Ginja View Post

So why did they go with Nvidia, assuming they could have done the same thing with the ATi 5XXX series? Perhaps better drivers (damn, answered my own question). But will you be able to dynamically switch without logging off or restarting when running Windows in Boot Camp?

That is what I asked myself too after seeing the chipset diagram last night. They put all the stuff in there that isn't necessary for Optimus, yet still use the slow Nvidia GPUs. A 5830 has a TDP only 1W higher than the 330M and a 3DMark Vantage score almost twice as high. A 5650 is less power-hungry and still faster than a 330M, and the way auto switching seems to be implemented, they could just as easily have used ATI.
The only reasons I can think of why they would prefer Nvidia are:
1. They get the same vendor across the whole MBP line, 13" to 17".
2. Nvidia's better OpenCL performance. Doubtful, since they use ATI in the iMac.

I intended to get a 15" since I could use the bigger screen, but now I think I'll just get the cheaper 13". That way I don't spend an insane amount of money on a weird 15" MBP and can still switch if they update them again. By then they might finally have a serious GPU upgrade with more than 256MB of VRAM, plus Light Peak, and Sandy Bridge will be better too, although I don't care as much about CPU power as about everything else.
10 hours of battery life also sounds great, if it's true.
post #18 of 37
Quote:
Originally Posted by dusk View Post

Doubtful, since they use ATI in the iMac.

It might still be a driver issue; the iMac uses the 4XXX series cards. Also, when they do manage to release some drivers for the 5XXX series, I'll be straight out to buy a 5850 to update the aging 8800GTS in my hackintosh, on the assumption that it won't be long until the OSX86 community get them working!
post #19 of 37
Quote:
Originally Posted by Foo2 View Post

I'll guess it's wholly Apple's design because Apple has the rights from Intel to do it, whereas NVIDIA does not.

I think it's more that Apple develop and support their own drivers so they can optimize the switching in the best way possible based on what the OS does. I doubt Optimus would improve on the 10 hours Apple have managed.

Quote:
Originally Posted by ltcommander.data

If Apple managed to figure out dynamic GPU switching on their own and it isn't nVidia Optimus, I wonder what the reason is to stick with nVidia's slower, less efficient GPUs?

I think Apple and NVidia have a good partnership going and that shows with NVidia building a custom integrated chip for their 13" MBP. ATI do have better performing GPUs currently but NVidia GPUs get better software support.

They've made a good choice with the 320M and 330M - PC manufacturers are going with the 330M too on the higher end laptops. If we get 320M across the whole low-end, the lineup will be much stronger for developers to start using OpenCL, especially with 2 GPUs available.
post #20 of 37
It's better in the long run for Apple, in case they decide to go with ATI or another company.
post #21 of 37
Quote:
Originally Posted by Mazda 3s View Post

Maybe NVIDIA should sue Apple for copying their tech

Apple created an Optimus-like technology purely to integrate with Nvidia chips. Why would Nvidia want less business? My guess is that Apple is doing this in software instead of hardware so that it can take advantage of extra information known only to the operating system.
post #22 of 37
Quote:
Originally Posted by freddych View Post

ummm, apple is an innovator, not a copier.

Plus we all know that anything they do is automatically correct.
post #23 of 37
Quote:
Originally Posted by justflybob View Post

Hmmmm...

The thought plickens.

that just blew my mind
post #24 of 37
Hey.... If it works - It Works
"Why iPhone"... Hmmm?
post #25 of 37
Quote:
Originally Posted by dusk View Post

That is what I asked myself too after seeing the chipset diagram last night. They put all the stuff in there that isn't necessary for Optimus, yet still use the slow Nvidia GPUs

My guess: it's about time-to-market. The ATI 5xxx series uses a different driver from previous ATI graphics chips, whereas the nVidia 330 uses the same basic driver as the nVidia 320 did. Tweaking an existing driver takes far less time than writing a new one from scratch (because the existing ATI driver won't work with the new ATI chips). Apple also undoubtedly got a really good deal on the chips as part of its existing 13" MacBook nVidia chipset buys.

Will I upgrade to the new MBP? Doubtful. My year-old MBP works great and is plenty fast for what I do. Though I must admit that the Core i7 version is lust-worthy... $2200 worth of lust-worthy. Uhm, no, I think not...
post #26 of 37
Well, at least that's one more thing Apple isn't relying on.
post #27 of 37
Quote:
Originally Posted by Mazda 3s View Post

Maybe NVIDIA should sue Apple for copying their tech

Apple created the solution to keep Intel and Nvidia from coming into conflict with their current professional relationships.
post #28 of 37
Quote:
Originally Posted by Marvin View Post

I think it's more that Apple develop and support their own drivers so they can optimize the switching in the best way possible based on what the OS does. I doubt Optimus would improve on the 10 hours Apple have managed.



I think Apple and NVidia have a good partnership going and that shows with NVidia building a custom integrated chip for their 13" MBP. ATI do have better performing GPUs currently but NVidia GPUs get better software support.

They've made a good choice with the 320M and 330M - PC manufacturers are going with the 330M too on the higher end laptops. If we get 320M across the whole low-end, the lineup will be much stronger for developers to start using OpenCL, especially with 2 GPUs available.

Intel's OpenCL is abysmal.

Nvidia is 3rd behind Apple and AMD on OpenCL compliance and performance.
post #29 of 37
Quote:
Originally Posted by Quadra 610 View Post



clever

Why, thank you.
Pity the agnostic dyslectic. They spend all their time contemplating the existence of dog.
post #30 of 37
Ars apparently talked to Apple at length today to get info about the graphics switching tech. This truly is more efficient than the other options that came before it.

http://arstechnica.com/apple/news/20...-switching.ars I'm not surprised Apple did this best, since they design both the OS and the hardware, but I am surprised it took them so long to implement it.
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"
post #31 of 37
This is a classic example of Apple saying, "We can do this better than anyone else" and then going out and doing it.
post #32 of 37
Quote:
Originally Posted by Marvin View Post

I think Apple and NVidia have a good partnership going and that shows with NVidia building a custom integrated chip for their 13" MBP. ATI do have better performing GPUs currently but NVidia GPUs get better software support.

They've made a good choice with the 320M and 330M - PC manufacturers are going with the 330M too on the higher end laptops. If we get 320M across the whole low-end, the lineup will be much stronger for developers to start using OpenCL, especially with 2 GPUs available.

I agree that nVidia is generally known for better drivers, particularly OpenGL drivers, but their recent efforts on the Mac haven't been too spectacular. For example, the 8800GT was weaker for Core Image acceleration than the much lower-end HD2600XT in the Mac Pro, an issue that took several OS updates to fix. Or nVidia GPUs running Call of Duty 4 noticeably worse in OS X than in Windows, whereas ATI GPUs show similar performance across both operating systems, meaning nVidia's OS X drivers are less optimized. Or the nVidia drivers in 10.6.0 and 10.6.1 having a bug that caused stuttering in Bioshock which wasn't present with ATI cards, requiring a wait for 10.6.2 for a fix. I believe nVidia is also behind ATI in OpenGL 3.0 support in OS X: nVidia DX10 GPUs support 21 of 23 extensions for OpenGL 3.0 whereas ATI GPUs support 22 of 23, meaning ATI is only waiting on Apple to implement GLSL 1.30 in their OpenGL front-end and software renderer.

I think Apple developing their own dynamic GPU switching implementation is a sign that they don't want to get too closely tied to nVidia. Although it isn't confirmed, I think Apple's effort makes the most sense if they built a vendor-agnostic way to do dynamic switching rather than relying on Optimus, which is nVidia-only. Either Apple made a switching scheme that truly doesn't care which discrete GPU is used -- Optimus-capable nVidia GPUs, non-Optimus nVidia GPUs, and any ATI GPU would all work -- in which case ATI GPUs are already a viable option. Or Apple made a layer that abstracts how a vendor's specific GPU-switching implementation interacts with the OS, with some degree of hardware support for dynamic switching required on the GPU itself, in which case Apple won't be supporting ATI mobile GPUs until ATI has a native hardware answer to Optimus. Apple wanting a vendor-agnostic solution makes sense so they don't eventually have to deal with the driver and OS peculiarities of two different ways of doing dynamic switching from ATI and nVidia. I believe this is how Apple handles OpenGL too: whereas on Windows Intel, ATI, and nVidia each write the whole OpenGL stack, on the Mac Apple writes the common front-end, which allows greater feature parity and abstracts the hardware implementation of the OpenGL spec, while Intel, ATI, and nVidia write the back-ends that translate those instructions to their hardware.
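To illustrate the vendor-agnostic layering I mean, here's a rough Python sketch. Every class and method name in it is hypothetical and has nothing to do with Apple's real driver interfaces; it only shows how the switching policy could be kept separate from a vendor-specific back-end.

```python
# Hypothetical sketch of a vendor-agnostic switching layer; all names are invented.

class DiscreteGPUBackend:
    """Vendor-supplied back-end: knows how to power its own GPU up and down."""
    vendor = "generic"
    def power_up(self): raise NotImplementedError
    def power_down(self): raise NotImplementedError

class NvidiaBackend(DiscreteGPUBackend):
    vendor = "NVIDIA GeForce GT 330M"
    def power_up(self):   print(f"{self.vendor}: powering up")
    def power_down(self): print(f"{self.vendor}: powering down")

class ATIBackend(DiscreteGPUBackend):
    vendor = "ATI Mobility Radeon"
    def power_up(self):   print(f"{self.vendor}: powering up")
    def power_down(self): print(f"{self.vendor}: powering down")

class GraphicsSwitcher:
    """OS-side front-end: decides when to switch, never how a GPU is driven."""
    def __init__(self, backend):
        self.backend = backend
    def set_discrete_active(self, active):
        if active:
            self.backend.power_up()
        else:
            self.backend.power_down()

# Swapping GPU vendors only changes the back-end object, not the switching logic.
GraphicsSwitcher(NvidiaBackend()).set_discrete_active(False)
GraphicsSwitcher(ATIBackend()).set_discrete_active(True)
```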

Apple choosing nVidia this round is probably largely separate from the dynamic GPU switching issue. The GT330M was probably chosen, as others mentioned, for driver maturity, availability, price, and bundling with the 320M.
post #33 of 37
If the graphics switching is a function of OS X and the nVidia card, what can we expect with Windows using either Parallels or Boot Camp? Will graphics switching work with Windows?
post #34 of 37
Quote:
Originally Posted by WindyCityGuy View Post

If the graphics switching is a function of OS X and the nVidia card, what can we expect with Windows using either Parallels or Boot Camp? Will graphics switching work with Windows?

It should be built into the driver, so I'd expect a Windows driver to use Optimus. If it doesn't do the switching, you'll just get the dedicated chip all the time on the 15", and the integrated chip will probably just consume power on the 13".
post #35 of 37
Quote:
Originally Posted by Marvin View Post

It should be built into the driver, so I'd expect a Windows driver to use Optimus. If it doesn't do the switching, you'll just get the dedicated chip all the time on the 15", and the integrated chip will probably just consume power on the 13".

This was a good question; I was wondering the same thing. I got the 15" MBP. So I can use Optimus in Win 7 when I'm in Boot Camp, because NVIDIA's drivers enable it? If so, that's sweet.
"Overpopulation and climate change are serious shit." Gilsch
"I was really curious how they had managed such fine granularity of alienation." addabox
post #36 of 37
Quote:
Originally Posted by Aquatic View Post

This was a good question; I was wondering the same thing. I got the 15" MBP. So I can use Optimus in Win 7 when I'm in Boot Camp, because NVIDIA's drivers enable it? If so, that's sweet.

Mac OS X already had a power efficiency advantage over Windows when comparing similar tasks dual-booting on a Mac. I wonder how much Apple's graphics switching solution will help power use when the discrete GPU is active, considering that the IGP won't be active while the discrete GPU is running in Mac OS X, and there's none of the additional overhead Optimus has, since there is no Switchable Graphics applet and no profiles to maintain.

Also, it looks like Optimus's activation is based on apps. To me that means that even if I've launched WoW but am not actually playing, and am instead using Firefox, the GPU would be running despite not yet being needed. Am I understanding that correctly?
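For contrast, here's a purely illustrative sketch of what per-application, profile-based activation would look like; the profile table and app names are made up, so treat it as my reading of the behavior rather than how Optimus actually works internally.

```python
# Illustrative only: a toy profile-based policy in the style described above.
# If any app with a "discrete" profile is merely running, the discrete GPU stays
# selected, even while the user is actually working in something lightweight.

ASSUMED_PROFILES = {
    "World of Warcraft": "discrete",
    "Firefox": "integrated",
}

def profile_based_gpu(launched_apps):
    if any(ASSUMED_PROFILES.get(app) == "discrete" for app in launched_apps):
        return "discrete"
    return "integrated"

# WoW launched but sitting idle while browsing in Firefox: still "discrete",
# which is the scenario the question above is asking about.
print(profile_based_gpu(["World of Warcraft", "Firefox"]))   # discrete
print(profile_based_gpu(["Firefox"]))                        # integrated
```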

Here is the whitepaper on it...
http://www.nvidia.com/object/LO_opti...itepapers.html (PDF, 1.6MB)
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"
post #37 of 37
Quote:
Originally Posted by WindyCityGuy View Post

If the graphics switching is a function of OS X and the nVidia card, what can we expect with Windows using either Parallels or Boot Camp? Will graphics switching work with Windows?

Quote:
Originally Posted by Marvin View Post

It should be built into the driver, so I'd expect a Windows driver to use Optimus. If it doesn't do the switching, you'll just get the dedicated chip all the time on the 15", and the integrated chip will probably just consume power on the 13".

Quote:
Originally Posted by Aquatic View Post

This was a good question; I was wondering the same thing. I got the 15" MBP. So I can use Optimus in Win 7 when I'm in Boot Camp, because NVIDIA's drivers enable it? If so, that's sweet.




So does AnandTech, unfortunately...

http://www.anandtech.com/show/3659/a...he-one-to-get/
Dick Applebaum on whether the iPad is a personal computer: "BTW, I am posting this from my iPad pc while sitting on the throne... personal enough for you?"