Apple switches from AMD to NVIDIA with next-generation MacBook Pro

Posted in Current Mac Hardware · edited January 2014
With the announcement of its next-generation MacBook Pro at WWDC on Monday, Apple showed its intent to move away from discrete AMD Radeon video cards to new Kepler-based silicon from NVIDIA.

While Apple's entire laptop lineup saw minor refreshes on Monday, the company's 15-inch MacBook Pros received a significant across-the-line graphics upgrade to NVIDIA's newest GeForce GT 650M GPU, proving May rumors of a supplier switch to be true.

The implementation of the new card is unlike Apple's traditional tiered offerings, which gave customers the option of choosing between a number of different GPU models. As of Monday's refresh, the only available card is the GeForce GT 650M, with the memory configuration dependent on which CPU is selected.

For example, the carry-over 15-inch MacBook Pro with an upgraded 2.3GHz Ivy Bridge processor comes with a non-configurable GT 650M with 512MB of memory, while the same model with a 2.6GHz CPU is fitted with a GT 650M with 1GB of addressable RAM. Both versions of the top-of-the-line "next generation" MacBook Pro include the 1GB configuration as standard, most likely because of the graphics horsepower needed to drive the Retina display.

NVIDIA's new notebook GPU, based on the next-generation Kepler architecture, is highly efficient and can power the 5,184,000 pixels of the MacBook Pro's 2,880-by-1,800-pixel Retina display without forcing users to plug in when in discrete graphics mode.

GeForce GT 650M
NVIDIA's GeForce GT 650M notebook GPU is set to power Apple's 15-inch MacBook Pro lineup. | Source: NVIDIA


When Apple launched its first Intel-based MacBook Pro in 2006, the unit included a graphics card made by ATI, the company responsible for a majority of the video cards used in Apple's legacy PowerPC machines.

The Ontario-based company was purchased by Intel rival AMD near the end of 2006, and subsequent Macs were using NVIDIA cards by the following year. AMD ultimately killed off the ATI moniker and now markets its GPUs under the Radeon name.

Apple went on to use integrated Intel designs as well as NVIDIA's own chipset in 2008, but returned to the Radeon series in early 2011.

Interestingly, Apple's move away from NVIDIA came at the tail end of an Intel and NVIDIA patent dispute that saw the graphics chipmaker halt chipset development for a time at the end of 2009. The dispute was ultimately settled in January 2011 for $1.5 billion.

Monday's announcement signals both a return to NVIDIA silicon and a new marketing strategy for Apple, as only one card, the GeForce GT 650M, will be used in MacBook Pro models that feature discrete graphics. Apple will continue to use Intel's HD Graphics family of integrated GPUs, which has become the default graphics baseline for all MacBooks from the Air to the Pro.

Comments

  • Reply 1 of 20
    tallest skil Posts: 43,388 member


    I still don't trust nVidia, but it's nice to see Apple putting faith back in them.

  • Reply 2 of 20
    MyDogHasFleas

    So if AMD is "discreet" what does that make nVidia? Ill-mannered?

    Update: they fixed it
  • Reply 3 of 20
    ksec Posts: 1,569 member

    Quote:

    Originally Posted by Tallest Skil

    I still don't trust nVidia, but it's nice to see Apple putting faith back in them.

    Well, these 28nm Kepler GPUs are pretty damn good. And from the looks of things Apple has slowed down the pace of OpenCL, which means the advantage of faster compute with ATI isn't much use.

  • Reply 4 of 20
    mdriftmeyer Posts: 7,503 member


    What the hell are you folks smoking? Put it down. You are talking out your rear ends. Both Kepler and GCN are discrete GPGPU architectures.

    Looks like just when Kepler is starting to stamp out in reasonable yields, AMD has leaked bits on the new 8000 series:

    http://www.nordichardware.com/news/71-graphics/46017-kodnamn-foer-amd-radeon-hd-8000-serien-synas-i-senaste-drivrutinerna.html

  • Reply 5 of 20

    Quote:

    Originally Posted by Tallest Skil

    I still don't trust nVidia, but it's nice to see Apple putting faith back in them.

    Yeah, I'm not much of a fan of nVidia either.

    Anyway, AMD seems to have some nice things coming up.

  • Reply 6 of 20
    hattig Posts: 860 member

    Quote:

    Originally Posted by ksec

    Well, these 28nm Kepler GPUs are pretty damn good. And from the looks of things Apple has slowed down the pace of OpenCL, which means the advantage of faster compute with ATI isn't much use.

    The GK104 is pretty good.


    This is GK107. Let's wait for the reviews to see how much of an upgrade it actually is.

  • Reply 7 of 20
    wizard69 Posts: 13,377 member
    ksec wrote: »
    Well, these 28nm Kepler GPUs are pretty damn good. And from the looks of things Apple has slowed down the pace of OpenCL, which means the advantage of faster compute with ATI isn't much use.

    Except this isn't the case at all; Apple is slowly but surely expanding the use of OpenCL. What's more, the big advantage of OpenCL is its support in apps that many users actually use, and that faster compute is very important to many of them. I'm not sure where all the misinformation related to OpenCL comes from; it is perhaps one of Apple's most successful initiatives in a long time. Further, I'm not convinced that the Kepler series was the best choice for a laptop based on power consumption.
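
    To make "compute" concrete: here's a bare-bones OpenCL sketch (illustrative only, written against the pyopencl bindings; the kernel and the names in it are made up for the example, not taken from anything Apple ships). It just squares a million floats on whichever GPU the context is built against.

    import numpy as np
    import pyopencl as cl

    # Pick an OpenCL device (on a dual-GPU MacBook Pro this could be the
    # integrated Intel chip or the discrete GeForce GT 650M).
    ctx = cl.create_some_context()
    queue = cl.CommandQueue(ctx)

    # A trivial kernel: square every element of the input array.
    src = """
    __kernel void square(__global const float *x, __global float *y) {
        int i = get_global_id(0);
        y[i] = x[i] * x[i];
    }
    """
    prog = cl.Program(ctx, src).build()

    a = np.arange(1000000, dtype=np.float32)
    mf = cl.mem_flags
    a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
    out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

    # One work-item per element; the GPU runs them in parallel.
    prog.square(queue, a.shape, None, a_buf, out_buf)

    result = np.empty_like(a)
    cl.enqueue_copy(queue, result, out_buf)

    Whichever device the context is built on is the chip that actually runs the kernel, so driver quality and raw single- and double-precision throughput on the 650M versus the old Radeons is exactly where this comparison matters.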
  • Reply 8 of 20
    wizard69 Posts: 13,377 member
    mdriftmeyer wrote: »
    What the hell are you folks smoking? Put it down. You are talking out your rear ends. Both Kepler and GCN are discrete GPGPU architectures.
    You need to look closely at some of the benchmarks floating around. Kepler is crap when it comes to compute, especially double precision. The Kepler NVidia intends to market for compute is still some time off in the future.
    mdriftmeyer wrote: »
    Looks like just when Kepler is starting to stamp out in reasonable yields, AMD has leaked bits on the new 8000 series:
    I'm still not pleased with the move to NVidia. In a laptop I guess it doesn't matter, as today you will often be running on the integrated chip. This does bring up an interesting question, though: on the next-gen Retina machines, do they make use of the integrated video at all?

    What is even more perplexing about this move is that the direction AMD was taking its GPUs seemed far more suitable for Apple's architecture than NVidia's. So I really have to wonder what happened.
  • Reply 9 of 20
    MacPro Posts: 19,851 member
    wizard69 wrote: »
    I'm still not pleased with the move to NVidia. In a laptop I guess it doesn't matter, as today you will often be running on the integrated chip. This does bring up an interesting question, though: on the next-gen Retina machines, do they make use of the integrated video at all?
    What is even more perplexing about this move is that the direction AMD was taking its GPUs seemed far more suitable for Apple's architecture than NVidia's. So I really have to wonder what happened.

    I'm thinking ... Intel said, "Hey Apple, don't use ATI and we'll get you those new Xeons for the Mac Pros!"
  • Reply 10 of 20
    doh123 Posts: 323 member


    What's with the love for AMD GPUs? The drivers are horrid, and there are all types of problems that have to be worked around... they are not all that great.

  • Reply 11 of 20
    Marvin Posts: 15,486 moderator
    wizard69 wrote: »
    Kepler is crap when it comes to compute, especially double precision. The Kepler NVidia intends to market for compute is still some time off in the future.

    Both platforms were supposed to be aiming at 5 DP GFLOPS per watt, but it seems that NVidia's desktop GPUs aren't looking good at all vs. AMD's lineup:

    http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-14.html
    http://www.tomshardware.com/reviews/geforce-gtx-680-review-benchmark,3161-15.html

    The GT 650M doesn't look too bad vs. the old 6750M:

    http://clbenchmark.com/device-info.jsp?config=12062576
    http://clbenchmark.com/device-info.jsp?config=11905871

    but I expect the 7770M would be comparable, if not more powerful. It might have been an issue of availability, power consumption or any number of things really.
  • Reply 12 of 20
    wonttell

    Quote:

    Originally Posted by doh123

    What's with the love for AMD GPUs? The drivers are horrid, and there are all types of problems that have to be worked around... they are not all that great.

    It is more hate for Nvidia (not only the GPUs but Nvidia in general) than love for AMD. With people like Charlie Demerjian constantly bashing Nvidia any chance he gets... true or not, it gets into people's heads.

    I usually bought AMD CPUs and video cards back when I built my own PCs, and I never really had any real problems with AMD drivers. At work, however, it is an entirely different story. We do high-end scientific modeling software, and after spending many long hours debugging problems that only seem to happen with ATI/AMD GPUs, we dropped support for AMD. It is cheaper to buy our customers a $1,500 Nvidia card than to delay a product release because of AMD-related bugs.

  • Reply 13 of 20
    allblue Posts: 393 member

    Quote:

    Originally Posted by MyDogHasFleas

    So if AMD is "discreet" what does that make nVidia? Ill-mannered?

    I don't know much about the technicalities, but I'd always assumed that a discreet GPU was one that was too embarrassed to show pr0n.

  • Reply 14 of 20
    nvidia2008 Posts: 9,262 member

    Quote:

    Originally Posted by MyDogHasFleas

    So if AMD is "discreet" what does that make nVidia? Ill-mannered?

    LOL.

  • Reply 15 of 20
    nvidia2008 Posts: 9,262 member

    Quote:

    Originally Posted by wonttell

    It is more hate for Nvidia (not only the GPUs but Nvidia in general) than love for AMD. With people like Charlie Demerjian constantly bashing Nvidia any chance he gets... true or not, it gets into people's heads.

    I usually bought AMD CPUs and video cards back when I built my own PCs, and I never really had any real problems with AMD drivers. At work, however, it is an entirely different story. We do high-end scientific modeling software, and after spending many long hours debugging problems that only seem to happen with ATI/AMD GPUs, we dropped support for AMD. It is cheaper to buy our customers a $1,500 Nvidia card than to delay a product release because of AMD-related bugs.

    In 2009-2011 I had endless trouble with AMD drivers in WinXP2/Win7. That's why 10 months ago I finally dumped PC gaming altogether and went to the Xbox 360. Sad, because ATI came back with a vengeance with the 5800 series, especially the 5850, which was a godsend. Pity about the drivers, though.

    As I've bitched about before, I like that I have an Nvidia 320M in my 13" MBP, but except for the Retina MBP, Intel Ivy Bridge graphics are more than sufficient for most people at 13" or even the lower-end 15".

    This deal with Apple is probably the one bright spot in Nvidia's history this decade.

    Not to mention nobody is making a "next-gen console" yet.

    At this rate all you need to do is plonk a two-year-old discrete GPU into the 11" MacBook Air, downspec it a little, rip out the screen, and voila: a $500 "next gen" super-duper 1080p Unreal Engine 4 DX11-quality gaming console that will blow away anything else except for hardcore PC gaming.

    Tim Cook must be laughing his ass off, and I hope Steve is too from up above.

  • Reply 16 of 20
    GoonerYoda


    I miss the days of Diamond Innovation video cards.

  • Reply 17 of 20
    nvidia2008 Posts: 9,262 member

    Quote:

    Originally Posted by GoonerYoda

    I miss the days of Diamond Innovation video cards.

    I miss my Matrox Rainbow Runner from the late '90s. And then my Voodoo card. That was back when getting a "video card" wasn't the joke it is today*. To be fair, an 8600 GT SLI pair was the pinnacle of last decade's GPUs.

    *The solution is not throwing thousands of watts at the problem and cobbling together so many GPUs and so much hardware that it could power a small house.

  • Reply 18 of 20
    wonttell

    Quote:

    Originally Posted by nvidia2008

    Not to mention nobody is making a "next-gen console" yet.

    "next-gen console" may be DOA if Apple start streaming games to iDevices soon enough. Gaikai and Nvidia claim they've made online streaming games as smooth as console games. IF that holds up and Apple can do a better job than Gaikai, I wouldn't mind spending $1/hr playing some mindless FPS games on an iTV. 

  • Reply 19 of 20
    brlawyer Posts: 828 member

    Quote:

    Originally Posted by Tallest Skil

    I still don't trust nVidia, but it's nice to see Apple putting faith back in them.

    Ditto here. I still remember the NVIDIA 8800 GS debacle, as widely reported by thousands; at least Apple had the customer service to replace my iMac's MOBO for free even though it was out of warranty.

  • Reply 20 of 20
    nvidia2008 Posts: 9,262 member

    Quote:

    Originally Posted by wonttell

    "next-gen console" may be DOA if Apple starts streaming games to iDevices soon enough. Gaikai and Nvidia claim they've made online game streaming as smooth as console games. If that holds up and Apple can do a better job than Gaikai, I wouldn't mind spending $1/hr playing some mindless FPS games on an iTV.

    Bingo.
