Apple rumored to switch to Nvidia graphics for new MacBook Pros

Posted in Future Apple Hardware, edited January 2014
A pair of new reports claims that Apple's upcoming MacBook Pros will eschew AMD graphics processors in favor of GPUs from Nvidia.

The Cupertino, Calif., company's current 15-inch and 17-inch MacBook Pro models make use of AMD's Radeon HD 6750M and 6770M graphics chips. The 13-inch model features integrated Intel graphics.

According to The Verge, a "trusted source" has confirmed that Apple will switch to Nvidia as the supplier of its discrete GPUs for the MacBook Pro. ABC News' Joanna Stern, formerly of The Verge, reported separately that, according to her publication's own sources, the next generation of MacBook Pros will sport Nvidia graphics chips.

Monday's reports counter a rumor from March that Apple had decided to drop Nvidia's Kepler graphics cards from a "large number" of its next-gen MacBook Pros over supply issues.

Chatter surrounding Apple's Mac plans has picked up in recent weeks. Benchmarks purportedly from unreleased MacBook Pro and iMac models were spotted on the Geekbench site late last week. Piper Jaffray analyst Gene Munster issued a note speculating that new MacBooks and iMacs will arrive by June. The inconsistently accurate DigiTimes claimed that component suppliers expect new MacBook models to launch in June.

[Image: Apple's current 15-inch and 17-inch MacBook Pros feature AMD graphics.]


Bloomberg also chimed in on Monday with its own sources, claiming that Apple would unveil its new MacBook Pros at the Worldwide Developers Conference in June. According to the report, the next-generation laptops will be thinner and feature Retina Display-like screens.

AppleInsider's own sources revealed in February that Apple is planning a radical redesign of its professional notebooks that should slim down its MacBook Pros and make them more like the MacBook Air.

Comments

  • Reply 1 of 63
    solipsismx Posts: 19,566, member
    At least we're no longer hearing rumours about Apple dropping the discrete GPU in their MBPs in favour of Intel's integrated GPUs.
  • Reply 2 of 63
    tallest skil Posts: 43,388, member
    I'm still not sure I trust Nvidia after the 8600M fiasco.

  • Reply 3 of 63
    wizard69 Posts: 13,377, member

    Quote:
    Originally Posted by Tallest Skil View Post

    I'm still not sure I trust Nvidia after the 8600M fiasco.

    Yep, seems like a bad move on Apple's part, especially when AMD seems to have had the better track record over the last couple of years.
  • Reply 4 of 63
    hmm Posts: 3,405, member

    Quote:
    Originally Posted by Tallest Skil View Post

    I'm still not sure I trust Nvidia after the 8600M fiasco.

    I like their desktop graphics cards. Aside from that, DigiTimes has been trolling everyone all day today.

  • Reply 5 of 63


    BillyLavoie
    Makes sense considering all the work that has been going on with the Nvidia drivers for ML.

  • Reply 6 of 63
    erio Posts: 28, member
    Welcome back, flickering screen. Yay!
  • Reply 7 of 63
    doh123 Posts: 323, member
    After the 8600 though, Apple still used several other Nvidia chips... the 9400M, 9600M, 320M and 330M all came after that, and none of them had the 8600's issues.

  • Reply 8 of 63


    If the new MBPs are released around June 11th... when would they be in stores, ready to be bought?

  • Reply 9 of 63
    nvidia2008 Posts: 9,262, member

    Quote:
    Originally Posted by wizard69 View Post

    Yep, seems like a bad move on Apple's part, especially when AMD seems to have had the better track record over the last couple of years.

    Quote:
    Originally Posted by SolipsismX View Post

    At least we're no longer hearing rumours about Apple dropping the discrete GPU in their MBPs in favour of Intel's integrated GPUs.

    Quote:
    Originally Posted by hmm View Post

    I like their desktop graphics cards. Aside from that, DigiTimes has been trolling everyone all day today.
    In any case, the discrete GPU industry is on its way out. Only certain MBP 15" models and MBP 17"-esque machines will have discrete GPUs. AMD and Nvidia blew it, and Intel steamrolled them, legally and illegally (e.g. locking out Nvidia). In the meantime, PowerVR and ARM are eating everyone's breakfast, lunch and, soon, dinner.

    As I said before, my username perhaps marks the peak of discrete GPUs (even then the 8600M fiasco was quite bad, though that G92 GPU design was superb).

    I have the famed Nvidia 320M in my 2010 MBP 13". It's okay, but compared to an integrated-Intel MBP 13", nothing great.

    In any case, so many laptop discrete GPUs are such crippled versions of their desktop brethren that the difference between them and Intel is nothing more than marketing. 2GB of VRAM on a useless laptop discrete GPU? That's like those $300 HDMI cables.

    Throw in the whole "casual gaming" phenomenon and that's the killing blow to the discrete GPU industry.

    Consider this: just at the time when GPUs became ever more hot, heavy, expensive and noisy to support ever more complex, risky and expensive (software and hardware) games, people gravitated towards simpler, alternative "casual" games. The "perfect storm" that crushed almost all discrete GPU dreams.

    The only decent GPUs in the future will be:

    1. Megalithic desktop 500W-1kW multi-card setups, i.e. niche stuff

    2a. Intel integrated, which will benefit from modest improvements in GPU architecture but huge gains in CPU power (i.e. GPGPU not so essential, because the CPU still handles a lot of tasks, including custom routines for video encoding, said to be the forte of GPUs, which is now bollocks)

    2b. Intel integrated, which will benefit a lot from process improvements that currently outpace anything TSMC/AMD/Nvidia can achieve

    3. PowerVR, which will come in from the ground up, i.e. the iPad 3/4/5 GPU is the next great gaming GPU.

    So on one hand we have niche "high-performance" stuff with no real-world mainstream application, aging gaming consoles with now very paltry graphics, "next-gen" gaming consoles which are still dicey in terms of business "models", and Intel integrated, which is sufficient for mainstream computing but nowhere near gaming-class.

    And on the other hand... iPad. 'Nuff said.

  • Reply 10 of 63
    mdriftmeyer Posts: 7,503, member

    Quote:
    Originally Posted by BillyLavoie View Post

    Makes sense considering all the work that has been going on with the Nvidia drivers for ML.

    Like hell it does. Nvidia's mobile chips pale in comparison to AMD's on price and power, not to mention OpenCL performance.

  • Reply 11 of 63
    ksec Posts: 1,569, member
    Unless Nvidia has something up its sleeve, I wouldn't want a mobile GK107 in a MacBook. The GK107 would be faster than the 6770M, but we are only talking about a ~20% difference. Since the new GPU would be based on 28nm, I expect something much more powerful.
  • Reply 12 of 63
    nvidia2008 Posts: 9,262, member

    Quote:
    Originally Posted by mdriftmeyer View Post

    Like hell it does. Nvidia's mobile chips pale in comparison to AMD's on price and power, not to mention OpenCL performance.

    True, AMD/ATI turned their boat 180 degrees with the 5000 series, particularly the flagship performance/watt/price king, the 5850. Nvidia has been simply coasting sideways at best since the G92 (8600, 9600, GT250/260/whatever).

    Still, AMD's drivers are horrible (endless updates; Need for Speed: Shift and other games inexplicably laggy), while Nvidia supposedly has better drivers. But again, laptop discrete GPUs ain't that fantastic for the cost.

    OpenCL is a great idea, but in practice it has met little success. Folding@home on the GPU was for many years incredibly unstable, OpenCL or not.

  • Reply 13 of 63
    nvidia2008 Posts: 9,262, member

    Quote:
    Originally Posted by ksec View Post

    Unless Nvidia has something up its sleeve, I wouldn't want a mobile GK107 in a MacBook. The GK107 would be faster than the 6770M, but we are only talking about a ~20% difference. Since the new GPU would be based on 28nm, I expect something much more powerful.

    That's assuming they can get over their endless process issues with TSMC or whoever the fab-du-jour is. And for some reason Samsung's fabs have not really broken into the market for making small-process-node, great performance/watt laptop/desktop-class GPUs.

    It appears that due to "competition" everyone is going in five different directions, and the "farm" (as Steve Jobs put it) has all these animals that don't make sense. You've got an obese chicken that doesn't lay eggs, an anorexic pig that produces milk, and a three-legged horse that poops spinach coming out of the barn. (Not to mention Sony's Cell processor, which is now kinda like an upside-down cow; no surprise, since the PS3 looks like it's made for grilling hamburgers.)

    An industry-wide chaos that causes indigestion for the consumer.

    In terms of making kickass GPUs: Intel has the best fabs. PowerVR and ARM have the best mobile architectures. AMD has the best high-performance desktop/laptop architecture. Apple has the most stable software platform.

    Wouldn't it be a better world where AMD designs the high-performance desktop/laptop architecture, ARM and PowerVR design the mobile architecture, Intel fabs them, and Apple features them on iOS, Mac and gaming platforms, e.g. iPad, Apple TV? Tell me this doesn't sound attractive to almost all consumers.

    Of course, if most of the world had this kind of stellar collaboration, we could probably eradicate poverty by 2050. That said, in the tech world at least, it is possible, usually with a visionary leader such as Steve Jobs. So, will the next Steve Jobs please stand up, please stand up, please stand up?

    RIP Voodoo PC, by the way; the dude in charge talked up a lot of hype post-buyout by HP. Sadly, nothing much has come of it since.

    On a final note, who needs a beyond-average GPU in a laptop anyway? Only certain niches (aka "verticals"). Gaming has potential, but Windows is nonsense compared to the simplicity and tradeability of Xbox, PS and Wii, and Mac titles are, as always, progressing but still laughable.

  • Reply 14 of 63


    Smallwheels
    I just started a video production course with a four-year-old MacBook. I will need a better computer after I graduate. What hardware part is the most important for editing and creating video movies? I read about the GPU and the CPU and how over time both are improving. At this time discrete graphics processors seem to be ideal for video games or watching movies, but how do they figure into editing and creating movies?

    Having the most RAM possible is probably a good thing for all computing tasks. I just want to know what hardware item to concentrate on finding when I shop for my next computer. Experts, please advise me. Screen size isn't important on a laptop because I use an external monitor on my desk.

  • Reply 15 of 63
    kpluck Posts: 500, member

    Quote:
    Originally Posted by nvidia2008 View Post

    I have the famed Nvidia 320M in my 2010 MBP 13". It's okay, but compared to an integrated-Intel MBP 13", nothing great.

    The 320M is a more powerful GPU than Intel's HD 3000. It is only the better CPU in the newer MacBook Pro 13" that makes them faster. From a GPU standpoint, the current 13" was a downgrade.

    Quote:
    The new 13in MacBook Pros and their Intel HD Graphics 3000 processors weren't that impressive in our games tests, scoring lower than the older 13in systems with Nvidia GeForce 320M integrated graphics. In the 1024-by-768-resolution Call of Duty test, the 13in 2.3GHz Core i5 MacBook Pro displayed 26 fps (frames per second) on average, while the 13in 2.7GHz Core i7 MacBook Pro averaged 27 fps. Those results are well below the 33 fps displayed by the older 13in 2.66GHz Core 2 Duo MacBook Pro with Nvidia graphics.

    While a single game is far from a perfect benchmark, it is not hard to find other tests that back that up.

    -kpluck

  • Reply 16 of 63
    nvidia2008 Posts: 9,262, member

    Quote:
    Originally Posted by kpluck View Post

    The 320M is a more powerful GPU than Intel's HD 3000. It is only the better CPU in the newer MacBook Pro 13" that makes them faster. From a GPU standpoint, the current 13" was a downgrade.

    While that is true, that's also the spectre discrete GPUs face. The 13" was a downgrade, no doubt, but how many people "suffered" as a result? Since CPU power massively improved, the GPU downgrade was not really felt (aside from high-end games, which are rare for Mac users).

    Even for games, given the way the ports are done, Psychonauts is unplayable on my 320M even though it's not Intel, and even though that game is, well, very, very old.

    So while I enjoy the 320M's advantages in, say, OpenGL Photoshop, where zoom levels are antialiased/sampled properly, Intel GPUs increasingly can do this too.

    iPhoto and iMovie can leverage the GPU, but on my MBP 13" (320M, Core 2 Duo), using the GPU for certain iPhoto and iMovie tasks is ~slower~ than on Sandy Bridge.

    I guess I'm proposing that the promise of discrete GPUs has been obliterated by poor real-world implementation.

    The discrete GPU companies have also now painted themselves into a corner:

    How is this 100W-200W+ behemoth ever going to benefit our new mobile, tablet, sleek and slim lifestyle while still pushing ever greater interactive experiences?

    [Image: Nvidia GeForce GTX 670]

  • Reply 17 of 63
    hmm Posts: 3,405, member

    Quote:
    Originally Posted by mdriftmeyer View Post

    Like hell it does. Nvidia's mobile chips pale in comparison to AMD's on price and power, not to mention OpenCL performance.

    In laptops maybe, and not by much. Where are you getting your information?


    Quote:
    Originally Posted by Smallwheels View Post

    I just started a video production course with a four-year-old MacBook. I will need a better computer after I graduate. What hardware part is the most important for editing and creating video movies? I read about the GPU and the CPU and how over time both are improving. At this time discrete graphics processors seem to be ideal for video games or watching movies, but how do they figure into editing and creating movies?

    Having the most RAM possible is probably a good thing for all computing tasks. I just want to know what hardware item to concentrate on finding when I shop for my next computer. Experts, please advise me. Screen size isn't important on a laptop because I use an external monitor on my desk.

    You are asking the wrong questions. You need to do some research, because it depends on the application and how you will be using it. We're well past the days of "what computer do I need for X application?" I wouldn't worry about the computer until you're close to graduation anyway, as requirements will change in that time.


    Quote:
    Originally Posted by nvidia2008 View Post

    True, AMD/ATI turned their boat 180 degrees with the 5000 series, particularly the flagship performance/watt/price king, the 5850. Nvidia has been simply coasting sideways at best since the G92 (8600, 9600, GT250/260/whatever).

    Still, AMD's drivers are horrible (endless updates; Need for Speed: Shift and other games inexplicably laggy), while Nvidia supposedly has better drivers. But again, laptop discrete GPUs ain't that fantastic for the cost.

    OpenCL is a great idea, but in practice it has met little success. Folding@home on the GPU was for many years incredibly unstable, OpenCL or not.

    I disagree. OpenCL has seen some nice adoption, and it's extremely useful in many more things today. More people benefit from a strong GPU and OpenCL support today than would have a few years ago.


    Quote:
    Originally Posted by ksec View Post

    Unless Nvidia has something up its sleeve, I wouldn't want a mobile GK107 in a MacBook. The GK107 would be faster than the 6770M, but we are only talking about a ~20% difference. Since the new GPU would be based on 28nm, I expect something much more powerful.

    Is this year's AMD significantly better? Mobile GPUs were Nvidia's weakest point. It's a troll rumor anyway. Remember a few months ago? We had the same rumor, then a rumor they couldn't satisfy Apple, now another rumor of Nvidia, all from the same source. Laptops debuting later than initially expected by the masses = many, many fabricated rumors.

  • Reply 18 of 63
    nvidia2008 Posts: 9,262, member

    Quote:
    Originally Posted by Smallwheels View Post

    I just started a video production course with a four-year-old MacBook. I will need a better computer after I graduate. What hardware part is the most important for editing and creating video movies? I read about the GPU and the CPU and how over time both are improving. At this time discrete graphics processors seem to be ideal for video games or watching movies, but how do they figure into editing and creating movies?

    Having the most RAM possible is probably a good thing for all computing tasks. I just want to know what hardware item to concentrate on finding when I shop for my next computer. Experts, please advise me. Screen size isn't important on a laptop because I use an external monitor on my desk.

    Well, in my opinion, if you are using your Mac for serious stuff, you have to consider it in totality. First, the hard disk, which is often overlooked: an SSD is absolutely essential, with Thunderbolt to an external RAID 0 (or 0+1 or something like that; somebody else probably knows better). Next, RAM: indeed, don't skimp on it. Start off with 8GB; use any "Apple-compatible" RAM, just don't get regular PC or "value" RAM.

    Next, the CPU... In this case you want the fastest you can afford, because encoding, decoding, compression and so on are very CPU-intensive.

    As for the GPU, in this case, yes, a discrete GPU is your only option.

    Since you'll be getting a Mac, and you don't need a big screen, the choice is pretty simple. Get the ~fastest possible~, i.e. the best CPU and best GPU you can put into a custom MacBook Pro 15" with 8GB of RAM and a 300GB+ SSD. You save some of the money that would go towards the 17" laptop by instead spec'ing out the 15" MBP to the max.

    My 2 cents :)

  • Reply 19 of 63
    timbit Posts: 331, member
    The GPU is important for rendering complex video, but you also need a good CPU to complement it. If you only get a good GPU and a crappy CPU, you will bottleneck, because your computer won't be able to keep up. You should spend roughly the same amount on the CPU and GPU, so look them up and make sure your GPU is not far more powerful than your CPU.
    You were correct in saying that you need lots and lots of speedy RAM. This is very important for video editing.
  • Reply 20 of 63
    nvidia2008 Posts: 9,262, member

    Quote:
    Originally Posted by hmm View Post

    I disagree. OpenCL has seen some nice adoption, and it's extremely useful in many more things today. More people benefit from a strong GPU and OpenCL support today than would have a few years ago.

    Interesting. Name me five OpenCL OS X 10.7 apps on the Mac App Store that qualify as a decent, mainstream Mac experience. Core Image doesn't count; the app has to be mostly OpenCL, that is, it calls the OpenCL API directly rather than reaching OpenCL indirectly through Core Image.
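
    To make the "calls OpenCL directly" distinction concrete, here is a minimal sketch of host code talking straight to the OpenCL C API: it asks the runtime for a GPU device and prints its name, with no Core Image in between. This is a hypothetical standalone probe, not code from any app named in this thread; the file name opencl_probe.c is made up for illustration.

```c
/* opencl_probe.c -- minimal sketch of direct OpenCL host-API usage.
 * Hypothetical example; on OS X of this era, build with:
 *   cc opencl_probe.c -framework OpenCL -o opencl_probe
 */
#include <stdio.h>
#ifdef __APPLE__
#include <OpenCL/opencl.h>
#else
#include <CL/cl.h>
#endif

int main(void) {
    cl_platform_id platform;
    cl_device_id device;
    char name[256];

    /* Grab the first platform; one is enough for this probe. */
    if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
        fprintf(stderr, "no OpenCL platform found\n");
        return 1;
    }

    /* Ask specifically for a GPU device (integrated or discrete). */
    if (clGetDeviceIDs(platform, CL_DEVICE_TYPE_GPU, 1, &device, NULL) != CL_SUCCESS) {
        fprintf(stderr, "no GPU exposed to OpenCL\n");
        return 1;
    }

    clGetDeviceInfo(device, CL_DEVICE_NAME, sizeof(name), name, NULL);
    printf("OpenCL GPU: %s\n", name);
    return 0;
}
```

    An app that is "mostly OpenCL" in the sense above would go on from here to create a context and command queue and run its own kernels, rather than leaving the GPU work to Core Image.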

    "Analog" on the Mac App Store is really cool, but it's using my GPU for Core Image calls to do the filtering, and it's laggy at times; if it used pure CPU power on a higher-end Sandy Bridge MBP, it might be faster.

    And with Ivy Bridge just around the corner, the case for discrete GPUs in ~mainstream laptop~ computing is diminishing fast, given the sheer power of Ivy Bridge's CPU components.

    As far back as 2005, desktop sales jumped off a cliff, and that was the point at which they took discrete GPUs with them.
