Rumor: Apple drops Nvidia's Kepler GPUs from 'large number' of next-gen MacBook Pros


Comments

  • Reply 61 of 78
    physics · Posts: 24 · member
    I've been using a 17" MBP with the NVIDIA 9600M/9400M GPUs since early 2009 and have been considering getting another 17" MBP when the Ivy Bridge models come out later this year. I like the proposed changes, such as the elimination of the optical drive and the move to SSDs, but I will not upgrade if Apple doesn't put at least one capable OpenGL/OpenCL-compliant card on board. I would prefer an NVIDIA card, but a good AMD card would be acceptable. If Apple can't deliver a good high-performance machine, I may have to go with a Clevo-chassis portable running desktop components and FreeBSD.
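    Since "OpenCL-compliant" is my sticking point, here's roughly how I sanity-check what a machine actually exposes before buying. This is a minimal sketch of mine against the standard OpenCL 1.x C API (nothing Apple-specific beyond the framework header, and nothing from this rumor):

    ```c
    /* Minimal OpenCL device query -- a sketch, assuming an OpenCL 1.x runtime.
     * Build on OS X with: clang query.c -framework OpenCL -o query
     * (elsewhere, include <CL/cl.h> and link with -lOpenCL instead). */
    #include <stdio.h>
    #include <OpenCL/opencl.h>

    int main(void) {
        cl_platform_id platform;
        cl_device_id devices[8];
        cl_uint ndev = 0;

        if (clGetPlatformIDs(1, &platform, NULL) != CL_SUCCESS) {
            fprintf(stderr, "no OpenCL platform found\n");
            return 1;
        }
        clGetDeviceIDs(platform, CL_DEVICE_TYPE_ALL, 8, devices, &ndev);

        for (cl_uint i = 0; i < ndev; i++) {
            char name[256];
            cl_uint units = 0;
            clGetDeviceInfo(devices[i], CL_DEVICE_NAME, sizeof name, name, NULL);
            clGetDeviceInfo(devices[i], CL_DEVICE_MAX_COMPUTE_UNITS,
                            sizeof units, &units, NULL);
            printf("device %u: %s (%u compute units)\n", i, name, units);
        }
        return 0;
    }
    ```

    On an integrated-only machine you see a single modest GPU entry; a discrete card shows up as a second, beefier device. That difference is exactly what I'd be paying for.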
  • Reply 62 of 78
    elijahg · Posts: 2,901 · member
    Apple managed to fit a GeForce 5200 into the 12" PowerBook and still get good battery life, 6-7 years ago. You'd think they'd be able to take some miniaturisation cues from the iPhone and fit a dedicated GPU into a 13" chassis too.
  • Reply 63 of 78
    ssquirrel · Posts: 1,196 · member
    Quote:
    Originally Posted by Zoolook View Post

    I'll need more than that awful unreadable site's benchmarks to convince me otherwise.

    Here are some benchmarks on a website that wasn't designed by a toddler:

    http://www.anandtech.com/show/5626/i...re-i7-3770k/11

    Note the competing devices are bargain-barrel $40 GPUs, which it barely keeps up with. If you're happy running games at ultra-low resolution with low-quality textures, you might see 30-50 fps (note that minimum framerates matter more than averages).


    Yeah, their page doesn't look the best, but benchmarks are benchmarks, whether you can be bothered with them or not. Also, the cards the 4000 is equivalent to are $60-$80 cards, not $40. A minor difference, yes, but we are also talking about laptops, and battery life matters much more to Apple than to most Windows-based makers. You are bringing up Crysis and Metro benchmarks, neither of which is on OS X, I don't think.

    Does it play WoW fine? Yep. Will it play Diablo 3? You betcha. I could give a shit otherwise. I still think this rumor is bogus: not only will we be seeing AMD in the new MBP line, we will see a discrete card in the 13" if they get rid of the optical drive, which would make this whole discussion irrelevant.

    Quote:
    Originally Posted by simtub View Post

    No chance of Apple ever using PowerVR graphics chips in the MacBooks? If a quad-core graphics chip can power a Retina display iPad, I'm pretty sure there is a more advanced model available for laptop-class graphics.


    You would be wrong too.
  • Reply 64 of 78
    ssquirrel · Posts: 1,196 · member
    Quote:
    Originally Posted by mdriftmeyer View Post

    Very few Ivy Bridge mobile parts support the HD 4000 graphics setup. The rest use the HD 2500.

    http://en.wikipedia.org/wiki/Ivy_Bri...ile_processors


    This Wikipedia link doesn't show which models will sport the 4000 and which won't. Also, the AnandTech article from 6 days ago only lists desktop processors: http://www.anandtech.com/show/5626/i...ore-i7-3770k/3

    From http://en.wikipedia.org/wiki/Sandy_bridge :

    "All mobile processors, except Celeron and Pentium, use Intel's Graphics sub-system HD 3000 (12 EUs)."
    Some googling, using cpu-world.com for info, turns up:

    3920XM: HD 4000
    3820QM: HD 4000
    3720QM: HD 4000
    3615QM: HD 4000
    3612QM: HD 4000
    3610QM: HD 4000
    3667U: HD 4000
    3517U: HD 4000
    3520M: HD 4000
    3427U: HD 4000
    3360M: HD 4000
    3320M: HD 4000
    3217U: HD 4000
    3317U: HD 4000

    Oh look, every freaking mobile model uses the best possible GPU. Maybe because many desktop purchasers will be using discrete cards anyway? Before you make claims like the one I quoted about the 4000 and 2500, try having some idea what you are talking about. The 2500 is a desktop-only option.
  • Reply 65 of 78
    wizard69 · Posts: 13,377 · member
    Quote:
    Originally Posted by chabig View Post

    I have a 2011 MBP with Intel graphics driving the 27" Cinema Display, and it runs Mac OS and two Windows instances in Parallels just fine. I can only imagine a dedicated GPU is needed for play. For work, Intel's integrated stuff is fine, and Ivy Bridge is supposed to be getting better.


    In fact, you are so far off base that it is likely a waste to say more.
  • Reply 66 of 78
    wizard69 · Posts: 13,377 · member
    I really don't understand why threads about GPUs always degenerate into discussions about gaming. Really, guys, gaming has nothing to do with it. First, this is a Mac, and one would have to be nuts to buy one for hardcore gaming. Second, GPU use was long ago woven into apps and the OS, and this acceleration will only become more deeply embedded in future software.

    In any event, one thing that needs to be understood is that I believe we are close to the point where on-chip GPUs will be the low-cost solution for midrange performance. However, that isn't Intel at this stage. So even though it would be foolish to release any sort of Pro-marketed computer today with only integrated graphics, that won't be the case in the future.

    Were Apple to go to integrated-GPU-only platforms, they would have to start looking closely at AMD where GPU performance is important. AMD simply has a sounder product roadmap for the future. Even today AMD hardware supports standards that Intel doesn't. In very real terms, Intel GPUs are nothing but junk for professional use. To say otherwise is to disregard the needs of professionals, which, by the way, can vary widely.
  • Reply 67 of 78
    jlandd · Posts: 873 · member
    Now that more info has come out, it's pretty clear this won't be a major shift. The only change is that instead of just the lowest models having integrated graphics, it's looking like half the midline will as well. And the upper-tier MBPs aren't losing GPUs, so there's not much to argue about.
  • Reply 68 of 78
    hmm · Posts: 3,405 · member
    Quote:
    Originally Posted by jlandd View Post


    Now that more info has come out, it's pretty clear this won't be a major shift. The only change is that instead of just the lowest models having integrated graphics, it's looking like half the midline will as well. And the upper-tier MBPs aren't losing GPUs, so there's not much to argue about.



    What is with some of you? This is not more info. This is a rumor, and not even a very credible one. There's a good chance it's just one of the blogs fishing for page hits.
  • Reply 69 of 78
    sunilraman · Posts: 8,133 · member
    Goodbye Nvidia, how I knew thee... Farewell, my friend.
  • Reply 70 of 78
    sunilraman · Posts: 8,133 · member
    Quote:
    Originally Posted by hmm View Post

    What is with some of you? This is not more info. This is a rumor, and not even a very credible one. There's a good chance it's just one of the blogs fishing for page hits.


    Nah, mate, Nvidia and ATI have struggled with their GPUs for years now... Regardless of how accurate this story is, the truth is that in the mobile and tablet space PowerVR and ARM have eaten their lunch (and dinner).

    In the laptop space, nothing comes close to Intel's Sandy and Ivy Bridge power and efficiency for mainstream computing.

    All that's left for Nvidia and ATI is higher-end PCs, and even then they've struggled for far, far too long... driver issues, endless updates, chip failures, fabrication issues all the way back to 90nm, let alone <20nm, and so on.

    While not doomed, they are relegated now to a niche, when just 3 years ago they were poised to be truly revolutionary with tech like GPGPU. Suffice it to say, we now rely on phone graphics to power most of our games, phone graphics that will soon dominate console graphics, and "CPU" aka Intel graphics that will in a few years obliterate discrete graphics from mainstream computing.

    My Nvidia 320M is a very capable graphics device, but the fact that I can't play an age-old game like Psychonauts (literally, it's impossibly laggy), and that there are almost NO GPGPU applications aside from Apple's and a few others... their time has come. The blame game of developers, Apple, Windows, drivers, OS X, OpenGL, etc. is no longer acceptable. When I can get a superb once-in-a-lifetime experience with Mass Effect 3 on obsolete hardware (Xbox 360)... Nvidia and ATI, today I say goodbye. (Yes, I know Nvidia/ATI are used in consoles, but that's almost the last time they did anything fantastic.)
  • Reply 71 of 78
    realistic · Posts: 1,154 · member
    Quote:
    Originally Posted by I am a Zither Zather Zuzz View Post


    It never was "Pro".



    Pro is an example of Iconic Branding. It is meaningless marketing-speak for those who think product acquisition determines what sort of a person they are.



    They are very nice laptop computers. "Pro" ain't got no meaning other than branding.



    Are you in the 5th grade?
  • Reply 72 of 78
    ssquirrel · Posts: 1,196 · member
    Quote:
    Originally Posted by Realistic View Post


    Are you in the 5th grade?



    Frankly, that's an insult to 5th graders everywhere.
  • Reply 73 of 78
    hmm · Posts: 3,405 · member
    Quote:
    Originally Posted by sunilraman View Post




    All that's left for Nvidia and ATI is higher-end PCs, and even then they've struggled for far, far too long... driver issues, endless updates, chip failures, fabrication issues all the way back to 90nm, let alone <20nm, and so on.



    While not doomed, they are relegated now to a niche, when just 3 years ago they were poised to be truly revolutionary with tech like GPGPU. Suffice it to say, we now rely on phone graphics to power most of our games, phone graphics that will soon dominate console graphics, and "CPU" aka Intel graphics that will in a few years obliterate discrete graphics from mainstream computing.




    Intel seemed to be having trouble with 22nm fabrication too, and GPGPU supposedly hit a lot of limitations in areas where it was expected to be big. My point wasn't that Macs couldn't trend toward integrated graphics. My point was that they're just riding speculation for page hits. As soon as tech-blog articles started to surface about Kepler, we saw a rumor that Kepler chips would be used in Macs. Now the rumor is that the Kepler rumor was true, but Apple moved on. I tend to doubt rumors that claim an actual play-by-play and that pander to whatever seems likely to bring page hits at a given moment. Anyway, I'm not disputing that discrete graphics may be a thing of the past in most machines within a few years, given that integrated graphics have made their way into light desktops over the past few years as a cost-cutting measure.
  • Reply 74 of 78
    wizard69 · Posts: 13,377 · member
    Quote:
    Originally Posted by sunilraman View Post


    Nah, mate, Nvidia and ATI have struggled with their GPUs for years now... Regardless of how accurate this story is, the truth is that in the mobile and tablet space PowerVR and ARM have eaten their lunch (and dinner).



    They struggle no more than any other chip maker. PowerVR has been doing very well, but that has more to do with NVIDIA and AMD not focusing on those markets. In point of fact, heads rolled at AMD recently because they missed the portable market completely. That, however, had nothing to do with their technology.

    Quote:

    In the laptop space nothing comes close to Intel's Sandy and Ivy Bridge power and efficiency for mainstream computing.



    Again, this is simply wrong; most people would be better off with an AMD Llano due to its far better GPU.

    Quote:

    All that's left for Nvidia and ATI is higher-end PCs, and even then they've struggled for far, far too long... driver issues, endless updates, chip failures, fabrication issues all the way back to 90nm, let alone <20nm, and so on.



    You really can't be serious, when companies like Intel have had the very same issues. As for ARM, all you are hearing about is the success stories; they have had some pretty dramatic failures in the market as well.

    Quote:

    While not doomed, they are relegated now to a niche, when just 3 years ago they were poised to be truly revolutionary with tech like GPGPU. Suffice it to say, we now rely on phone graphics to power most of our games, phone graphics that will soon dominate console graphics, and "CPU" aka Intel graphics that will in a few years obliterate discrete graphics from mainstream computing.



    Why do you think AMD bought ATI, and further, why did ATI not resist the takeover? Because it was obvious across the industry that SoC technology was, and is, where things are going. The move has given AMD a significant advantage in certain markets.

    Quote:

    My Nvidia 320M is a very capable graphics device, but the fact that I can't play an age-old game like Psychonauts (literally, it's impossibly laggy), and that there are almost NO GPGPU applications aside from Apple's and a few others...



    There are also plenty of games not playable on iOS devices. You made a bad buying decision in going with the 320M; crying in the forums about it does not support your position.



    As to GPGPU acceleration, it is used by almost anybody who can benefit from it. Seriously, I'm not sure why people can't grasp this: GPU acceleration is only of value if there is a fit with the application (see the sketch at the end of this post for what a good fit looks like). If you look at something like Safari, there is partial use of GPU acceleration there.

    Quote:

    Their time has come. The blame game of developers, Apple, Windows, drivers, OS X, OpenGL, etc. is no longer acceptable. When I can get a superb once-in-a-lifetime experience with Mass Effect 3 on obsolete hardware (Xbox 360)... Nvidia and ATI, today I say goodbye. (Yes, I know Nvidia/ATI are used in consoles, but that's almost the last time they did anything fantastic.)



    So what you are saying is that you can't get good results on PC hardware for gaming? Many would argue with you there. I, on the other hand, would say: who cares? I'm running a Mac, which has no real share of the gaming market. That, however, does not mean I don't need a decent GPU in any machine I buy. Like it or not, a decent GPU has a huge impact on system performance and feel.



    One other thing related to GPGPU computing: OpenCL has been extremely successful for an Apple initiative. Adoption has been fairly broad, so please don't squeak about a lack of OpenCL-based programs.
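    To make the "fit" point concrete, here is a minimal sketch of the kind of embarrassingly parallel job GPUs eat for breakfast: scaling a million floats. This is my own illustration against the standard OpenCL 1.x C API, not code from Apple or any shipping app, and error checking is omitted for brevity.

    ```c
    /* Sketch: scale a large array on the GPU via OpenCL 1.x (error checks omitted).
     * Build on OS X with: clang scale.c -framework OpenCL -o scale */
    #include <stdio.h>
    #include <OpenCL/opencl.h>

    /* The kernel runs once per element -- a perfect "fit" for a GPU. */
    static const char *src =
        "__kernel void scale(__global float *v, const float k) {"
        "    size_t i = get_global_id(0);"
        "    v[i] = v[i] * k;"
        "}";

    int main(void) {
        enum { N = 1 << 20 };
        static float data[N];
        for (int i = 0; i < N; i++) data[i] = (float)i;

        cl_platform_id plat; clGetPlatformIDs(1, &plat, NULL);
        cl_device_id dev;    clGetDeviceIDs(plat, CL_DEVICE_TYPE_GPU, 1, &dev, NULL);
        cl_context ctx = clCreateContext(NULL, 1, &dev, NULL, NULL, NULL);
        cl_command_queue q = clCreateCommandQueue(ctx, dev, 0, NULL);

        /* Copy data to the device, build the kernel, run it over N work-items. */
        cl_mem buf = clCreateBuffer(ctx, CL_MEM_READ_WRITE | CL_MEM_COPY_HOST_PTR,
                                    sizeof data, data, NULL);
        cl_program prog = clCreateProgramWithSource(ctx, 1, &src, NULL, NULL);
        clBuildProgram(prog, 1, &dev, NULL, NULL, NULL);
        cl_kernel k = clCreateKernel(prog, "scale", NULL);

        float factor = 2.0f;
        size_t global = N;
        clSetKernelArg(k, 0, sizeof buf, &buf);
        clSetKernelArg(k, 1, sizeof factor, &factor);
        clEnqueueNDRangeKernel(q, k, 1, NULL, &global, NULL, 0, NULL, NULL);
        clEnqueueReadBuffer(q, buf, CL_TRUE, 0, sizeof data, data, 0, NULL, NULL);

        printf("data[3] = %.1f (expect 6.0)\n", data[3]);
        return 0;
    }
    ```

    A branchy, pointer-chasing workload gains nothing from this model, which is exactly why GPU acceleration shows up where the data is parallel and not everywhere else.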
  • Reply 75 of 78
    Charlie Demerjian is the author of the rumour. For me this source is even less reliable in its predictions than DigiTimes speaking on the iPad HD, especially when the rumour has something to do with NVIDIA. Does his name ring a bell? Do you remember the statement "Apple switching MacBook Airs to ARM processors: it's a done deal"? Yep, same guy.

    Charlie was (is?) ATI's attack pet from the time before AMD's ownership of the Canadian company. Usually his rumours came in two parts:

    1. NVIDIA has some contract secured with company A.

    2. NVIDIA loses the contract with company A due to quality / yield / driver issues.

    Part two followed part one without an independent source confirming either statement. I don't say that the rumour here is entirely Charlie's imagination; it's just that when Charlie says "it's a done deal", it actually means "some tests were conducted".

    Of course, maybe I'm being biased here, but AppleInsider, could you please mention the name of the source in the article in cases like this? It would save me the reading.
  • Reply 76 of 78
    jragosta · Posts: 10,473 · member
    Quote:
    Originally Posted by SSquirrel View Post


    X1600

    http://www.notebookcheck.net/ATI-Mob...00.2163.0.html



    HD3000 (Sandy Bridge iGPU)

    http://www.notebookcheck.net/Intel-H...0.37948.0.html



    HD4000 (Ivy Bridge iGPU)

    http://www.notebookcheck.net/Intel-H...0.69168.0.html



    The HD 4000 is posting double the HD 3000's results in 3DMark Vantage, and its 3DMark 06 scores are roughly double to quadruple what you're getting now. The X1600 is ancient, and even integrated graphics smoke it now. You should see ridiculous improvements, well beyond the 10-20% you are talking about.

    http://store.apple.com/us/browse/hom...ly/macbook_pro

    You mean like this? *points to 13" MBP* Apple said they couldn't fit a discrete card in there, keep things cool, and maintain battery life. If they are removing the optical drive in the next gen, they should be able to do all that.



    It's going to depend on the application and on how old a computer you're replacing. I have a 5-year-old MBP, and performance isn't really much of an issue for my work. I could use modern integrated graphics and not miss the dedicated GPU. Besides, I don't think there's any way Apple is going to offer only integrated graphics across the board. They will certainly continue to offer discrete graphics.



    The other important factor to consider is that there are tradeoffs. For people who don't need the performance of discrete graphics, getting rid of the GPU means more power left for other things. You also save money and have room for a bigger battery. You could spend that extra money and battery power on a faster CPU or more cores. So for some purposes you could end up with a FASTER system by dropping the discrete GPU (but, again, I don't see that happening across the board).



    Quote:
    Originally Posted by mgsarch View Post


    His point was completely valid. Why would they have to release them? We know what the old generation does, and we can speculate on what using Ivy Bridge's HD 4000 will do from existing benchmarks on the web. No matter how much magic you think Apple has, they can't make the HD 4000 outperform a dedicated solution. Fact.



    His point was that there's no point in getting excited about a silly rumor from an unreliable source. If you want to make a hypothetical argument, that's one thing, but acting like this rumor has any validity is a waste of time.
  • Reply 77 of 78
    Quote:
    Originally Posted by AppleGreen View Post


    Higher-end 15- and 17-inch MacBook Pros launched early last year relied solely on AMD graphics...



    None of the 13-inch models have ever had discrete graphics.



    And as long as Apple keeps that trend going, I won't ever buy another MacBook. Currently they have the excuse of not having enough space in the 13" due to the gigantic and useless optical drive. Should they remove it and CONTINUE not to offer a dedicated GPU option for the 13" model, I'll have to switch back and keep my iPhone as my only Apple product.
  • Reply 78 of 78
    Of course, should Apple ever release an external GPU over Thunderbolt, as Sony recently did with some VAIO models, all would be forgiven.