Photos claim to show Nvidia GT 650M GPU in Apple's next MacBook Pro

Posted in Future Apple Hardware, edited January 2014
New photos leaked from China purport to show the logic board of Apple's new 15-inch MacBook Pro, with an unchanged design and a new Nvidia Kepler GeForce GT 650M.

The pair of pictures (1, 2) were first publicized by M.I.C. Gadget, along with purported details about the new MacBook Pro. According to the person who originally posted the pictures, the new 15-inch MacBook Pro will be offered with Intel's latest Ivy Bridge processors in speeds of 2.3, 2.5 or 2.7 gigahertz.

The pictures also show the new graphics processor, Nvidia's GeForce GT 650M, with 1 gigabyte of GDDR5 memory.

Also noteworthy is the fact that the purported new MacBook Pro logic board has the same shape, position and screw hole placement as the current generation design. That suggests this new MacBook Pro, if legitimate, will not feature a thinner appearance or redesigned exterior.

AppleInsider was first to report last week that Apple is expected to unveil a "new MacBook" series this week, positioned between the MacBook Air and MacBook Pro. The "new MacBooks" are expected to have a thinner design lacking an optical disc drive, along with high-resolution Retina displays.

GPU


This week, Apple is rumored to unveil a new 15-inch MacBook alongside new 13- and 15-inch MacBook Pros, and new 11- and 13-inch MacBook Airs. A thinner 13-inch "new MacBook" is expected by analyst Ming-Chi Kuo of KGI to launch in August of this year.

Breakdown


The new, thinner MacBooks will reportedly feature a slimmer form factor of 18 millimeters for the 13-inch model and 19 millimeters for the 15-inch, with display resolutions of 2,560-by-1,600 pixels for the smaller model, and 2,880-by-1,800 pixels for the larger. Edges outside the display are supposed to be 50 percent narrower than the MacBook Pro, and the new MacBooks are also rumored to feature a larger battery capacity that will be about 15 to 20 percent greater than the MacBook Pro.

Comments

  • Reply 1 of 20
    iqatedo Posts: 1,823 member

    Quote:

    Originally Posted by AppleInsider View Post



    New photos ...


    The new, thinner MacBooks will reportedly feature a slimmer form factor of 18 millimeters for the 13-inch model and 19 millimeters for the 15-inch, with display resolutions of 2,560-by-1,600 pixels for the smaller model, and 2,880-by-1,800 pixels for the larger. Edges outside the display are supposed to be 50 percent narrower than the MacBook Pro, and the new MacBooks are also rumored to feature a larger battery capacity that will be about 15 to 20 percent greater than the MacBook Pro.


    "It's full of stars!"

  • Reply 2 of 20
    zoolook Posts: 657 member


    I wonder if the new Retina displays will come in both matte and glossy. I love my matte screen; I'm not sure if I'm ready for an update, but it's looking tempting with the 650M and its 384 shader cores (makes the GT 330M look pedestrian).

  • Reply 3 of 20
    ascii Posts: 5,936 member


    Oh no, I hope this doesn't mean the iMac will have Nvidia graphics too. I've been waiting so long to upgrade to this iMac, but I f'in hate Nvidia.

  • Reply 4 of 20
    shen Posts: 434 member
    Ascii, that emotion is understandable after they murdered kids, raped your wife, shot your dog, and stole your bible the way they did....
  • Reply 5 of 20
    zoolook Posts: 657 member

    Quote:

    Originally Posted by ascii View Post


    Oh no, I hope this doesn't mean the iMac will have Nvidia graphics too. I've been waiting so long to upgrade to this iMac, but I f'in hate Nvidia.



     


    Apple has been alternating between AMD and nVidia for years on GPUs, so it should be no surprise. The 6xx series is very good though, definitely stronger than the 7xxxm series AMD is offering now, so it's the right move for Apple.


     


    When I look at game reviews and feedback on the Mac App store, people with Radeon/AMD graphics seem to complain about performance and stability more - at least I have noticed that anecdotally.

  • Reply 6 of 20


    Ascii: Because the label stamped on the GPU is a problem, why? It's a graphics card, it makes pretty pixels appear on your screen. Why care what brand it is if it can process pretty pixels at neck-shattering speeds?


     


    That aside: a GT 650M and Ivy Bridge?



  • Reply 7 of 20


    23W vs 45W


    Now we know it will get a much bigger battery.

  • Reply 8 of 20
    ascii Posts: 5,936 member

    Quote:

    Originally Posted by benanderson89 View Post


    Ascii: Because the label stamped on the GPU is a problem, why? It's a graphics card, it makes pretty pixels appear on your screen. Why care what brand it is if it can process pretty pixels at neck-shattering speeds?



    The output of the two brands looks different to me. On a static screen I can't tell, but as soon as something animates or scrolls, I can tell the two apart without fail.

  • Reply 9 of 20
    ascii Posts: 5,936 member

    Quote:

    Originally Posted by Zoolook View Post


     


    Apple has been alternating between AMD and nVidia for years on GPUs, so it should be no surprise. The 6xx series is very good though, definitely stronger than the 7xxxm series AMD is offering now, so it's the right move for Apple.



    Yep, they do alternate, and I have bought both before, but I've never really been happy with the Nvidia purchases, so now I always wait for ATI to have its go-around.

  • Reply 10 of 20
    buckalec Posts: 203 member


    Apple's online store is offline while getting updated

  • Reply 11 of 20
    hkz Posts: 190 member
    Ascii: Because the label stamped on the GPU is a problem, why? It's a graphics card, it makes pretty pixels appear on your screen. Why care what brand it is if it can process pretty pixels at neck-shattering speeds?

    Obviously you know nothing about graphics cards, drivers, or the difference between AMD and nVidia. It makes a HUGE difference what brand it is. Not all pixels rendered are equally good or reliable and brand difference is a massive part of that.
  • Reply 12 of 20


    So, is this good or bad? I lost track of GPU model numbers a long time ago... I remember them being in the thousands around 2007, but somehow there seems to have been a 'reboot'...

  • Reply 13 of 20
    hudson1 Posts: 800 member

    Quote:

    Originally Posted by HKZ View Post





    Obviously you know nothing about graphics cards, drivers, or the difference between AMD and nVidia. It makes a HUGE difference what brand it is. Not all pixels rendered are equally good or reliable and brand difference is a massive part of that.




    If the difference is "huge" as you state, then it implies that if you put two iMacs next to each other -- one with ATi and the other with Nvidia -- everyone would instantly notice the difference.  There just doesn't seem to be much anecdotal evidence to support this.

  • Reply 14 of 20
    applepi Posts: 365 member

    Quote:

    Originally Posted by ascii View Post


    Yep, they do alternate, and I have bought both before, but I've never really been happy with the Nvidia purchases, so now I always wait for ATI to have its go-around.





    Guess you'll be able to get a deal on the older model pretty soon.


    Personally I like Nvidia. I'm still on a late 2008 aluminum unibody MacBook with a 9400M in it. So this would be the year for me to upgrade.


    I just hope the new models are supported by CS6 for faster rendering. Probably not CUDA, but at least via OpenCL.

  • Reply 15 of 20
    zoolook Posts: 657 member

    Quote:

    Originally Posted by HKZ View Post





    Obviously you know nothing about graphics cards, drivers, or the difference between AMD and nVidia. It makes a HUGE difference what brand it is. Not all pixels rendered are equally good or reliable and brand difference is a massive part of that.




    I do know a lot about graphics cards, and what you just said is unlikely. Performance may vary, so might image quality in 3D rendering, anti-aliasing (especially different applications like CSAA, MSAA, NT, WT or Edge Detect), LOD Bias, sampling, tessellation etc, but the rendering of pixels on a 2D display would be virtually identical, especially as the rasterizer is actually in the Intel hardware, not the discrete chip.


     


    Most of the image quality and performance is determined by the quality of drivers, and nVidia have had better OpenGL drivers than ATI for the better part of a decade.

  • Reply 16 of 20
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by benanderson89 View Post


    Ascii: Because the label stamped on the GPU is a problem, why? It's a graphics card, it makes pretty pixels appear on your screen. Why care what brand it is if it can process pretty pixels at neck-shattering speeds?


     


    That aside: a GT 650M and Ivy Bridge?


     



    Ivy Bridge isn't much of an update. USB 3 and an updated GPU would be motivating factors for me, but the processor change means very little.


    Quote:

    Originally Posted by ApplePi View Post




    Guess you'll be able to get a deal on the older model pretty soon.


    Personally I like Nvidia. I'm still on a late 2008 aluminum unibody MacBook with a 9400M in it. So this would be the year for me to upgrade.


    I just hope the new models are supported by CS6 for faster rendering. Probably not CUDA, but at least via OpenCL.



    NVidia and AMD can both support OpenCL on recent cards. A couple of NVidia cards got the shaft, like the Quadro 4000, which never received OpenCL drivers, but overall both brands do OpenCL development, and NVidia doesn't seem to be dumping many resources into CUDA at this point.
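
    Both vendors expose their GPUs through the same OpenCL API, which is why an app like CS6 doesn't really have to care whose chip is in the machine. Just as an illustration (a rough sketch, not anything Adobe actually ships), enumerating GPU devices with the standard OpenCL C calls looks the same either way:

    /* List every OpenCL-capable GPU on the system, vendor-agnostic. */
    #include <stdio.h>
    #ifdef __APPLE__
    #include <OpenCL/opencl.h>   /* OS X ships the OpenCL header here */
    #else
    #include <CL/cl.h>
    #endif

    int main(void) {
        cl_platform_id platforms[8];
        cl_uint num_platforms = 0;
        clGetPlatformIDs(8, platforms, &num_platforms);

        for (cl_uint p = 0; p < num_platforms; p++) {
            cl_device_id devices[8];
            cl_uint num_devices = 0;
            /* Ask each platform only for GPU-class devices. */
            if (clGetDeviceIDs(platforms[p], CL_DEVICE_TYPE_GPU,
                               8, devices, &num_devices) != CL_SUCCESS)
                continue;
            for (cl_uint d = 0; d < num_devices; d++) {
                char name[256];
                clGetDeviceInfo(devices[d], CL_DEVICE_NAME,
                                sizeof(name), name, NULL);
                /* Prints e.g. "GeForce GT 650M" or a Radeon name. */
                printf("GPU %u.%u: %s\n", p, d, name);
            }
        }
        return 0;
    }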


    Quote:

    Originally Posted by Zoolook View Post


     


    Apple has been alternating between AMD and nVidia for years on GPUs, so it should be no surprise. The 6xx series is very good though, definitely stronger than the 7xxxm series AMD is offering now, so it's the right move for Apple.


     


    When I look at game reviews and feedback on the Mac App store, people with Radeon/AMD graphics seem to complain about performance and stability more - at least I have noticed that anecdotally.



    It is? I've looked at benchmarks. They don't look conclusive.


    Quote:

    Originally Posted by ascii View Post


    Yep, they do alternate, and I have bought both before, but I've never really been happy with the Nvidia purchases, so now I always wait for ATI to have its go-around.



    Well, it's just Ivy Bridge. The CPU improvement is unimpressive. If you don't like the GPU, you may as well wait this one out.

  • Reply 17 of 20
    zoolook Posts: 657 member

    Quote:

    Originally Posted by hmm View Post




    It is? I've looked at benchmarks. They don't look conclusive.


     



    You're right, they're not, at least if you compare 680 to 7970 in 3D benchies using DirectX.

  • Reply 18 of 20
    ascii wrote: »
    The output of the two brands looks different to me. On a static screen I can't tell, but as soon as something animates or scrolls, I can tell the two apart without fail.
    hkz wrote: »
    Obviously you know nothing about graphics cards, drivers, or the difference between AMD and nVidia. It makes a HUGE difference what brand it is. Not all pixels rendered are equally good or reliable and brand difference is a massive part of that.

    Apple writes the drivers for the Mac, so that point is void. The difference between the two in terms of display quality has been rendered pretty much moot since digital video interfaces like DVI and Mini DisplayPort were introduced, especially at the high end. Plus, the vast majority of the display quality is determined by the actual display.

    You two sound like those nerds on overclocking forums complaining about the difference between AMD and Intel out of sheer brand loyalty.
  • Reply 19 of 20
    hmm Posts: 3,405 member

    Quote:

    Originally Posted by Zoolook View Post


    You're right, they're not, at least if you compare 680 to 7970 in 3D benchies using DirectX.



    I get your point there, but it's not like we've seen both run under OS X.

  • Reply 20 of 20
    hkz Posts: 190 member



    Quote:



    Originally Posted by Zoolook View Post

    I do know a lot about graphics cards, and what you just said is unlikely.


     


    Obviously you don't know how to make an argument. I said:


     


    Quote:

    Originally Posted by HKZ View Post

    Obviously you know nothing about graphics cards, drivers, or the difference between AMD and nVidia. It makes a HUGE difference what brand it is. Not all pixels rendered are equally good or reliable and brand difference is a massive part of that.


     


    Then you said:




    Quote:



    Originally Posted by Zoolook View Post


    Most of the image quality and performance is determined by the quality of drivers, and nVidia have had better OpenGL drivers than ATI for the better part of a decade.



     


     


    So which is it? Am I right or am I wrong that not every pixel is rendered equally well between them? You tell me I'm wrong and then tell me I'm right? Did I not mention drivers? Can you read? I'll give you the correct answer: I'm right and so are you. You basically repeated what I said, but with more detail.


     


    Quote:

    Originally Posted by benanderson89 View Post

    Apple writes the drivers for the Mac, so that point is void. The difference between the two in terms of display quality has been rendered pretty much moot since digital video interfaces like DVI and Mini DisplayPort were introduced, especially at the high end. Plus, the vast majority of the display quality is determined by the actual display.

    You two sound like those nerds on overclocking forums complaining about the difference between AMD and Intel out of sheer brand loyalty.


     


    You sound like someone who's ignorant about video cards on the whole, and about their manufacturers' driver support and hardware reliability. There most definitely is a difference there, and Apple switched from nVidia to AMD for hardware reliability once already, then had to switch again because Intel squeezed nVidia out and they couldn't make graphics or chipsets Apple could legally use. No one really seriously games on a Mac anyway; the performance is so laughably awful there's no reason to punish yourself. Most people who game on a Mac play games that either don't use the discrete graphics, or utilize them very little. Anyone who does play games on a Mac uses Boot Camp, like I do, and the difference between the two can be as small as 2% or night and day. Could you be so kind as to point out where I was talking about one being better than the other? I wasn't arguing for or against either, I was just pointing out that you made a statement and it's obvious that you know absolutely nothing about graphics cards, their manufacturers, or their capabilities.


     


    DVI, MiniDP, HDMI and so on have absolutely nothing to do with how a video card renders the effects that you see with your eyes. CRT monitors had the same effects quality as modern LCDs, and better image quality (black levels and other things), and it made no difference there either. You truly are ignorant about 3D video rendering, and I don't mean that as an insult. 3D rendering depends solely on the game engine, the graphics hardware and the video drivers to render visuals; it has absolutely nothing to do with what screen it shows up on.


     


    Quote:

    Originally Posted by Hudson1 View Post

    If the difference is "huge" as you state, then it implies that if you put two iMacs next to each other -- one with ATi and the other with Nvidia -- everyone would instantly notice the difference.  There just doesn't seem to be much anecdotal evidence to support this.


     


    In the context of what he said and how it was said, there is a huge difference between nVidia and AMD. What you're suggesting doesn't use the discrete graphics anyway; it would use the Intel chip. If you were to play a game, say Diablo 3, on both cards you would most definitely see a difference right away. But if you knew what you were talking about, or the context of the discussion, you would see it. I'm not going to hold your hand, or his, and tell you why that statement is wrong. The internet is full of threads asking why nVidia cards can do one thing and AMD cards can't, and vice versa. For rendering the desktop, no, there really isn't a difference, and that's simply because neither would be doing anything. When it comes to 3D gaming, and the context that he was talking about, there most certainly is a difference, and I've no interest in proving it because I don't have to. Look on Google for game launch graphics driver problems and you'll have more than enough evidence to see the difference between the two, and when you're done with that, saunter over to anandtech.com and read some of the reviews on graphics cards there. Just because neither you nor he knows anything about them, and because you don't use them, doesn't mean there isn't any difference.
