Apple dumping Intel chipsets for NVIDIA's in new MacBooks

124 Comments

  • Reply 61 of 99
    Why would Apple use a 9600M GT when they could use a 9650M GT, which is much faster and 55 nm?
  • Reply 62 of 99
    nvidia2008 Posts: 9,262 · member
    Quote:
    Originally Posted by Machead99 View Post


    Why would Apple use a 9600M GT when they could use a 9650M GT, which is much faster and 55 nm?



    Why does Apple do anything? I hope it is a 9650M GT, but who knows nowadays. Until the Creation receives His Blessing, who knows which products make the cut or what's inside them.



    Pls pls pls pls pls 9650M GT??? Come on Apple....!!!
  • Reply 63 of 99
    Quote:
    Originally Posted by nvidia2008 View Post


    Hmmm... What the rumours seem to suggest is a 9300 or 9400 discrete GPU for the MacBook...



    Everything I have read says the graphics on the MacBook are definitely going to be integrated... but based upon the 9300M discrete card, or at its performance level. Oddly enough, there is no "9400" -- nVidia actually uses "9400" to describe the performance specification created when you combine the 9100M integrated graphics with a 9300M discrete card.



    I know, confusing, right? nVidia and their crazy naming schemes... *sigh*...



    Quote:
    Originally Posted by wheelhot View Post


    True, but you don't need the best GPU for that.

    *snip* I'm not saying that the MB shouldn't have a proper GPU, or that the MBP couldn't have an improved GPU, but it's not a big deal not having the best GPU. Most people seem to make a big deal about it, and sadly the reasoning most of these people give is playing high-end games. *snip*

    I understand the need for a good GPU for certain software, but you usually don't need the best.



    With both the MacBook and the MacBook Pro, we are not talking about the "best". The MacBook currently uses the *worst* (mainstream) graphics solution on the market. Going to a new integrated chipset from nVidia is hardly overkill. Many media applications that use Core Image/OpenGL can't even RUN on a MacBook. Literally, it's not about bad performance -- it won't even LOAD.
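
    (A purely hypothetical sketch of why that happens: many of these apps probe the GL renderer and its extension list at launch and refuse to start if a required feature is missing. Which feature matters varies per app -- GL_ARB_fragment_program below is just an illustrative stand-in, not any specific app's real check:)

        /* Hypothetical launch-time GPU check (not any real app's code).
           Assumes a valid OpenGL context has already been created. */
        #include <stdio.h>
        #include <string.h>
        #include <OpenGL/gl.h>   /* Mac OS X OpenGL header */

        static int has_extension(const char *name) {
            const char *exts = (const char *)glGetString(GL_EXTENSIONS);
            return exts != NULL && strstr(exts, name) != NULL;
        }

        int gpu_is_supported(void) {
            const char *renderer = (const char *)glGetString(GL_RENDERER);
            if (renderer == NULL)
                return 0;                      /* no GL context at all */
            printf("Renderer: %s\n", renderer);
            /* Core Image-style effects lean on programmable shading; if a
               required extension is missing, bail instead of limping along. */
            if (!has_extension("GL_ARB_fragment_program")) {
                fprintf(stderr, "Required GPU features missing -- not loading.\n");
                return 0;
            }
            return 1;
        }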



    The MacBook Pro's 8600M GT isn't a total slouch, but it is certainly not a "pro" card. Apple should really offer a decent mobile Quadro card as an option.





    Quote:
    Originally Posted by Haggar View Post


    So what are the chances of the Mac Pro getting SLI capability? SLI would provide more bang for the buck than a single Quadro card. SLI can also support 2 Quadro cards. Surely, 2 Quadro cards running SLI would be more powerful than a single Quadro card.



    "bang for the buck" is not what Quadro cards are about. They are professional grade cards that are optimized for very high-reliability and high-accuracy rendering for CAD/CAM/3D modeling. The drivers are much more rigorously verified and offer many advanced OpenGL features not available in Geforce cards. On the other hand, Geforce cards are optimized for high frame-rates and performance, not accuracy and reliability.
  • Reply 64 of 99
    Quote:
    Originally Posted by solipsism View Post


    Questions and concerns I have about using NVIDIA over Intel chipsets:

    How much smaller is this combined Northbridge/Southbridge chipset than Intel's offerings? Combining chips into one is great, but if the result is only a couple percent smaller, it becomes a futile reason for the change.



    I have no idea.. I'm assuming they wouldn't have gone to the effort if it didn't offer significant power and/or size reduction.



    Quote:
    Originally Posted by solipsism View Post


    How much faster GPU performance can we expect over Intel's GMA X4500? What about the other chipset options like lower-power WiFi, WiMAX, H.264/VC1 encoding/decoding, and other nifty aspects of Montevina?



    If it runs at the level of the 9300M/9400M like nVidia has been claiming, it should be twice as fast or more in 3DMark06. I don't have info offhand, but I remember reading more than one article that criticized Intel for weak H.264/VC-1 decoding performance versus nVidia (and even more so against ATI's superb implementation). I'm pretty sure I was reading about the X4500... but it could have been the X3100. Time to do some more research...



    Quote:
    Originally Posted by solipsism View Post


    I've read plenty of pros regarding GPU performance with NVIDIA, but even the GMA950 was never an issue for me, so what potential cons are there in Apple moving to NVIDIA chipsets?



    Well, many would say that the MacBook isn't meant for certain applications, but I'd have to ask: have you used the GMA950 for any type of DCC/video/audio/etc. applications? I've read some of Apple's pro apps won't even boot on the X3100!



    Intel has certainly never been popular for their chipsets, but I don't know.. Maybe nVidia's one-chip solution will have bugs, or their drivers will be weak, or it will run hot.. I guess anything is possible, but remember, this is Apple...



    Especially given that they will have a totally new chipset situation when Nehalem rolls around (no FSB, and Intel won't give nVidia the IP rights for QuickPath), I don't think they would roll this transition out unless they were absolutely certain it provides an advantage...
  • Reply 65 of 99
    nvidia2008 Posts: 9,262 · member
    Quote:
    Originally Posted by winterspan View Post


    I have no idea.. I'm assuming they wouldn't have gone to the effort if it didn't offer significant power and/or size reduction.



    Perhaps it's the flexibility that nVidia is offering. The Intel-Apple relationship has seemed a little strained since the MacBook Air launch.



    Quote:
    Originally Posted by winterspan View Post


    If it runs at the level of the 9300M/9400M like nVidia has been claiming, it should be twice as fast or more in 3DMark06. I don't have info offhand, but I remember reading more than one article that criticized Intel for weak H.264/VC-1 decoding performance versus nVidia (and even more so against ATI's superb implementation). I'm pretty sure I was reading about the X4500... but it could have been the X3100. Time to do some more research...



    The X4500 does have full hardware video decoding (http://www.phoronix.com/scan.php?pag..._x4500hd&num=1).



    Quote:
    Originally Posted by winterspan View Post


    ...I've read some of Apple's pro apps won't even boot on the X3100!...



    Many pro apps, indeed.



    Quote:
    Originally Posted by winterspan View Post


    Intel has certainly never been popular for their chipsets, but I don't know.. Maybe nVidia's one-chip solution will have bugs, or their drivers will be weak, or it will run hot.. I guess anything is possible, but remember, this is Apple...



    I expect some first-revision bugs, probably a few lawsuits (I'm not being flippant here)... A risk that Apple has to take on. But in all seriousness, nVidia does make some kickass stuff; they are a competitor to Intel's chipsets and to AMD/ATI's chipsets and GPUs. nVidia has PhysX as well...



    Quote:
    Originally Posted by winterspan View Post


    Especially given that they will have a totally new chipset situation when Nehalem rolls around (no FSB, and Intel won't give nVidia the IP rights for QuickPath), I don't think they would roll this transition out unless they were absolutely certain it provides an advantage...



    Something must have spooked Apple about Nehalem if Apple is going to take at least a one-year detour from Intel chipsets...?



    Quote:
    Originally Posted by winterspan View Post


    Everything I have read says the graphics on the MacBook are definitely going to be integrated... but based upon the 9300M discrete card, or at its performance level. Oddly enough, there is no "9400" -- nVidia actually uses "9400" to describe the performance specification created when you combine the 9100M integrated graphics with a 9300M discrete card... I know, confusing, right? nVidia and their crazy naming schemes... *sigh*...



    Ah... good catch. There is no 9400. So we are looking at Hybrid SLI and Hybrid Power on the MacBooks, according to this line of rumours. An integrated 9100M, and a discrete 9300M G (or 9300M GS -- probably 9300M G). I expect this to bench around 1,500 to 2,000 in 3DMark06.



    Edit:

    The MCP79 is still "secret" so far. So it could be an integrated GPU based on a 9300 or 9400-esque level of performance. It could be 65nm, it could be 55nm... Do we have *any* corroborating evidence on what the heck is in the MCP79...?
  • Reply 66 of 99
    nvidia2008 Posts: 9,262 · member
    To repeat, there is one more thing Intel doesn't have. Nvidia now has it built into the GPU, even the lowly 9300M G, if I read correctly.



    PhysX.



    Beyond games, what if your MacBook was a nice little high-powered Physics simulator? I mean, it ain't gonna process Hadron collider data or anything, but PhysX acceleration is rather impressive.
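
    (To be clear, the sketch below is not PhysX code -- just plain C showing the kind of embarrassingly parallel math a physics engine grinds through every frame, which is why a GPU with dozens of shader cores eats it up:)

        /* Illustrative sketch (not the PhysX API): one Euler step over N
           independent particles under gravity -- exactly the data-parallel
           shape a GPU accelerates, one thread per particle. */
        #include <stdio.h>

        #define N  10000
        #define DT (1.0f / 60.0f)               /* 60 Hz timestep */

        typedef struct { float x, y, z, vx, vy, vz; } Particle;

        static Particle p[N];

        static void step(void) {
            for (int i = 0; i < N; i++) {       /* on a GPU: one thread each */
                p[i].vy -= 9.81f * DT;          /* gravity */
                p[i].x  += p[i].vx * DT;
                p[i].y  += p[i].vy * DT;
                p[i].z  += p[i].vz * DT;
                if (p[i].y < 0.0f) {            /* crude ground bounce */
                    p[i].y  = 0.0f;
                    p[i].vy = -0.5f * p[i].vy;
                }
            }
        }

        int main(void) {
            p[0].y = 100.0f;                    /* drop one particle from 100 m */
            for (int f = 0; f < 600; f++)       /* simulate 10 seconds */
                step();
            printf("particle 0 height after 10 s: %.2f m\n", p[0].y);
            return 0;
        }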



    Intel has Havok but it is not accelerated in the X4500 GPU, AFAIK...



    PhysX driver support for Windows should be here around now...

    http://blog.laptopmag.com/nvidia-unv...-support-by-q3



    I'm going to find and watch pretty PhysX demo videos.



    Because if the MacBook can do that, at least initially in Boot Camp, I am going to say: OMFG. Remember though, for Windows under Boot Camp you have to find the latest drivers yourself and tinker with them a little, as laptop drivers are usually provided through the manufacturer, which tends to ship older, more stable versions and may not update them very often.



    So. The big question is now back to Apple. What is OpenCL? Will PhysX ever be used in OpenCL or OpenPhysics or something like that?



    Could the GPU really be as powerful as the CPU? HD, H.264 Decode/Encode, Physics simulations, other great apps..?



    If within 10.5.5+ or 10.6 you could access PhysX and write a really amazing K-12/university physics education program/platform, that would be darn cool.



    Nice PhysX game demo:

    http://www.youtube.com/watch?v=tFzcBbdQEh4
  • Reply 67 of 99
    vinea Posts: 5,585 · member
    Heh, an $800 MB with decent integrated graphics kills the Mini as a product as it is.



    So, we can hopefully expect either a mini rev soon after this launch or a new Macbook for me.



    I have a Rev A MBP so I have zero qualms about a 1st rev MB based on nvidia.



    So how much faster could a new MB be over my 1st gen 17" MBP?
  • Reply 68 of 99
    nvidia2008 Posts: 9,262 · member
    Another cool PhysX video:

    http://www.youtube.com/watch?v=SS_8VIPGJsY

    http://www.youtube.com/watch?v=sNZbOEPEWFU

    Just a demo, but the finished product screams "playable on a MacBook" all over it. Hmm... interesting possibilities.



    This one is quite interesting, basic demo but fun because of the squashable frogs and bunnies.

    http://www.youtube.com/watch?v=gMQDPLcqt8s
  • Reply 69 of 99
    Quote:

    I do hope you're getting LittleBigPlanet when it comes out this month!



    Nooo, then I won't have any more free time....



    Quote:

    In both the Macbook and the Macbook Pro, we are not talking about the "best". The macbook currently uses the *worst* (mainstream) graphics solution on the market. Going to a new integrated chipset from nVidia is hardly overkill. Many media applications that use CoreImage/OpenGL can't even RUN on a macbook. Literally, it's not about bad performance -- it won't even LOAD.



    True, but you fail to see what I'm trying to say about the MBP: people are not happy with what the rumored MBP GPU is going to be (it's going to be only slightly better than the 8600M GT, but hey, for me that GPU is good enough for most usage). They want better, and most of those I've heard and read want one of the best mobile GPUs in the MBP. What's their reason? Games... "I want to play Crysis"... that's one of their reasons.



    I honestly thought Apple would be using Intel's Larrabee in the future...
  • Reply 70 of 99
    MacPro Posts: 19,821 · member
    Quote:
    Originally Posted by guinness View Post


    Yeah it is.



    The 8800GT gets about twice as many FPS as an 8600GTS for example.



    http://www.anandtech.com/video/showdoc.aspx?i=3140&p=8



    Off topic, sorry: this is a question for any of the gamer experts. I have to run both my ACDs on the 8800GT in an 8-core Mac Pro, even though I have an additional graphics card, because Apple's pro apps refuse to work correctly across two graphics cards. FCP doesn't allow preview on the second monitor if it's on a second card, as just one example. How much of a performance hit does sharing the 8800GT take when only using one of the ACDs for games?
  • Reply 71 of 99
    Quote:
    Originally Posted by wheelhot View Post


    I honestly thought Apple would be using Intel's Larrabee in the future...



    They probably will; it's almost a year away.
  • Reply 72 of 99
    matt_s Posts: 300 · member
    Quote:
    Originally Posted by solipsism View Post


    Questions and concerns I have about using NVIDIA over Intel chipsets:
    1. Is NVIDIA's track record in this area strong enough to make it a valid choice for Apple?




    That's the real question here, and thank goodness someone finally asked it.



    Latest Nvidia commentary off the web this morning:



    Quote:

    In July, a red-faced Nvidia announced it would be eating up to $200 million in repair, return and replacement costs because some significant number of its notebook graphics chips made it into the market with a flaw in the die/packaging material that caused a high failure rate. Apple, being an Nvidia customer, immediately asked about the chips in its Macs, and was assured at the time that it was unaffected by the problem processors. But, according to a new customer support post today, Apple did some poking around of its own and found that some MacBook Pros manufactured between May 2007 and September 2008 are indeed harboring vulnerable versions of Nvidia's GeForce 8600M GT graphics processor, with failure signaled by distorted or absent video. Apple said that MacBook owners so afflicted will be eligible for free repairs within two years of the original date of purchase, and that those who had already paid for such repairs would be reimbursed.



    Despite its troubles (which include a shareholder lawsuit over the handling of the faulty chip issue), Nvidia still has at least one loyal fan base, though not one it would want to embrace -- hackers (see "The Law of Unintended Consequences: a graphic example"). Seems the massively parallel processing capabilities of the graphics chips also lend themselves well to brute-force cracking, and the euphemistically named "password recovery software" sold by Russian firm Elcomsoft puts that power in the hands of the ill-intentioned masses. The latest warning of the ramifications comes from Global Secure Systems, which says the hardware-software combination renders Wi-Fi's WPA and WPA2 encryption systems pretty much useless. Using the graphics processors, hackers can break the commonly used wireless encryption schemes 100 times faster than with conventional microprocessors, GSS officials said. "This breakthrough in brute force decryption of Wi-Fi signals by Elcomsoft confirms our observations that firms can no longer rely on standards-based security to protect their data. As a result, we now advise clients using Wi-Fi in their offices to move on up to a VPN encryption system as well," said GSS managing director David Hobson. "Brute force decryption of the WPA and WPA2 systems using parallel processing has been on the theoretical possibilities horizon for some time - and presumably employed by relevant government agencies in extreme situations - but the use of the latest NVidia cards to speed up decryption on a standard PC is extremely worrying."



    At this point, I believe Mac users should be thoughtful, concerned and watchful, rather than giddy, elated and jumping for joy, when it comes to Apple's choice of more Nvidia product for the new MacBooks and MacBook Pros.
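
    (For the technically curious, here's why GPUs speed this up so much: WPA/WPA2-PSK derives its master key with PBKDF2 -- 4,096 HMAC-SHA1 rounds over the passphrase, salted with the SSID -- and every candidate passphrase can be tested independently of every other. Below is a rough single-threaded sketch of that per-candidate workload using OpenSSL; the SSID and wordlist are made up, and a real GPU cracker runs thousands of these derivations in parallel:)

        /* Sketch of the WPA dictionary-attack workload. Single-threaded here;
           GPU crackers run thousands of these PBKDF2 computations at once.
           Build with: cc wpa_pmk.c -lcrypto */
        #include <stdio.h>
        #include <string.h>
        #include <openssl/evp.h>

        /* WPA/WPA2-PSK: PMK = PBKDF2-HMAC-SHA1(passphrase, SSID, 4096, 32 bytes) */
        static void derive_pmk(const char *passphrase, const char *ssid,
                               unsigned char pmk[32]) {
            PKCS5_PBKDF2_HMAC_SHA1(passphrase, (int)strlen(passphrase),
                                   (const unsigned char *)ssid, (int)strlen(ssid),
                                   4096, 32, pmk);
        }

        int main(void) {
            const char *candidates[] = { "password", "letmein", "dragon" };
            unsigned char pmk[32];
            for (int i = 0; i < 3; i++) {
                derive_pmk(candidates[i], "linksys", pmk);  /* independent work */
                printf("%-10s -> %02x%02x%02x...\n",
                       candidates[i], pmk[0], pmk[1], pmk[2]);
            }
            return 0;
        }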
  • Reply 73 of 99
    haggar Posts: 1,568 · member
    Quote:
    Originally Posted by winterspan View Post


    "bang for the buck" is not what Quadro cards are about. They are professional grade cards that are optimized for very high-reliability and high-accuracy rendering for CAD/CAM/3D modeling. The drivers are much more rigorously verified and offer many advanced OpenGL features not available in Geforce cards. On the other hand, Geforce cards are optimized for high frame-rates and performance, not accuracy and reliability.



    So your response to why Apple does not offer SLI on the Mac Pro is "Quadro cards are professional cards". How does that address the issue at all?



    Well, if a Quadro card is so great and so professional, then wouldn't 2 Quadro cards running SLI be even better? Does running a pair of Quadro cards in SLI make the Quadro less "professional" somehow? If Apple wants Macs to be the best platform for graphics and design, then shouldn't Macs have the best graphics options available? An Apple logo alone does not make up for subpar hardware.
  • Reply 74 of 99
    tenobell Posts: 7,014 · member
    Quote:
    Originally Posted by wheelhot View Post


    Then get a Windows notebook? It's cheap and you get better specs than the MB or MBP. Except that you lose the beauty of OS X, but hey, it's a compromise.



    Which gaming notebooks are those, that cost less than the MacBook with better specs?
  • Reply 75 of 99
    tenobell Posts: 7,014 · member
    Quote:
    Originally Posted by solipsism View Post


    Questions and concerns I have about using NVIDIA over Intel chipsets:





    • How much smaller is this combined Northbridge/Southbridge chipset than Intel's offerings? Combining chips into one is great, but if the result is only a couple percent smaller, it becomes a futile reason for the change.



    • How much faster GPU performance can we expect over Intel's GMA X4500? What about the other chipset options like lower-power WiFi, WiMAX, H.264/VC1 encoding/decoding, and other nifty aspects of Montevina?



    • I've read plenty of pros regarding GPU performance with NVIDIA, but even the GMA950 was never an issue for me, so what potential cons are there in Apple moving to NVIDIA chipsets?





    Hopefully Apple has tested the chipsets enough to ensure their durability. If I were to be concerned, I would be less concerned about the machine breaking right when I first bought it and more concerned about it breaking right after my AppleCare expires.



    About your questions on performance: is there any way to really know the answers, since the chipset hasn't been formally introduced?
  • Reply 76 of 99
    I would trust Nvidia in the new Macs for one simple reason .....



    If they get this wrong, they are in serious, serious trouble.



    This chipset is going to be as important as their chips in the original Xbox; it needs to JUST WORK.



    The solder issue in the existing MacBook Pros and other notebooks was an expensive mistake, but it was just one simple thing, and we shouldn't question everything they release based on this one issue.



    Apple generally releases 'appliance'-type PCs: limited configurations, extremely limited hardware choices, and no choice of OS. This makes them very similar to an Xbox in many ways, and I think they will get it right, mainly because they have done so before... and it really isn't that hard to design a solid 'appliance'.



    It's true that Dell and HP only offer one OS choice as well, but that OS (Vista) has to deal with zillions of hardware variations, making it a much less reliable OS overall.



    MBs and MBPs are like the cell phones of the PC market, and if Nvidia can't get it right (and personally I think they can), their future is in serious question.



    And that alone should give you some extra confidence.
  • Reply 77 of 99
    bancho Posts: 1,517 · member
    I think one of the key points here is that they'll be thoroughly updating the Macbooks and that my current black 1st gen Macbook will be utterly trounced in the graphics department by any of the rumored chipset upgrades. Color me excited. I'll be upgrading for certain this time.
  • Reply 78 of 99
    Marvin Posts: 15,441 · moderator
    Quote:
    Originally Posted by nvidia2008 View Post


    Beyond games, what if your MacBook was a nice little high-powered Physics simulator? I mean, it ain't gonna process Hadron collider data or anything, but PhysX acceleration is rather impressive.



    I don't think PhysX is supported on the mobile chips, though. The MacBooks will still be low-end, but just much more usable for the majority of apps.



    Quote:
    Originally Posted by nvidia2008 View Post


    So. The big question is now back to Apple. What is OpenCL? Will PhysX ever be used in OpenCL or OpenPhysics or something like that?



    Could the GPU really be as powerful as the CPU? HD, H.264 Decode/Encode, Physics simulations, other great apps..?



    I think that on the lower-end chips, they will be able to maybe halve standard encoding times relative to the CPU in the MacBook. It won't be quite what we get from the much higher-end chips, but still a good improvement.



    Quote:
    Originally Posted by digitalclips


    How much of a performance hit does sharing the 8800GT take when only using one of the ACDs for games?



    I would say none if it's not drawing updates to it. You could simply unplug one and test the framerate but I reckon you will see no drop in performance.



    Quote:
    Originally Posted by vinea


    Heh, an $800 MB with decent integrated graphics kills the Mini as a product as it is.



    So, we can hopefully expect either a mini rev soon after this launch or a new Macbook for me.



    A Mini with the Nvidia chip based on the 9300/9400 should be enough to stop (my) complaints about the Mini being crippled. I think it has to get Nvidia chips to be OpenCL compatible or it will have to be discontinued. If it does get bumped, the GPU performance will go up by a factor of about 5-10.



    If they have a good GPU at £399, I would quite like to see an SSD option using OCZ drives, or at least 7200 rpm drives. The laptop drive kills the Mini's performance too. An easier-to-access design would help greatly. I wish they'd make it so that the lid just flipped up, you simply slot RAM in at the front, and the drive sat on a tray that could be pulled out.
  • Reply 79 of 99
    matt_s Posts: 300 · member
    Quote:
    Originally Posted by Bancho View Post


    I think one of the key points here is that they'll be thoroughly updating the Macbooks and that my current black 1st gen Macbook will be utterly trounced in the graphics department by any of the rumored chipset upgrades. Color me excited. I'll be upgrading for certain this time.



    Are you unconcerned about the WiFi security holes accessible via the Nvidia devices?
  • Reply 80 of 99
    Quote:
    Originally Posted by digitalclips View Post


    How much of a performance hit does sharing the 8800GT take when only using one of the ACDs for games?



    Quote:
    Originally Posted by Marvin View Post


    I would say none if it's not drawing updates to it. You could simply unplug one and test the framerate but I reckon you will see no drop in performance.



    Back when the OS desktop was just a 2D image, there was no performance impact from dual monitors, because drawing 2D takes almost zero power from the GPU.



    Now every operating system uses a compositing window manager, so all the stuff on screen takes up video memory, and there are 3D effects... it may have a slight, but noticeable, effect on frame rates. I wouldn't worry, though.