Apple dumping Intel chipsets for NVIDIA's in new MacBooks


  • Reply 21 of 99
    ivladivlad Posts: 742member
    I can see that happening, just a stopgap before Apple makes its own chips.
  • Reply 22 of 99
    Quote:
    Originally Posted by AHeneen View Post


    I'm not a gamer, but why would you need more power than that? Why wouldn't you just get a desktop? If there are more powerful gpu's out there, there is probably a good reason Apple doesn't include one: too much heat would require a larger heatsink and Apple doesn't want to make a thicker laptop, it is not compatible with the current hardware on the MBP, it is too hard/not possible to write drivers for it, etc. At any rate, you can be certain it will not be finding its way into a MacBook!



    That is precisely my point. Those nVidia chips aren't exactly powerhouses, so IF they draw 10% or more extra power and run noticeably hotter, then they are a bad choice for the MacBook, which should use the most efficient chips available, not the most powerful.
  • Reply 23 of 99
    Quote:
    Originally Posted by solipsism View Post


    If they are hotter then they would need larger heat sinks, fans venting, etc. to dissipate heat, which may mean a thicker case to allow for proper air flow, if a more clever option isn't used.



    It's for this reason that I think the NVIDIA option is still very unlikely over Montevina.



    I don't think that the nVidia integrated chips run too much hotter than the 4500HD. If the temps are close, then I see no reason why a new aluminum MacBook cannot effectively use the nVidia stuff.



    Apple could also include a 2nd fan in the MacBook.
  • Reply 24 of 99
    Quote:
    Originally Posted by applebook View Post


    Those nVidia chips mentioned will still not allow MacBook owners to play recent and demanding games. Even the 8600GT in the MBP cannot play the new stuff without toning down the resolution and details.



    A friend of mine uses the 8800GT and can run Crysis on high... so an 8600GT must not be too far behind.



    And most games are not that demanding.





    But yeah...
  • Reply 25 of 99
    Quote:
    Originally Posted by Fishyesque View Post


    A friend of mine uses the 8800GT and can run Crysis on high... so an 8600GT must not be too far behind.



    And most games are not that demanding.





    But yeah...



    Yeah it is.



    The 8800GT gets about twice as many FPS as an 8600GTS for example.



    http://www.anandtech.com/video/showdoc.aspx?i=3140&p=8
  • Reply 26 of 99
    solipsismsolipsism Posts: 25,726member
    Quote:
    Originally Posted by applebook View Post


    I don't think that the nVidia integrated chips run too much hotter than the 4500HD. If the temps are close, then I see no reason why a new aluminum MacBook cannot effectively use the nVidia stuff.



    Apple could also include a 2nd fan in the MacBook.



    According to AnandTech, the X4500HD has a lower TDP, H.264 decode acceleration, 25% more shader processors, and a faster clock speed than the X3100.



    I'm not discounting the possibility that NVIDIA has a better option, I'm just saying that I haven't seen any proof that would make the switch viable. You know I love verifiable stats, so please post them if you have them. For example, with the PPC to Intel switch we knew PPC wasn't improving fast enough to stay viable in the PC market and that Intel's roadmap was very sound.



    And a 2nd fan means more space, plus more power spent running a fan to cool a chip that is already using more power than the alternatives, which means a larger battery or less overall battery life. None of that sounds like a reason to switch.
  • Reply 27 of 99
    Quote:
    Originally Posted by guinness View Post


    Yeah it is.



    The 8800GT gets about twice as many FPS as an 8600GTS for example.



    http://www.anandtech.com/video/showdoc.aspx?i=3140&p=8



    Ohhhhh right right right. When building my friend's computer we got him an 8800, not an 8600, but I thought it was the other way around and had to double-check.



    *is still excited for Tuesday*
  • Reply 28 of 99
    So if Apple starts using NVIDIA chipsets, does that mean they can single-handedly prevent AMD from going under and Intel from gaining near-monopolistic control of the industry?
  • Reply 29 of 99
    Quote:
    Originally Posted by AHeneen View Post


    I'm not a gamer, but why would you need more power than that? Why wouldn't you just get a desktop? If there are more powerful gpu's out there, there is probably a good reason Apple doesn't include one: too much heat would require a larger heatsink and Apple doesn't want to make a thicker laptop, it is not compatible with the current hardware on the MBP, it is too hard/not possible to write drivers for it, etc. At any rate, you can be certain it will not be finding its way into a MacBook!



    Perhaps because most people don't want to buy a laptop AND a desktop. Also, we are talking about the Macbook PRO, as in PROFESSIONAL APPLICATIONS. Graphics acceleration is beneficial to MUCH more than just gaming, particularly with professional media applications that use OpenGL/CoreImage for transforms and rendering...



    Quote:
    Originally Posted by matt_s View Post


    Here's hoping this chipset doesn't end up with the same issues as NVIDIA has in the MacBook Pro models.



    I've already commented on this, but I wouldn't be afraid of problems with Nvidia's chipsets. They had a manufacturing issue with solder that would fail at high temperatures and have since switched to a different material without those problems. The units having problems are months to years old. Especially given their $250 million write-down for repairs, I'm sure they've made certain everything is working correctly.



    Quote:
    Originally Posted by FuturePastNow View Post


    I wonder if the Pro, since it has a discrete GPU, will still use an Intel chipset (PM45)? I bet it will. Also:

    I doubt Apple is going to use any of those things. DriveCache/Turbo Memory/ReadyBoost/hybrid hard drives were gimmicks that no one uses, and Hybrid SLI requires a discrete GPU in addition to the IGP anyway.



    Hopefully not. The nVidia chipset will be great, especially if Apple enables "Hybrid power". This will allow the Macbook Pro to shut off the discrete GPU and use the low-power integrated graphics on the motherboard when not under heavy load, thus saving a lot of power!
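    The switching behavior described above can be sketched as a simple threshold policy. This is purely illustrative: the real Hybrid Power decision is made by the chipset and driver, and the utilization threshold and battery rule here are invented for clarity.

    ```python
    # Illustrative sketch of a Hybrid Power-style switching policy.
    # The actual decision lives in the chipset/driver; the 0.5 threshold
    # and the battery rule below are made-up values for demonstration.

    def choose_gpu(gpu_utilization: float, on_battery: bool) -> str:
        """Return which GPU should drive the display."""
        # Sustained heavy 3D load justifies the discrete GPU's power
        # draw, but only when wall power is available.
        if gpu_utilization > 0.5 and not on_battery:
            return "discrete"
        # Light load (or running on battery): stay on the low-power
        # integrated chipset.
        return "integrated"

    print(choose_gpu(0.9, on_battery=False))  # heavy load on AC
    print(choose_gpu(0.1, on_battery=True))   # light load on battery
    ```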
  • Reply 30 of 99
    Quote:
    Originally Posted by BB Sting View Post


    So if Apple starts using NVIDIA chipsets, does that mean they can single-handedly prevent AMD from going under and Intel from gaining near-monopolistic control of the industry?



    Correct me if I'm wrong, but AMD is in the CPU business, not the graphics accelerator card or other higgledy-piggledy (sorry if I'm getting too technical) business.
  • Reply 31 of 99
    Quote:
    Originally Posted by solipsism View Post


    I'm not discounting the possibility that NVIDIA has a better option, I'm just saying that I haven't seen any proof that would make the switch viable. You know I love verifiable stats, so please post them if you have them. For example, with the PPC to Intel switch we knew PPC wasn't improving fast enough to stay viable in the PC market and that Intel's roadmap was very sound.

    And a 2nd fan means more space, plus more power spent running a fan to cool a chip that is already using more power than the alternatives, which means a larger battery or less overall battery life. None of that sounds like a reason to switch.



    The "proof of viability" should be the vastly superior graphics performance and video decoding. Nvidia's and AMD's implementations of OpenGL, DirectX 10, and H.264/VC-1 video decoding are far superior to Intel's GMA, even the latest version.



    Quote:
    Originally Posted by solipsism View Post


    If they are hotter then they would need larger heat sinks, fans venting, etc. to dissipate heat, which may mean a thicker case to allow for proper air flow, if a more clever option isn't used. It's for this reason that I think the NVIDIA option is still very unlikely over Montevina.



    Intel's GMA has never been known to be incredibly efficient, and it really depends on what process technology Nvidia is making the chipsets on. Also, remember their chipset is a SINGLE chip solution with a combined northbridge/southbridge which should result in a lower TDP.



    Quote:
    Originally Posted by iVlad View Post


    I can see that happening, just a stopgap before Apple makes its own chips.



    Highly unlikely. Developing modern, efficient and capable graphics chipsets is obviously not a simple task as shown by the continual struggle of Intel.
  • Reply 32 of 99
    I smell MXM on the horizon. Cross your fingers.



    C
  • Reply 33 of 99
    solipsismsolipsism Posts: 25,726member
    Quote:
    Originally Posted by winterspan View Post


    They had a manufacturing issue with solder that would fail at high temperatures and have since switched to a different material without those problems. The units having problems are months to years old. Especially given their $250 million write-down for repairs, I'm sure they've made certain everything is working correctly.



    I'm still a bit wary and will be holding off my purchase of a new Mac until I'm certain this is no longer an issue... assuming they do move to NVIDIA for their chipsets, which I still think is unlikely.



    Quote:

    Hopefully not. The nVidia chipset will be great, especially if Apple enables "Hybrid power". This will allow the Macbook Pro to shut off the discrete GPU and use the low-power integrated graphics on the motherboard when not under heavy load, thus saving a lot of power!



    Isn't this what OpenCL was going to address in SL?
  • Reply 34 of 99
    solipsismsolipsism Posts: 25,726member
    Quote:
    Originally Posted by winterspan View Post


    Intel's GMA has never been known to be incredibly efficient, and it really depends on what process technology Nvidia is making the chipsets on. Also, remember their chipset is a SINGLE chip solution with a combined northbridge/southbridge which should result in a lower TDP.



    Then that is excellent news! AnandTech lists the X4500HD at a 12.5W TDP. If NVIDIA can lower that plus shrink the rest of the chip area needed, then I can see why Apple would consider that route.
  • Reply 35 of 99
    dr_lhadr_lha Posts: 236member
    Quote:
    Originally Posted by SpamSandwich View Post


    Correct me if I'm wrong, but AMD is in the CPU business, not the graphics accelerator card or other higgledy-piggledy (sorry if I'm getting too technical) business.



    AMD owns ATI, so they are definitely in the graphics accelerator business.
  • Reply 36 of 99
    The Geforce 9100M is the currently known integrated/motherboard graphics chipset for nVidia laptop motherboards. According to the information known, the MacBook 'MCP79' board will have a new integrated graphics chipset based on some version of their discrete Geforce 9300M/9400M cards, but it will depend on the specific motherboard model that Apple uses. If I were to guess, I'd bet the performance will be a bit better than the existing "9300M G" (which would be nearly 2X as fast as the latest Intel GMA). Hopefully the Macbook Pro also uses an nVidia board and Apple implements their "Hybrid Power" technology. It will allow the Macbook Pro to turn off its discrete GPU when not in heavy use and use the integrated chipset instead to save power.



    I put together this graphic just to serve as a rough performance comparison across nVidia's latest mobile GPU generation.



    notes:

    - On the graph I included the 8600GT which is the current Macbook Pro card to show as comparison. I believe the current Macbook uses the GMA X3100.

    - SP stands for "Stream Processors" (shaders), which are the SIMD processing units that make up nVidia's modern GPUs.

    - Nvidia GPUs have three separate clock frequencies -- one each for the main clock, shader processor clock, and memory clock. On the graph, only the shader clock is listed.

    - 3DMark05 and 3DMark06 are standard graphics performance benchmarks.

    - I did the best I could and double-checked the data, but I cannot make *ANY GUARANTEES* about the accuracy of the information on the chart.



  • Reply 37 of 99
    nvidia2008nvidia2008 Posts: 9,262member
    Guys, for mobile gaming, 9300 on a 13", and a 9600 on a 15", is pretty darn good already.



    The 9600 of course cannot compete with desktop 8800s but a 9600 with 512MB VRAM is pretty darn good. Medium settings for the latest games, yes, but it can move decent frame rates with 4xAA.



    If the 13" MacBook's 9300 can run C&C3 on medium settings, that is, close to the ATI 2400 in iMacs, that's already darn good.



    Throw in OpenCL, etc. and the nVidia move makes sense.



    Apple just doesn't like "Centrino" because every man and his dog makes laptops off that. Apple is tired of competing on specs with such a wide range of competitors, IMO.



    Can you imagine, for the first time, GRAPHICS AND GAMING on the MACBOOK is going to be an Apple Advantage, if this is all true! This is HUGE!



    Anyone that wants more than a 9600 mobile chip, well, there's always desktops with an 8800GT, 8800GT SLI, 9800, and GTX 260 or GTX 280 overclocked...
  • Reply 38 of 99
    nvidia2008nvidia2008 Posts: 9,262member
    Quote:
    Originally Posted by BB Sting View Post


    So if Apple starts using NVIDA chipsets does that mean that they can single handedly prevent AMD from going under and Intel having near monopolistic control of the industry.



    AMD has been somewhat f*ked for a few years now. There is increasing sell-off to Middle East oil interests.
  • Reply 39 of 99
    Is that data from NotebookCheck? There is some variance in the numbers, since 3DMark06 especially is strongly influenced by RAM and CPU. Nonetheless, thanks for the info.



    IMO, the most important thing to look at is the 3DMark06 score. I know it's not perfect, but in my view it is the most relevant in modern graphic card comparisons.



    Let's take the 8600M GT. By mobile gaming standards, it is really not bad, and pushes 3000 in 3DMark06. The 9600M GT goes up to 5000. That's really pretty impressive for mobile. If you've got Hybrid mode, then that reduces power drain in MBPs a lot.



    The 9300, while not great, does 1000 to 2000 in 3DMark06, which means reasonable, acceptable performance on medium settings for Mac games, and low-to-medium settings for the latest hardcore 3D Windows games. That really ain't that bad.
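    As a back-of-the-envelope check on the scores quoted above (approximate community figures, not official benchmark results; the 9300 number is the midpoint of the quoted range):

    ```python
    # Rough relative-performance comparison from the approximate 3DMark06
    # scores mentioned above. Illustrative figures only.
    scores_3dmark06 = {
        "8600M GT": 3000,  # "pushes 3000 in 3DMark06"
        "9600M GT": 5000,  # "goes up to 5000"
        "9300": 1500,      # midpoint of the quoted 1000-2000 range
    }

    baseline = scores_3dmark06["8600M GT"]
    for gpu, score in scores_3dmark06.items():
        print(f"{gpu}: {score / baseline:.2f}x the 8600M GT")
    ```

    On these numbers the 9600M GT lands at roughly 1.67x the 8600M GT, and the 9300 at about half of it, which matches the "medium settings" expectations above.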



    Could Apple actually be reversing its haughty stance on games? With the iPhone and iPod becoming game platforms in and of themselves, now portable Macs??? *shocked*



    Quote:
    Originally Posted by winterspan View Post


    The Geforce 9100M is the currently known integrated/motherboard graphics chipset for nVidia laptop motherboards. According to the information known, the MacBook 'MCP79' board will have a new integrated graphics chipset based on some version of their discrete Geforce 9300M/9400M cards, but it will depend on the specific motherboard model that Apple uses. If I were to guess, I'd bet the performance will be a bit better than the existing "9300M G". Hopefully the Macbook Pro also uses an nVidia board and Apple implements their "Hybrid Power" technology. It will allow the Macbook Pro to turn off its discrete GPU when not in heavy use and use the integrated chipset instead to save power.



    I put together this graphic just to serve as a rough performance comparison across nVidia's latest mobile GPU generation.



    notes:

    - On the graph I included the 8600GT which is the current Macbook Pro card to show as comparison. I believe the current Macbook uses the GMA X3100.

    - SP stands for "Stream Processors" (shaders), which are the SIMD processing units that make up nVidia's modern GPUs.

    - Nvidia GPUs have three separate clock frequencies -- one each for the main clock, shader processor clock, and memory clock. On the graph, only the shader clock is listed.

    - 3DMark05 and 3DMark06 are standard graphics performance benchmarks.

    - I did the best I could and double-checked the data, but I cannot make *ANY GUARANTEES* about the accuracy of the information on the chart.







  • Reply 40 of 99
    Quote:
    Originally Posted by solipsism View Post


    Isn't this what OpenCL was going to address in SL?



    I'm not sure whether this was going to be a capability of OpenCL or not. If I had to guess, I'd say no, considering OpenCL sits at the OS layer rather than at the low-level hardware/BIOS level needed to enable this functionality. Almost always, laptop chipsets/motherboards have come with either integrated graphics *OR* a discrete card, not both. The capability to switch back and forth was recently developed by both AMD and Nvidia, and requires explicit motherboard support.

    However, assuming Apple implements this capability, OpenCL should be able to use both the integrated graphics and the discrete card, since they share the same general architecture/driver model/virtual instruction set/etc.