Am I the only one excited about nVidia chipsets?

mav
Posted in Future Apple Hardware, edited January 2014
Hey all, just joined.



I've been waiting for these new MacBooks for ages now; I don't actually have a Mac yet.



Anyway, I've had many PCs, and almost all of them have had Nvidia chipsets and graphics.



One thing you all need to understand about Nvidia chipsets: yes, they get hot and draw a lot of power, but that's only true of the extreme high-end boards. Something like the GeForce 8200 integrated chipset uses far less power (actually a little less than the AMD/ATI 780G, which is touted for its power saving).



I've owned an nForce 2 Ultra 400 with an Athlon XP; it's over five years old and still running. I currently have a 750i SLI board and have had zero issues with it; I overclock my 1.8GHz Core 2 Duo to 3GHz and run two 9800GTs in SLI. Two of my friends have 650i SLI boards, also with no issues....



I have no idea why nvidia chipsets get such a bad reputation.

Comments

  • Reply 1 of 19
    Nvidia's current bad rep comes mainly from the 680i and 780i boards. They overheat to the point of instability, unless you use the whiny little fan they come with. And their Windows drivers have been very hit-or-miss lately.



    I wouldn't worry about an Nvidia chipset MacBook, though. Their IGP and mobile stuff is fine and has some advantages as a single-chip solution. Apple would be conservative with the driver in favor of stability.



    I don't necessarily believe this rumor is true, but it is certainly plausible.
  • Reply 2 of 19
    Quote:
    Originally Posted by FuturePastNow View Post


    Nvidia's current bad rep comes mainly from the 680i and 780i boards. They overheat to the point of instability, unless you use the whiny little fan they come with. And their Windows drivers have been very hit-or-miss lately.



    I wouldn't worry about an Nvidia chipset MacBook, though. Their IGP and mobile stuff is fine and has some advantages as a single-chip solution. Apple would be conservative with the driver in favor of stability.



    I don't necessarily believe this rumor is true, but it is certainly plausible.



    Given Apple's penchant for going with the cheapest possible LCD, putting GPUs without drivers that even function, etc. in their laptops, I for one am NOT going to be a guinea pig for whatever crap they are putting in the next MBP. I'm sure it will run iLife '08 and Mail, but GOD FORBID you try and run any games or virtual reality stuff or anything written by, oh, someone other than Crapple.



    But I'll believe it when I see it.
  • Reply 3 of 19
    mrshow Posts: 164, member
    Quote:
    Originally Posted by darkgoob View Post


    Given Apple's penchant for going with the cheapest possible LCD, putting GPUs without drivers that even function, etc. in their laptops, I for one am NOT going to be a guinea pig for whatever crap they are putting in the next MBP. I'm sure it will run iLife '08 and Mail, but GOD FORBID you try and run any games or virtual reality stuff or anything written by, oh, someone other than Crapple.



    But I'll believe it when I see it.



    Cheapest possible LCD? GPUs without drivers?



    Not sure what you were using but I ran WoW fine on my MBP before it was stolen.



    Crapple? Really?
  • Reply 4 of 19
    GPUs without drivers? Sounds like Intel: their built-in GPUs look good on paper, but their drivers suck.
  • Reply 5 of 19
    mav Posts: 7, member
    Quote:
    Originally Posted by FuturePastNow View Post


    Nvidia's current bad rep comes mainly from the 680i and 780i boards. They overheat to the point of instability, unless you use the whiny little fan they come with. And their Windows drivers have been very hit-or-miss lately.



    I wouldn't worry about an Nvidia chipset MacBook, though. Their IGP and mobile stuff is fine and has some advantages as a single-chip solution. Apple would be conservative with the driver in favor of stability.



    I don't necessarily believe this rumor is true, but it is certainly plausible.



    I've owned a 680i as well; sold it a while ago. It never caused issues.



    I've found Nvidia's drivers to be brilliant; they have lots of features and never seemed unstable.



    And you raise a good point about the single-chip design.



    I think all signs point to yes on this rumor, though. I mean, look how long the Montevina/Centrino 2 platform has been out; surely it doesn't take them forever to make a metal chassis. They've been waiting on the Nvidia chip, IMO.
  • Reply 7 of 19
    macnub Posts: 18, member
    I'm with you. I'm a long-time PC user, I've always been a fan of Nvidia over ATI, and I've always gotten better results with most popular games.
  • Reply 8 of 19
    Quote:
    Originally Posted by MacNub View Post


    I'm with you. I'm a long-time PC user, I've always been a fan of Nvidia over ATI, and I've always gotten better results with most popular games.



    Quoting the post you're responding to, rather than just replying after the last poster, helps with clarity. Read your comment as a follow-up to the previous poster and you'll see my point.
  • Reply 9 of 19
    wizard69 Posts: 13,377, member
    I've noticed a few postings from the PC side of life here, so let me tell you guys not to get too wrapped up in drivers when it comes to Macs.



    First, Apple is very conservative with respect to video drivers, so you won't see the same performance as you get on the PC side. Second, Apple distributes its driver updates in its own sweet time; frankly, there is not the ravenous interest that there is with PC drivers. Third, gaming just isn't a big part of the Mac OS world.



    It is certainly interesting to speculate about future Apple hardware, but in the end, interest in the details of the hardware drops off after the product is delivered. At least it seems that way. Frankly, I wouldn't be surprised if they went the AMD route for something like the mini's replacement.





    Dave
  • Reply 10 of 19
    robb01 Posts: 148, member
    I'm hoping they improve gaming a bit.






  • Reply 11 of 19
    mav Posts: 7, member




    Those "crappy drivers" are problems with Vista; they don't reflect how drivers will behave on a Unix-based system.



    Overheating, well, that's an issue. But the G80, G84, and G86 are older 90nm/80nm parts.



    If the new MacBooks have an Nvidia chipset, it'll be the 55nm MCP79.



    As the owner of a 9800GT (G92b, a 55nm chip), I can tell you it runs a good 30°C cooler than my mate's 8800GT, which is a 65nm chip.



    Whatever the issue is with those older dies, it shouldn't be present in the newer ones.
  • Reply 12 of 19
    ksec Posts: 1,569, member
    Well, the main purpose of an Nvidia chipset would be CUDA, aka OpenCL. (I expect OpenCL to adopt quite a lot from CUDA.)



    That suits Apple really well, since the current G92b is very much built around CUDA. That is why ATI has the edge in gaming right now.



    I hope Nvidia's PureVideo HD will be better this time around and that Apple provides some support for it.



    If everything goes well, it could be the best Nvidia product this year, provided Tegra doesn't suddenly pop out in '08.
  • Reply 13 of 19
    bbwi Posts: 812, member
    Quote:
    Originally Posted by Mav View Post




    Whatever the issue is with those older dies, it shouldn't be present in the newer ones.





    Only if they learn from their mistakes
  • Reply 14 of 19
    mav Posts: 7, member
    Quote:
    Originally Posted by bbwi View Post


    Only if they learn from their mistakes



    Yeah, true, but what I'm saying is that my 55nm 9800GT runs at about 50°C, compared to the 65nm 8800GT at 70-80°C.



    So the improvements in fabrication are already apparent, considering the problem was overheating.
  • Reply 15 of 19
    nvidia2008 Posts: 9,262, member
    I like nVidia stuff.
  • Reply 16 of 19
    bbwi Posts: 812, member
    Quote:
    Originally Posted by Mav View Post


    Yeah, true, but what I'm saying is that my 55nm 9800GT runs at about 50°C, compared to the 65nm 8800GT at 70-80°C.



    So the improvements in fabrication are already apparent, considering the problem was overheating.



    By that logic, there should have been no heat issues with the 65nm chips, because that manufacturing process is better than 90nm.



    If you don't put a fan or heatsink on your 55nm chip, it will burn up. Heat issues have nothing to do with nanometers and everything to do with heat dissipation.
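
    For what it's worth, the die-shrink debate in this exchange can be made concrete with the textbook CMOS dynamic-power relation, P ≈ C·V²·f: shrinking a process lowers switching capacitance and usually supply voltage, which cuts power at the same clock. A minimal Python sketch; the capacitance and voltage figures below are illustrative assumptions, not real specs for any GPU:

    ```python
    # CMOS dynamic switching power scales roughly as P ~ C * V^2 * f.
    def dynamic_power(capacitance: float, voltage: float, freq_hz: float) -> float:
        """Relative dynamic switching power of a CMOS chip (arbitrary units)."""
        return capacitance * voltage ** 2 * freq_hz

    # Hypothetical 65nm part vs. a 55nm shrink of the same design at the same clock:
    p_65nm = dynamic_power(capacitance=1.0, voltage=1.2, freq_hz=600e6)
    p_55nm = dynamic_power(capacitance=0.8, voltage=1.1, freq_hz=600e6)

    # With these assumed numbers, the shrink draws roughly two-thirds the power.
    print(f"55nm part draws {p_55nm / p_65nm:.0%} of the 65nm power")
    ```

    So both posters have a point: a shrink genuinely reduces the heat produced, but whatever power remains still has to be dissipated by a heatsink or fan.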
  • Reply 17 of 19
    nvidia2008 Posts: 9,262, member
    Quote:
    Originally Posted by bbwi View Post


    By that logic, there should have been no heat issues with the 65nm chips, because that manufacturing process is better than 90nm.



    If you don't put a fan or heatsink on your 55nm chip, it will burn up. Heat issues have nothing to do with nanometers and everything to do with heat dissipation.



    I've only used Nvidia GPUs and Mobos in overclocking scenarios.



    Asus 6600GT : overclocked using Zalman (air)

    No heat issues; got a good gaming experience, especially with Need for Speed: Most Wanted



    Gigabyte 8500GT : factory overclocked, used Zalman aftermarket cooler (air)

    Good stuff



    XpertVision 8500GT : factory overclocked, stock fan

    Good stuff



    Used the two 8500GTs on an Nvidia nForce 650i SLI board (Gigabyte DS4)

    Good stuff



    The southbridges on Nvidia chipset motherboards can get a bit hot and come with a small, noisy fan, depending on the motherboard manufacturer... That's why I always go for a motherboard with a passively cooled southbridge (with decent heatpipes, etc.).
  • Reply 18 of 19
    gwmac Posts: 1,807, member
    If Apple uses these new Nvidia chipsets, I am betting the one they use has addressed these heat issues. The last thing Apple needs right now is a black eye from overheating MacBooks that would need to be recalled. I really don't think we need to worry about heat.
  • Reply 19 of 19
    This rumor seems very plausible to me. I give it a 50/50 chance of being true.