Nvidia 9800GX2 - Damn it Apple!

Posted in Future Apple Hardware, edited January 2014
It's not exactly new news, but it's news, and it pisses me off! A few days ago Tom's Hardware scored some exclusive pictures of the Nvidia 9800GX2, which is basically two Nvidia G92 GPUs SLI'd together on one card. Reports say it's about 30% faster than the 8800 Ultra. I don't even want to think about it, other than hoping Apple offers one before my Mac ships so I can change my order.

Comments

  • Reply 1 of 75
    onlooker Posts: 5,252 member
    One good thing right now is that the new Mac Pro has PCI-E 2.0. There is some hope.

    Although I cannot figure out why they are calling it the 9800GX2. It really should be more like 8950GX2, because the GPU is not a new generation; it shouldn't be bumped to 9xxx.
  • Reply 2 of 75
    Marvin Posts: 15,326 moderator
    Hardly anybody needs anything faster than the 8800 right now, other than to say they have the fastest going. If Crysis plays like this on the Mac Pro, it really just isn't necessary to get something faster:



    http://uk.youtube.com/watch?v=OAYg8bzehPQ



    Given that it took Apple more than a year to get the 8800 out, we might see support for this in 2009.



    I wonder if GPUs will start to go the same way as CPUs. Instead of faster, just put in more than one.
  • Reply 3 of 75
    onlooker Posts: 5,252 member
    I'll probably have to get XP to play Crysis, because I hear it's dead weight in Vista.
  • Reply 4 of 75
    Quote:
    Originally Posted by onlooker View Post


    I'll probably have to get XP to play Crysis, because I hear it's dead weight in Vista.



    If you're gonna play games you need XP. Vista runs games significantly worse.
  • Reply 5 of 75
    onlooker Posts: 5,252 member
    Quote:
    Originally Posted by CoreyMac View Post


    If you're gonna play games you need XP. Vista runs games significantly worse.



    But XP does not have DX10
  • Reply 6 of 75
    At the moment, all games can run on XP.

    So there is no need to worry about Vista yet.
  • Reply 7 of 75
    yama Posts: 427 member
    Quote:
    Originally Posted by onlooker View Post


    But XP does not have DX10



    Vista/DirectX10 is not required to get all the high detail settings in Crysis working:



    http://arstechnica.com/journals/thum...sis-demo-on-xp



    http://kotaku.com/gaming/crysis/crys...-xp-316543.php
  • Reply 8 of 75
    yama Posts: 427 member
    Quote:
    Originally Posted by Name101 View Post


    At the moment, all games can run on XP.

    So there is no need to worry about Vista yet.



    Well, apart from Halo 2, which requires Vista for no good reason other than Microsoft hoping people would upgrade to Vista so they could play a three-year-old Xbox title.
  • Reply 9 of 75
    onlooker Posts: 5,252 member
    Please ask Apple to enable SLI in its drivers, and to sell an optional SLI bridge with the 8800 GT upgrade kit. Even if you're not interested in SLI, it's important that Mac users have the same benefits that PC users have available to them.

    If Apple sees that graphics are important to us, there is also a chance of better graphics options in more machines than just the Mac Pro in the future.



    http://www.apple.com/feedback/macpro.html
  • Reply 10 of 75
    flounder Posts: 2,674 member
    Quote:
    Originally Posted by onlooker View Post


    Although I cannot figure out why they are calling it the 9800GX2. It really should be more like 8950GX2, because the GPU is not a new generation; it shouldn't be bumped to 9xxx.







    I always get a chuckle out of GPU naming schemes.



    I haven't had a clue since it got more complicated than "rage" and "rage pro"
  • Reply 11 of 75
    A Mac Pro wouldn't have a problem running Crysis in Vista.



    Hey, I don't understand why you can put more than one ATI (base model) GPU in the Mac Pro, whereas you can't with the other GPUs?



    Which is better: 2x or 4x ATI (I hope I read it correctly) compared to 1x 8800GT?
  • Reply 12 of 75
    Quote:
    Originally Posted by Flounder View Post






    I always get a chuckle out of GPU naming schemes.



    I haven't had a clue since it got more complicated than "rage" and "rage pro"



    It's confusing to say the least, but the schemes usually go like this:



    For ATI, they use a versioning model with increments of 1000. An example would be the Radeon 9000 series. They made the 9800, which was the fastest chipset of that generation, using the 800 to designate the highest model. So if the 9800 was the fastest of that chipset, the 9600 would be the mainstream card of the same chip architecture.



    Then, once they research and come up with a brand-new chip design, they go up 1000. And since 9 was the last one they could use on that numbering system, they went to X800, with the X300 being the budget card. Then X1800, with X1300 being the budget card.



    Currently they're on the X3000 series. So X3800 is the lower high end, X3600 is the mainstream, and X3200 is the budget.



    The increments of the hundreds indicate whether the product is high end, mainstream, or budget. These are almost always 900, 800, 600, and 300 when it comes to ATI: 800 is the lower high end, 600 is the mainstream, and 300 is the budget.



    The 9 you see in the X1900 indicates the ultra high end.





    For nVidia, it's similar, but not exactly the same.



    The competition for ATI's X series was the 6000 series; then for the X1000 it was the 7000.



    The newest Nvidia series is the 8000 series, which competes with the X2000 series. Nvidia hasn't released its competition to the X3000 series yet, but even now the 8800 is holding up against the newer X3800.



    As far as the whole GTX, GTS, XT labeling goes, that's a confusing mess that seems to change with every new release.
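The tier convention described above could be sketched as a tiny lookup (a toy illustration of the poster's scheme only, not anything from ATI: the leading digit or digits name the generation and the hundreds digit marks the tier):

```python
# Toy sketch of the ATI numbering scheme described in the post above:
# the hundreds digit marks the performance tier, everything before it
# names the generation. Not an official scheme, just an illustration.

TIERS = {9: "ultra high end", 8: "high end", 6: "mainstream", 3: "budget"}

def classify(model: str) -> tuple[str, str]:
    """Split a Radeon model like '9800', 'X800', or 'X1300' into
    (generation, tier) per the scheme above."""
    digits = model.upper().lstrip("X")
    generation = digits[:-3] or "X"   # 'X800' has no digit before the hundreds
    tier = TIERS.get(int(digits[-3]), "unknown")
    return generation, tier
```

So `classify("9800")` gives the 9000-series flagship tier, while `classify("X1300")` lands in the budget tier, matching the 9800/9600/X300 examples in the post.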
  • Reply 13 of 75
    onlooker Posts: 5,252 member
    Quote:
    Originally Posted by Synthetic Frost View Post


    [...]



    The newest Nvidia series is the 8000 series, which competes with the X2000 series. Nvidia hasn't released its competition to the X3000 series yet, but even now the 8800 is holding up against the newer X3800.



    As far as the whole GTX, GTS, XT labeling goes, that's a confusing mess that seems to change with every new release.



    Holding up? It's outperforming it.



    Anyway, I think you mean GTX, GTS, GT. One would think it would go fast: GT, faster: GTS, fastest: GTX, but it's not like that.
  • Reply 14 of 75
    Quote:
    Originally Posted by wheelhot View Post


    A Mac Pro wouldn't have a problem running Crysis in Vista.



    Hey, I don't understand why you can put more than one ATI (base model) GPU in the Mac Pro, whereas you can't with the other GPUs?



    Which is better: 2x or 4x ATI (I hope I read it correctly) compared to 1x 8800GT?



    Putting multiple graphics cards in a Mac Pro serves only one purpose: to drive multiple monitors.



    Four ATI cards are just that: four ATI cards. It doesn't make them any more powerful.
  • Reply 15 of 75
    Quote:

    Four ATI cards are just that: four ATI cards. It doesn't make them any more powerful.



    But wouldn't 4 GPUs work faster at rendering games, for example, compared to 1 GPU? (Sorry for asking this question; I've always wondered how this more-than-one-GPU thing works.)
  • Reply 16 of 75
    programmer Posts: 3,458 member
    Quote:
    Originally Posted by wheelhot View Post


    But wouldn't 4 GPUs work faster at rendering games, for example, compared to 1 GPU? (Sorry for asking this question; I've always wondered how this more-than-one-GPU thing works.)



    The potential 4 GPUs in a Mac Pro are plugged into 4 different monitors, so each one produces the image for the display it is connected to. In an SLI configuration (of which the Mac has never had one, as far as I know), two GPUs divide up the pixels on a single screen and thus accomplish the task of drawing that one screen in potentially as little as half the time.



    Frankly, though, SLI configs are a tiny part of the PC market and aren't really worth the huge heat/power cost of having two GPUs. Depending on the application/game, they might not improve the framerate... and it's even possible that they make it worse in some unusual cases.
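A crude way to picture that pixel-splitting (a toy sketch in Python, nothing like real driver code) is to treat each GPU as a worker that shades only its own share of the scanlines, so two workers cover the same frame with half the rows each:

```python
# Toy model of SLI-style split-frame rendering: each "GPU" shades only
# its own interleaved share of the scanlines, so the frame's work is
# divided evenly across the workers.
from concurrent.futures import ThreadPoolExecutor

WIDTH, HEIGHT = 640, 480

def shade_rows(rows) -> int:
    # Stand-in for per-scanline shading work; returns pixels produced.
    return sum(WIDTH for _ in rows)

def render(num_gpus: int) -> int:
    # Interleave scanlines across the GPUs (classic scan-line interleave),
    # run the bands in parallel, and combine the results into one frame.
    bands = [range(i, HEIGHT, num_gpus) for i in range(num_gpus)]
    with ThreadPoolExecutor(max_workers=num_gpus) as pool:
        return sum(pool.map(shade_rows, bands))
```

Either way the full frame gets drawn, the same 640 x 480 pixels whether one worker or two does it; the win is that each only touches half the rows. The overheads mentioned in the post above (duplicated submission, heat/power) are exactly what this toy model leaves out.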
  • Reply 17 of 75
    Well, Apple should offer it or the new ATI card. See the latest PC websites for details. But the true non-die-shrink next gen will be the G100 and ATI R700?



    ETA June? In time for Nehalem Mac Pros? And new displays..?



    And a sexy case redesign?



    WWDC?



    We shall see.



    Lemon Bon Bon.
  • Reply 18 of 75
    onlooker Posts: 5,252 member
    Quote:
    Originally Posted by Programmer View Post


    The potential 4 GPUs in a Mac Pro are plugged into 4 different monitors, so each one produces the image for the display it is connected to. In an SLI configuration (of which the Mac has never had one, as far as I know), two GPUs divide up the pixels on a single screen and thus accomplish the task of drawing that one screen in potentially as little as half the time.



    Frankly, though, SLI configs are a tiny part of the PC market and aren't really worth the huge heat/power cost of having two GPUs. Depending on the application/game, they might not improve the framerate... and it's even possible that they make it worse in some unusual cases.



    It was rare even when SLI first started making its way into the mainstream, but I think all new games are optimized for SLI, so the chances of that ever happening again are slim. But the solution in those few games was simple: just turn off one GPU in your preferences.
  • Reply 19 of 75
    Quote:

    Apple should offer it or the new Ati Card



    Maybe. Let's hope ATI makes better drivers this time and won't repeat their past mistake (the sudden freezing issue).
  • Reply 20 of 75
    Quote:
    Originally Posted by onlooker View Post


    It was rare even when SLI first started making its way into the mainstream, but I think all new games are optimized for SLI, so the chances of that ever happening again are slim. But the solution in those few games was simple: just turn off one GPU in your preferences.



    LOL... all new games optimized for SLI? Only in nVidia's dreams.





    You're probably right that the SLI-all-on-one-card devices have likely eliminated the cases where it runs slower. It used to be that the CPU had to submit everything twice and transfer data across the bus twice, but now both GPUs likely read the same data.