Nvidia 9800GX2 - Damn it Apple!
It's not exactly breaking news, but it's news, and it pisses me off! A few days ago Tom's Hardware scored some exclusive pictures of the Nvidia 9800GX2, which is basically two Nvidia G92 GPUs SLI'd together on one card. Reports say it's about 30% faster than the 8800 Ultra. I don't even want to think about it, other than hoping Apple offers one before my Mac ships so I can change my order.
Comments
Although I cannot figure out why they are calling it the 9800GX2. It really should be more like 8950GX2, because the GPU is not a new generation and shouldn't be bumped to 9xxx.
http://uk.youtube.com/watch?v=OAYg8bzehPQ
Given that it took Apple more than a year to get the 8800 out, we might see support for this in 2009.
I wonder if GPUs will start to go the same way as CPUs: instead of making them faster, just put in more than one.
I'll probably have to get XP to play Crysis, because I hear it's dead weight in Vista.
If you're gonna play games you need XP. Vista runs games significantly worse.
If you're gonna play games you need XP. Vista runs games significantly worse.
But XP does not have DX10
so there is no need to worry about Vista yet.
But XP does not have DX10
Vista/DirectX10 is not required to get all the high detail settings in Crysis working:
http://arstechnica.com/journals/thum...sis-demo-on-xp
http://kotaku.com/gaming/crysis/crys...-xp-316543.php
At the moment, all games can run on XP.
so there is no need to worry about Vista yet.
Well, apart from Halo 2, which requires Vista for no good reason other than Microsoft hoping people would be willing to upgrade to Vista so they could play a three-year-old Xbox title.
If Apple sees that graphics are important to us there is also a chance of better graphics options in more machines than just the Mac Pro in the future.
http://www.apple.com/feedback/macpro.html
Although I cannot figure out why they are calling it the 9800GX2. It really should be more like 8950GX2, because the GPU is not a new generation and shouldn't be bumped to 9xxx.
I always get a chuckle out of GPU naming schemes.
I haven't had a clue since it got more complicated than "rage" and "rage pro"
Hey, I don't understand why you can put more than one ATI (base model) GPU in the Mac Pro, whereas you can't with the other GPUs?
Which is better: two or four ATI cards (I hope I read that correctly) compared to one 8800GT?
I always get a chuckle out of GPU naming schemes.
I haven't had a clue since it got more complicated than "rage" and "rage pro"
It's confusing to say the least, but the schemes usually go like this:
For ATI, they use a versioning model with increments of 1000. An example would be the Radeon 9000 series. They made the 9800, which was the fastest chipset of that generation, using the 800 to designate the highest model. So if the 9800 was the fastest of that chipset, the 9600 would be the mainstream card of the same chip architecture.
Then, once they come up with a brand new chip design, they go up 1000. And since 9 was the last digit they could use in that numbering system, they went to X800, with the X300 being the budget card. Then X1800, with X1300 being the budget card.
Currently they're on the X3000 series, so the X3800 is the lower high end, the X3600 is the mainstream card, and the X3200 is the budget card.
The hundreds digit indicates whether the product is high end, mainstream, or budget. For ATI these are almost always 900, 800, 600, and 300: 800 is the lower high end, 600 is the mainstream, and 300 is the budget.
The 9 you see in the X1900 indicates the ultra high end.
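A hypothetical sketch of that decoding in Python (the tier names and the digit-to-tier table follow the description above and are my own illustration, not an official ATI scheme):

```python
# Illustrative sketch: decompose an ATI Radeon model number like "X1900"
# into a generation and a market tier. The tier table follows the post's
# description (9 ultra high end, 8 high end, 6 mainstream, 3 budget);
# it is an assumption for illustration, not ATI's documented scheme.

TIERS = {9: "ultra high end", 8: "high end", 6: "mainstream", 3: "budget"}

def decode(model: str) -> tuple[str, str]:
    digits = model.lstrip("X")          # "X1900" -> "1900", "9800" -> "9800"
    generation = digits[:-3] or "0"     # the thousands part names the generation
    tier = TIERS.get(int(digits[-3]), "unknown")
    return generation, tier

print(decode("9800"))   # ('9', 'high end')
print(decode("X1900"))  # ('1', 'ultra high end')
print(decode("X1300"))  # ('1', 'budget')
```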
For nVidia, it's similar, but not exactly the same.
The competition for ATI's X100 series was the 6000 series; then for the X1000 series it was the 7000 series.
The newest Nvidia series is the 8000 series, which competes with the X2000 series. Nvidia hasn't released its competition to the X3000 series yet, but even now the 8800 is holding up against the newer X3800.
As far as the whole GTX, GTS, XT, labeling goes, that's a confusing mess that seems to change with every new release.
The newest Nvidia series is the 8000 series, which competes with the X2000 series. Nvidia hasn't released its competition to the X3000 series yet, but even now the 8800 is holding up against the newer X3800.
As far as the whole GTX, GTS, XT, labeling goes, that's a confusing mess that seems to change with every new release.
Holding up? It's outperforming it.
Anyway, I think you mean GTX, GTS, GT. One would think it would go fast: GT, faster: GTS, fastest: GTX, but it's not like that.
A Mac Pro wouldn't have a problem running Crysis in Vista.
Hey, I don't understand why you can put more than one ATI (base model) GPU in the Mac Pro, whereas you can't with the other GPUs?
Which is better: two or four ATI cards (I hope I read that correctly) compared to one 8800GT?
Putting multiple graphics cards in a Mac Pro serves only one purpose: to drive multiple monitors.
Four ATI cards is just that, four ATI cards. It doesn't make them any more powerful.
Four ATI cards is just that, four ATI cards. It doesn't make them any more powerful.
But wouldn't 4 GPUs work faster at rendering games, for example, compared to 1 GPU? (Sorry for asking this question, I've always wondered how this more-than-one-GPU thing works.)
But wouldn't 4 GPUs work faster at rendering games, for example, compared to 1 GPU? (Sorry for asking this question, I've always wondered how this more-than-one-GPU thing works.)
The potential 4 GPUs in a Mac Pro are plugged into 4 different monitors, so each one produces the image for the display it is connected to. In an SLI configuration (of which the Mac has never had one, as far as I know) two GPUs divide up the pixels on a single screen and thus accomplish the task of drawing that one screen in potentially as little as half the time.
Frankly, though, SLI configs are a tiny part of the PC market and aren't really worth the huge heat/power cost of having two GPUs. Depending on the application/game they might not improve the framerate... and it's even possible that they make it worse in some unusual cases.
ETA June? In time for Nehalem Mac Pros? And new displays..?
And a sexy case redesign?
WWDC?
We shall see.
Lemon Bon Bon.
The potential 4 GPUs in a MacPro are plugged into 4 different monitors, so each one produces the image for the display it is connected to. In the SLI configuration (of which the Mac has never had one as far as I know) two GPUs divide up the pixels on a single screen and thus accomplish the task of drawing that one screen in potentially as little as half the time.
Frankly though, the SLI configs are a tiny part of the PC market and aren't really worth the huge heat/power cost of having two GPUs. Depending on the application/game they might not improve the framerate... and its even possible that they make it worse in some unusual cases.
It was rare even when SLI first started making its way into the mainstream, but I think all new games are optimized for SLI now, so the chances of that ever happening again are slim. But the solution in those few games was simple: just turn off one GPU in your preferences.
Apple should offer it or the new ATI card.
Maybe. Let's hope ATI makes better drivers this time and doesn't repeat its past mistake (the sudden freezing issue).
It was rare even when SLI first started making its way into the mainstream, but I think all new games are optimized for SLI now, so the chances of that ever happening again are slim. But the solution in those few games was simple: just turn off one GPU in your preferences.
LOL... all new games optimized for SLI? Only in nVidia's dreams.
You're probably right that the SLI-all-on-one-card devices have likely eliminated the cases where it runs slower. It used to be that the CPU had to submit everything twice and transfer data across the bus twice, but now both GPUs likely read the same data.