[QUOTE]Originally posted by iMacfan
But surely a 9800 is better than a 7900?!?!? ... Man, these stupid names make even the 'Model year' sound sensible... David[/QUOTE]
GPUs have an almost cult-like following in the computer gaming and hardware enthusiast scene. Placebo was being a smartass; it's useful to ignore him sometimes.
1. Intel Integrated Graphics is fine for almost any task except playing 3D games, regardless of whatever marketing terms and numbers Intel attaches to it. Puzzle and Flash-type games are no problem, but forget about any current, year-old, or upcoming 3D games, Windows or Mac.
2. nVidia GPUs. These are quite easy to understand. There is the 6 series (http://www.nvidia.com/page/geforce6.html) and the 7 series (http://www.nvidia.com/page/geforce7.html). The 7 series is newer than the 6 series. Within each series, the higher-numbered models are generally better: in the 6 series, a 6600 is better than a 6150. Further reading is required for the letters after the numbers, e.g. a 6600GT is better than a 6600, which is better than a 6600LE. The 7 series works the same way: a 7600 is better than a 7300, and a 7950 is better than a 7900, which is better than a 7600 or 7300. Making sense so far? Again, the letter designations matter too: a 7900GTX is better than a 7900GT.
3. nVidia GPUs, overlap between the 6 series and 7 series: note that a high-end 6 series card can outperform a low-end 7 series card. This requires further reading. Personally, for example, I think the nVidia 7300 is a total waste of money - it has some newer "features" but is not fast enough to make the most of them. AFAIK you are better off with a 6600GT.
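To make the pecking order in points 2 and 3 concrete, here's a minimal sketch in Python. The ordering is my informal reading of this post (including the 6600GT-beats-7300 overlap), not official benchmark data, and the exact placement of the 7300 relative to the plain 6600 is an assumption.

```python
# Rough GeForce pecking order, slowest to fastest, as described in the post.
# This is an illustrative ranking only - real results vary by game and settings.
NVIDIA_RANKING = [
    "6150", "6600LE", "7300",   # note the 7300 sits below a good 6 series card
    "6600", "6600GT",
    "7600", "7900GT", "7900GTX", "7950",
]

def faster(card_a, card_b):
    """Return whichever of the two cards ranks higher in the list above."""
    return max(card_a, card_b, key=NVIDIA_RANKING.index)

print(faster("6600GT", "7300"))    # 6600GT - high 6 series beats low 7 series
print(faster("7900GTX", "7900GT")) # 7900GTX - letter suffixes matter
```

The point of the list is just that series number alone doesn't settle it; the letters and the series overlap both matter.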
4. Okay, ATI GPUs. The old designations made sense in that higher numbers were better than lower numbers: a Radeon 9800 is better than a 9600, which is better than the now hopelessly outdated 9200. Actually, those old designations are all outdated now in general - don't even worry about them. ATI has moved on to the X1000 series designations. Similarly, an X1900 is better than an X1800, which is better than an X1600, which is better than an X1300. For a given number, further reading is again required: from the top down, for the X1900 we have the X1900 XTX, X1900 XT, and X1900 GT.
5. ATI vs nVidia. This is an area that DEFINITELY requires further reading to make sense of. For example, the ATI X1600 comes in around the nVidia 6600/6800 range. At the top end, the nVidia 7900 battles it out with the ATI X1900.
6. GPU clocking. Clockspeeds also matter within a given model of card. The iMac and MacBook Pro sport an ATI X1600 and X1600 Mobility respectively. Apple runs these at slower clockspeeds, which results in poorer scores in 3DMark05, a highly regarded GPU benchmarking tool, compared to an X1600 at normal factory settings.
7. Various sites such as tomshardware.com, anandtech.com, and GPU enthusiast sites and forums have further info on a whole range of benchmarks for different games, and on the quality of visuals you get with different cards. For example, a newer feature is HDR (High Dynamic Range) lighting, which is commonly discussed on recent GPU forums.
8. What Apple has chosen. The Mac mini and MacBook have Intel Integrated Graphics. The iMac and MacBook Pro have the ATI X1600 and X1600 Mobility as above. The Power Mac G5 comes with the nVidia 6600LE or 6600, which, ironically, perform roughly around the level of the X1600 in the iMac and MacBook Pro, depending on how you adjust the X1600's clockspeed when running Windows under Boot Camp. Thankfully, the Power Mac G5 has the option of an nVidia 7800GT, which is a pretty decent card: roughly 1.8x faster than the iMac Core Duo's X1600, or about 1.5x faster if the X1600 is clocked at factory settings under Boot Camp. These estimates are based on 3DMark05.
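The back-of-the-envelope arithmetic in point 8 works out like this - the ratios are the approximate 3DMark05 figures quoted above, not a fresh benchmark run:

```python
# Relative performance, using the underclocked iMac X1600 as the baseline (1.0).
# Approximate ratios from the post, for illustration only.
imac_x1600_underclocked = 1.0                 # iMac Core Duo's X1600 at Apple's clocks
g5_7800gt = 1.8 * imac_x1600_underclocked     # 7800GT is ~1.8x the underclocked X1600

# If the X1600 runs at factory clocks under Boot Camp, the gap narrows to ~1.5x,
# which implies the factory-clocked X1600 sits around 1.2x the underclocked one.
imac_x1600_factory = g5_7800gt / 1.5

print(round(g5_7800gt, 2))          # 1.8
print(round(imac_x1600_factory, 2)) # 1.2
```

So overclocking the X1600 back to factory settings recovers roughly 20% of the gap, but the 7800GT still comes out comfortably ahead.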
9. nVidia Quadro - these are not so much for gaming; they are targeted at people who do professional 3D graphics work.