Next-gen Mac graphics hardware roundup
AMD, Intel, and NVIDIA have all launched a new generation of video hardware in the past several days, some of which will undoubtedly make its way into future Macs. Here's a brief run-through of the chipsets and what they bring to the table.
The completed update means that system builders -- including Apple -- have the theoretical go-ahead to overhaul the graphics processor (GPU) choices in their desktops and laptops, regardless of the design or the price point.
A common thread for all three companies' GPUs is a unified architecture for shaders, the small programs that render special effects in 3D software. Until the recent announcements, most 3D graphics were drawn with separate pixel (texturing) and vertex (3D geometry) shader units, a fixed split that limited what developers could achieve. Unifying the hardware that accelerates this code makes for more flexible, and typically better, visual effects in games and in professional modeling software.
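To make the benefit concrete, here is a minimal, purely illustrative C sketch -- not real driver code, and the unit counts and per-frame job numbers are invented -- that models a frame's worth of vertex and pixel work on a fixed 4/12 split of shader units versus a single unified pool of 16:

    /* Toy model of fixed vs. unified shader hardware (illustrative only). */
    #include <stdio.h>

    #define UNITS 16

    /* One frame's work, in arbitrary "jobs" (assumed values). */
    static const int vertex_jobs = 40;   /* light geometry load  */
    static const int pixel_jobs  = 200;  /* heavy pixel shading  */

    /* Fixed pools run in parallel, so the slower pool sets the frame time. */
    static int passes_fixed(int vunits, int punits) {
        int vpasses = (vertex_jobs + vunits - 1) / vunits;
        int ppasses = (pixel_jobs + punits - 1) / punits;
        return vpasses > ppasses ? vpasses : ppasses;
    }

    /* A unified pool hands each idle unit whatever job comes next. */
    static int passes_unified(int units) {
        return (vertex_jobs + pixel_jobs + units - 1) / units;
    }

    int main(void) {
        printf("fixed 4+12 split : %d passes\n", passes_fixed(4, 12));
        printf("unified %d units : %d passes\n", UNITS, passes_unified(UNITS));
        return 0;
    }

With this pixel-heavy frame, the fixed split takes 17 passes because the pixel units become the bottleneck while vertex units sit idle; the unified pool finishes the same work in 15. The numbers are arbitrary, but the load-balancing effect is the point.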
Beyond this development, virtually every chip also offers explicit decoding for HD-capable video formats, such as H.264 (used in iTunes, Quicktime, and Blu-Ray movies) or VC-1 (used for HD DVDs). With supporting software, the chips can offload most, if not all, of the workload from the main processor -- a critical feature for portables, which often don't have either the performance or battery life to decode very large HD videos using CPU power alone.
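A rough sense of why that offload matters comes from raw pixel throughput alone; the short C sketch below compares nominal DVD, 720p, and 1080p pixel rates (the frame sizes and frame rates are simplifying assumptions, and real decode cost also depends on bitrate and codec features, so treat this as back-of-the-envelope only):

    /* Back-of-the-envelope pixel rates: why HD decode strains a laptop CPU. */
    #include <stdio.h>

    int main(void) {
        const long dvd    = 720L  * 480  * 30; /* DVD:   720x480 at ~30 fps  */
        const long hd720  = 1280L * 720  * 24; /* 720p:  1280x720 at 24 fps  */
        const long hd1080 = 1920L * 1080 * 24; /* 1080p: 1920x1080 at 24 fps */

        printf("DVD   : %ld pixels/s\n", dvd);
        printf("720p  : %ld pixels/s (%.1fx DVD)\n", hd720,  (double)hd720  / dvd);
        printf("1080p : %ld pixels/s (%.1fx DVD)\n", hd1080, (double)hd1080 / dvd);
        return 0;
    }

By this crude measure, 1080p pushes roughly five times the pixels of a DVD before any of H.264's heavier entropy decoding and motion compensation is counted, which is why a dedicated decode block on the GPU is such a welcome addition for laptops.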
It's unclear just how much of this technology Apple will choose to implement as its various Mac models receive updates in the coming months. The company has become increasingly dependent on 3D hardware for driving the Quartz Extreme visual layer in Mac OS X as well as pro apps such as Color and Motion in Final Cut Studio 2. However, the company also has a history of cautious or sometimes nonexistent support for card-specific features, having for years left out hardware video processing that had long since been available under Windows.
No matter what Apple may put into its drivers, though, the company is bound to gain a major increase in speed using some of the new hardware, highlights of which are featured below.
ATI Radeon HD and Mobility Radeon HD
Perhaps the largest revamp played out on Monday, when AMD launched its ATI Radeon HD series. Most of the line relies on a unified shader architecture, including the desktop parts -- a step NVIDIA had already taken with its own desktop cards as early as November.
The top card is the Radeon HD 2900 XT, which some readers may recognize as a renaming of the X2800 that surfaced in February. Besides being the fastest 3D chip in AMD's collection, with 512MB of RAM, the 2900 is also a technology showcase, able to pass 5.1 surround audio through an HDMI output. Apple still uses its predecessor, the X1900 XT, as a build-to-order option for the Mac Pro workstation.
ATI Radeon HD 2900 XT
A step down, the Radeon HD 2600 keeps the features but cuts back the speed and halves the memory to 256MB. AMD says the card is a better fit for smaller computers, and slower models will run without a fan. Where the 2900 consumes 215W of power and fills two graphics card slots, the 2600 is half the size and needs only 45W to run properly. Its closest parallel in today's Macs is the X1600, which has been used in 17- and 20-inch iMacs ever since their transition to Intel processors in January 2006.
AMD's portable graphics, the Mobility Radeon, also saw the jump to an HD version and has many of the same feature updates but with the lower performance and better energy management needed in the smaller enclosures. Only one chipset, the Mobility Radeon HD 2600, has a direct ancestor in the Mac. Apple currently uses the Mobility Radeon X1600 as the sole dedicated mobile graphics choice for its MacBook Pro laptops.
Both desktop and laptop lines also have lower-end HD 2300 and 2400 versions that act as basic improvements over normally slow integrated graphics. The Xeon-based Xserve currently uses the earlier desktop Radeon X1300.
NVIDIA GeForce 8, 8M series
NVIDIA, ATI's frequent challenger before the AMD merger, has rolled out its line more gradually: its first unified shader cards appeared much earlier, in November of last year, and were followed by mid-range cards this spring.
The newest and most relevant GPUs from the Santa Clara-based firm are the GeForce 8500 and 8600. Essentially pared-down versions of the GeForce 8800, which has so far gone unused in Macs, they roughly equate to AMD's mid-range Radeon HD 2600, with the main difference between the 8500 and 8600 being the number of shader processors. The 8500 is effectively a direct substitute for the GeForce 7300 used in the 24-inch iMac and as the stock video selection for the Mac Pro; the faster 8600 is built as a direct replacement for the GeForce 7600, which Apple offers only as an option in the 24-inch iMac.
NVIDIA GeForce 8600
Like AMD, NVIDIA also has a mobile equivalent, the GeForce 8M. It too brings unified shaders to notebooks and is split into high-performance (8600M) and budget (8400M) versions that differ chiefly in their number of shader processors. Unlike with AMD, however, Apple's use of mobile GeForce parts has been nearly absent since the 12-inch PowerBook G4 was retired in 2006. The only portable NVIDIA chip known to exist in shipping Apple hardware, in fact, is the GeForce Go 7300 used inside the Apple TV.
Intel X3100
If Apple continues its current practice of relying on integrated graphics for the MacBook, the notebook will still see a graphics upgrade if and when it moves to Intel's Santa Rosa mobile platform.
The GMA X3100 represents what many consider a much-needed upgrade for a part that has often been labeled slow. It brings full hardware transform and lighting for 3D geometry -- a core feature present in some home-user video cards since 2001, but absent from Intel's hardware (including the just-replaced GMA 950) until this month.
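As a rough illustration of what "transform and lighting" covers, the C sketch below runs the per-vertex math -- a matrix transform followed by a simple diffuse (Lambert) lighting term -- that T&L hardware handles so the CPU doesn't have to. The matrix and vectors are arbitrary example values, not anything specific to Intel's part:

    /* One vertex's worth of fixed-function transform and lighting. */
    #include <stdio.h>

    typedef struct { float x, y, z; } vec3;

    /* Apply a row-major 4x4 transform to a point (w assumed to be 1). */
    static vec3 transform(const float m[16], vec3 p) {
        vec3 r = {
            m[0]*p.x + m[1]*p.y + m[2]*p.z  + m[3],
            m[4]*p.x + m[5]*p.y + m[6]*p.z  + m[7],
            m[8]*p.x + m[9]*p.y + m[10]*p.z + m[11],
        };
        return r;
    }

    /* Diffuse (Lambert) lighting: intensity = max(0, N . L). */
    static float diffuse(vec3 n, vec3 l) {
        float d = n.x*l.x + n.y*l.y + n.z*l.z;
        return d > 0.0f ? d : 0.0f;
    }

    int main(void) {
        const float model[16] = { 1, 0, 0, 2,   /* translate +2 along x */
                                  0, 1, 0, 0,
                                  0, 0, 1, 0,
                                  0, 0, 0, 1 };
        vec3 v      = { 1.0f, 0.0f, 0.0f };
        vec3 normal = { 0.0f, 0.0f, 1.0f };
        vec3 light  = { 0.0f, 0.0f, 1.0f }; /* aimed straight at the surface */

        vec3 world = transform(model, v);
        printf("transformed vertex: (%.1f, %.1f, %.1f)\n", world.x, world.y, world.z);
        printf("diffuse intensity : %.2f\n", diffuse(normal, light));
        return 0;
    }

Multiply that handful of operations by hundreds of thousands of vertices per frame and the value of running it on dedicated hardware rather than the MacBook's CPU becomes clear.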
In addition to the unified shader model and HD video acceleration, the X3100 is also the first Intel part to offer native HDMI output.
Comments
I saw the elgato USB hardware h.264 but from everything I've seen the MAX output is 800x600, just short of the 852x480 (iirc) resolution needed to transcode DVD content as faithfully as possible.
Anyone following this arena?
D
^ Ha! I had a cat named Booga.
I must admit that I am always confused by GPUs. I don't know what's better or why, I can't tell from the specs and I don't follow this like some folks (gamers in particular) do. So just tell me: what would be the best card for the MBP if you were designing the next model, and how does it stack up to the current one? I always hear complaining about the graphics options in Apple's "Pro" series computers but I wish I knew what it was all about...
The Geforce 8600m series looks really impressive.
I'm really hoping for an 8800 option from apple on the mac pro, which uses much less power than the x2 series from ati. Just a choice between the 2 would be nice for once.
All I want are Mac drivers for my SLi 8800GTXXX cards.
a new mac pro with a new chip set or a desktop mac with 2 pci x16 slots and a sli chip set is needed.
Sounds to me like GPU makers need to go the other way and maybe include more GPU units in a card with more pipes but running a much lower speeds.
The only Mac with a real nice GPU is the Mac Pro, desktop replacement laptops lag far behind in my opinion, yet those laptop users need just as much from a GPU.
I am more than happy to lose expandability and large storage for the mobility, but why should I give up on raw power? Why should my rendering be much lower? Why should the GPU performance get me killed in the game?
I don't have time to read that so is there anything in the way of getting even a 128MB graphics card in the macbook? I mean when apple update the macbook will there be any chance of having a graphics card in it, anything better than the current, integrated 64MB...
as you don't have time to read, I'll make it short ..... NO.
This is crazy. That is a lot of juice and a lot of heat.
Not to mention that all the reviews that I have read of these cards say that they are loud, loud, loud...
We need a smiley with earplugs.
How do these compare to NVidia's Quadro FX Mobile or the ATI Mobility FireGL series?
Different types of cards for different uses.
The Mac Pro is still behind in many ways and Graphics is the #1 area that needs attention.
dell and other vendors have so many choices, both when you build a desktop and a laptop: e.g. the 256MB NVIDIA GeForce Go 7900 GS for laptops and the 768MB NVIDIA 8800 GTX for the desktops. those have been available for order ever since they were released.
this article is pretty pointless since apple is always behind other PC vendors when it comes to upgrades. you can read the same stuff @ tomshardware or any other hardware website
But we should still keep up on it, since Apple does update graphics... every once in a while... Perhaps the MBP will get an upgrade since they're about to refresh it. And with the recent updates on the MB, and rumored updates on the iMac, I think they might try and separate their pro machines from the others a bit with such an upgrade... I hope. <_<
Both H.264 and VC-1 codecs are available formats in both Blu-Ray and HD DVD. VC-1 is obviously also used in Windows Media Player, since it's just a formalization of that codec.
So the statement "virtually every chip also offers explicit decoding for HD-capable video formats, such as H.264 (used in iTunes, Quicktime, and Blu-Ray movies) or VC-1 (used for HD DVDs)." should more accurately read, "virtually every chip also offers explicit decoding for HD-capable video formats, such as H.264 (used in iTunes, Quicktime, Blu-Ray, and HD DVD movies) or VC-1 (used in Windows Media Player, Blu-Ray, and HD DVDs)."
Just logged for that reason only: WTF with that incorrect info they put on there? Is there some kind of bias against HD-DVD here, or what? I'm extremely confused. Why would they put that info wrong? And why don't they correct it?
As a matter of fact the AI site owners are indeed the founders of the Blu-Ray corporation and Kasper is indeed the author of the patent behind Blu-Ray.... And the members of this site (minus 1) have pooled all of their liquid assets to short any stock associated with HD-DVD...
So? What's yer point?
D
Ok, maybe it sounded a little bit exaggerated, but hey, Apple is going Blu Ray so maybe there's some kind of bias pro-blu ray in an apple news web site. It sounds totally logical in my point of view. Still don't know why they haven't corrected it :-(
Edit:
Well... going back to the subject. Answering some questions about performance: in DX10 terms it seems that ATi should be faster than the nVidia part, but it's still unproven (not enough tests can be done on DX10 yet to be sure about it). So, taking into consideration that this generation of nVidia video cards consumes a lot less power (needs smaller fans -> is less noisy -> Steve likes quiet computers), I'm pretty sure that Jobs will go for the nVidia video cards this time. Although ATi has another premium: it has a built-in sound processing chip on the video card, so you can have HDMI carrying both video AND audio signals (like it's supposed to), unlike today's video cards that only transmit video.
I was just havin some fun... no harm/ill-will was intended.
As for this site being pro-BR simply because of Apple backing BR... I wouldn't go that far... We have a pretty heated thread that seems to have a fair number of proponents on both sides of the Hi-Def DVD fence...
http://forums.appleinsider.com/showthread.php?t=69710
Cheers!
Dave