
ATI X1600 GPU

post #1 of 34
Thread Starter 
So the new GPU in the Intel-based iMac and MacBook Pro is pretty speedy. I expect it is one of the reasons the new systems play HD video so well. I bet it also produces nice framerates for gamers.

But my question relates to the future availability of this graphics card, as well as the price of existing upgrades, namely the X800. I have a Rev A G5 tower with a 9800 Pro card and I'd consider upgrading to a more powerful card. I wonder if the X800 will drop in price, or if the X1600 will be available for older AGP-class machines. ATI also demoed the X1800 at Macworld, so I expect it will play into the mix too.

So what are your thoughts? And what's your preferred place to buy such upgrades?
post #2 of 34
ATI says something about support for AGP 8x here. I don't know, though, what this AGP-to-PCI-E external bridge chip is. But I have my doubts about actual support for older AGP Macs.
post #3 of 34
Thread Starter 
Quote:
Originally posted by PB
ATI says something about support for AGP 8x here. I don't know, though, what this AGP-to-PCI-E external bridge chip is. But I have my doubts about actual support for older AGP Macs.

Interesting... I wonder what that's all about?! And I agree, I expect AGP support will be dropped for all future cards, because nobody will stick a cutting-edge GPU on a motherboard with slow connectivity and a slow processor.

Which is why the X800 is probably a good enough upgrade for my G5 until I replace the machine; I just wish the X800 were cheaper.
post #4 of 34
Thread Starter 
Any reason I wouldn't want to get this OEM card from Newegg for $320? Otherwise the lowest price I've seen so far is about $375 from buy.com.
post #5 of 34
hi xool. i was going to strongly suggest the nvidia 7800GT 256mb (just $300+), it will whip the x800 pretty well off the screen in OpenGL stuff... then i realised it's PCI-E though. otherwise, for $300, that's some outstanding value.

it's a real pity the nvidia 6600GT 256mb ($150) is not really available for AGP Macs. good luck anyway...!
post #6 of 34
Quote:
Originally posted by Xool
So the new GPU in the Intel-based iMac and MacBook Pro is pretty speedy. I expect it is one of the reasons the new systems play HD video so well. I bet it also produces nice framerates for gamers.

But my question relates to the future availability of this graphics card, as well as the price of existing upgrades, namely the X800. I have a Rev A G5 tower with a 9800 Pro card and I'd consider upgrading to a more powerful card. I wonder if the X800 will drop in price, or if the X1600 will be available for older AGP-class machines. ATI also demoed the X1800 at Macworld, so I expect it will play into the mix too.

So what are your thoughts? And what's your preferred place to buy such upgrades?

yes, hardware (GPU) h.264 decoding might already be in place, so those ati cards are great for that.

i'd have to say though that the x1600 and x1800 actually do worse at OpenGL compared to similarly priced nvidia cards. given that macs use a lot of OpenGL and not DirectX, the x1600 is nice, but i have to point out this fact for awareness' sake.

what would be killer though is if the HARDWARE h.264 ENCODING in x1600/ x1800 is unleashed in the intel macs sometime this year. that would be awesome.
http://episteme.arstechnica.com/grou...r/870005927731
post #7 of 34
Quote:
Originally posted by Xool
Any reason I wouldn't want to get this OEM card from Newegg for $320? Otherwise the lowest price I've seen so far is about $375 from buy.com.

One word: wait. If you don't really need a fast card for advanced graphics stuff right now, just wait. And keep an eye on the prices and availability.
post #8 of 34
Thread Starter 
Quote:
Originally posted by PB
One word: wait. If you don't really need a fast card for advanced graphics stuff right now, just wait. And keep an eye on the prices and availability.

Yeah I know I should wait until the prices drop another round... doh.

Meanwhile I also have an ADC-based 20" Cinema Display and I eventually want to upgrade it too. If I get a MacBook Pro I might switch to a new 23", but I have the feeling that I can resell this display for more if I bundle it with the G5. No? Maybe not...

Although if I later replace the tower I'd probably want the latest screen then too if there's a new model. Argh!
post #9 of 34
VRAM 128 v 256?

Do you think the 256MB of VRAM offers a worthwhile improvement with this card over the 128MB? I'm not much of a gamer, but I am intrigued by the possibilities with Core Image.

Some considerations: the X1600 only has a 128-bit memory bus, not 256-bit (which sometimes keeps cards from really exploiting more VRAM); also -- and this is probably only a consideration for gaming -- it has only five vertex shaders for a 12-pipeline card, so once again a potential bottleneck that could keep more than 128MB of VRAM from being worthwhile. On the other hand, it's a new-generation chip with apparently fast GDDR3 memory.

If anyone can help me out here: I am trying to decide between the 1.67 GHz/128MB VRAM MacBook Pro and the 1.83 GHz/256MB. I think the upgrade to the 1.83 is a bit overpriced, but maybe worth it if the processor and VRAM improvements will add to the usability of the computer in a significant way.
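
On the bus-width point, here's a back-of-the-envelope peak-bandwidth calculation (the clock figure is an illustrative assumption, not a confirmed X1600 spec):

```python
# Peak memory bandwidth = (bus width in bytes) x (effective memory clock).
# The 1400 MHz effective GDDR3 figure is assumed for illustration only.

def peak_bandwidth_gbps(bus_bits, effective_clock_mhz):
    return (bus_bits / 8) * effective_clock_mhz * 1e6 / 1e9  # GB/s

narrow = peak_bandwidth_gbps(128, 1400)  # X1600-like: 128-bit bus
wide = peak_bandwidth_gbps(256, 1400)    # same memory on a 256-bit bus

print(f"128-bit bus: {narrow:.1f} GB/s")  # ~22.4 GB/s
print(f"256-bit bus: {wide:.1f} GB/s")    # ~44.8 GB/s -- double
```

A 256-bit card feeding the same GDDR3 gets twice the bandwidth, which is why a narrow bus can keep the extra VRAM from paying off.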
post #10 of 34
Quote:
Originally posted by PB
One word: wait. If you don't really need a fast card for advanced graphics stuff right now, just wait. And keep an eye on the prices and availability.

Thanks for that sensible note. I have been thinking about the x800 for my G5.
post #11 of 34
Quote:
Originally posted by NordicMan
Thanks for that sensible note. I have been thinking about the x800 for my G5.

that would be a sweet upgrade. but for USD $400+ ..!
post #12 of 34
Quote:
Originally posted by photoeditor
VRAM 128 v 256?

Do you think the 256MB of VRAM offers a worthwhile improvement with this card over the 128MB? I'm not much of a gamer, but I am intrigued by the possibilities with Core Image.

Some considerations: the X1600 only has a 128-bit memory bus, not 256-bit (which sometimes keeps cards from really exploiting more VRAM); also -- and this is probably only a consideration for gaming -- it has only five vertex shaders for a 12-pipeline card, so once again a potential bottleneck that could keep more than 128MB of VRAM from being worthwhile. On the other hand, it's a new-generation chip with apparently fast GDDR3 memory.

If anyone can help me out here: I am trying to decide between the 1.67 GHz/128MB VRAM MacBook Pro and the 1.83 GHz/256MB. I think the upgrade to the 1.83 is a bit overpriced, but maybe worth it if the processor and VRAM improvements will add to the usability of the computer in a significant way.


you have to mention to us what apps you plan to use your macbook pro for most of the time, some of the time, once in a while.... that way discussing 1.67/128 vs 1.83/256 will make more sense

also on that note one could say that the 1.83GHz core duo is clocked 10% faster than the 1.67GHz core duo. but could one also say that the 1.83GHz core duo is 20% faster than the 1.67GHz core duo since each core is 10% faster than the 1.67???!! (yes, only for apps which make full use of both cores, etc, etc.)
post #13 of 34
On the applications, I am not much of a gamer.

Where I really expect to use it is with Core Image. I have not sprung for Aperture so far, but I have high hopes that Apple will get it right well within the lifetime of this computer. Likewise, iPhoto looks like it could be useful to me now. More generally, I tend to work with a lot of windows open at once, and that includes working with photographs, of course, and that really does bog down what I have now (Radeon 9000 64MB).

On the other hand, I just can't see Adobe retooling Photoshop on the Mac for Core Image unless they can do something similar and without too much duplication of effort for DirectX 9 on the PC.

As regards Core Video, that one is less likely. I don't own a camcorder and don't expect to get one in the near future, and I certainly don't see myself as a candidate for Final Cut Studio or anything like that.

Finally, there is the question of dual display use. It is fairly likely that within the lifetime of the machine I would simply no longer use my desktop for anything except as a server and I would then use the laptop in a dual display setup with the main window on the monitor and the palettes on the laptop. But there is no chance of my getting the 30 inch display; 23 would be the largest possibility for that.
post #14 of 34
Quote:
Originally posted by photoeditor
More generally, I tend to work with a lot of windows open at once, and that includes working with photographs, of course, and that really does bog down what I have now (Radeon 9000 64MB).

On the other hand, I just can't see Adobe retooling Photoshop on the Mac for Core Image unless they can do something similar and without too much duplication of effort for DirectX 9 on the PC.

As regards Core Video, that one is less likely. I don't own a camcorder and don't expect to get one in the near future, and I certainly don't see myself as a candidate for Final Cut Studio or anything like that.

Finally, there is the question of dual display use. It is fairly likely that within the lifetime of the machine I would simply no longer use my desktop for anything except as a server and I would then use the laptop in a dual display setup with the main window on the monitor and the palettes on the laptop. But there is no chance of my getting the 30 inch display; 23 would be the largest possibility for that.

It seems to me that you would be better off with the 256 MB GPU, especially in the long term. I see, though, that there is no option to upgrade the 1.67 GHz model from 128 to 256 MB, so you will need to go with the considerably more expensive 1.83 GHz model.
post #15 of 34
Quote:
Originally posted by sunilraman
you have to mention to us what apps you plan to use your macbook pro for most of the time, some of the time, once in a while.... that way discussing 1.67/128 vs 1.83/256 will make more sense

also on that note one could say that the 1.83GHz core duo is clocked 10% faster than the 1.67GHz core duo. but could one also say that the 1.83GHz core duo is 20% faster than the 1.67GHz core duo since each core is 10% faster than the 1.67???!! (yes, only for apps which make full use of both cores, etc, etc.)

This reminds me of a story I read today about someone buying kegs of beer that were 10% off. He bought two, and the clerk multiplied 10% by 2 and gave him 20% off... of course this makes you wonder what would have happened if he had bought ten kegs.

Even if you're going to include both cores, it's still 10%, not 20%.
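
To put actual numbers on it (plain arithmetic, nothing Mac-specific):

```python
# Per-core and whole-chip speedup are the same ratio: doubling the
# cores doubles both sides of the comparison, so the 2s cancel out.
slow, fast = 1.67, 1.83  # GHz per core

per_core = fast / slow                 # ~1.096
whole_chip = (2 * fast) / (2 * slow)   # still ~1.096

print(f"per core:   +{per_core - 1:.1%}")    # +9.6%
print(f"whole chip: +{whole_chip - 1:.1%}")  # +9.6%
```

So it's about 9.6% either way, kegs or cores.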
post #16 of 34
Thread Starter 
Quote:
Originally posted by PB
It seems to me that you would be better off with the 256 MB GPU, especially in the long term. I see, though, that there is no option to upgrade the 1.67 GHz model from 128 to 256 MB, so you will need to go with the considerably more expensive 1.83 GHz model.

Agreed. Although if you're a user who so deeply needs that 256 MB of VRAM compared to the 128 MB, I'd think you'd value the 10% CPU boost as well. Don't forget the 120 GB HD upgrade!
post #17 of 34
Quote:
Originally posted by sunilraman
yes, hardware (GPU) h.264 decoding might already be in place, so those ati cards are great for that.

i'd have to say though that the x1600 and x1800 actually do worse at OpenGL compared to similarly priced nvidia cards. given that macs use a lot of OpenGL and not DirectX, the x1600 is nice, but i have to point out this fact for awareness' sake.

what would be killer though is if the HARDWARE h.264 ENCODING in x1600/ x1800 is unleashed in the intel macs sometime this year. that would be awesome.
http://episteme.arstechnica.com/grou...r/870005927731

From what I've read on the PC review sites, all X1x00 GPUs from ATI are able to encode h.264 via hardware. It has to be activated by the driver, though. The first ATI driver that activated this feature was Catalyst 5.12, which during benchmarking showed about 50% less CPU usage compared with the older driver when encoding h.264.

Supposedly, Nvidia claims that their current-generation GPUs can also do hardware h.264 encoding with a future driver release. So, if you already own Nvidia 6x00 or 7x00 video cards, it might be worth holding on to them.

At least, all the information above is for the PC side. I'm not sure when these vendors or Apple will make full use of the GPU for h.264, but it could happen with the next driver update.
post #18 of 34
Quote:
Originally posted by bitemymac
From what I've read on the PC review sites, all X1x00 GPUs from ATI are able to encode h.264 via hardware. It has to be activated by the driver, though. The first ATI driver that activated this feature was Catalyst 5.12, which during benchmarking showed about 50% less CPU usage compared with the older driver when encoding h.264.

The 5.13 driver enabled hardware decoding of h.264. Hardware-accelerated encoding isn't active yet. They did release a utility that is significantly more optimised than many encoders out there, but that's a software optimisation.
"When I was a kid, my favourite relative was Uncle Caveman. After school, wed all go play in his cave, and every once and awhile, hed eat one of us. It wasnt until later that I discovered Uncle...
Reply
"When I was a kid, my favourite relative was Uncle Caveman. After school, wed all go play in his cave, and every once and awhile, hed eat one of us. It wasnt until later that I discovered Uncle...
Reply
post #19 of 34
Quote:
Originally posted by Duckspeak
This reminds me of a story I read today about someone buying kegs of beer that were 10% off. He bought two, and the clerk multiplied 10% by 2 and gave him 20% off... of course this makes you wonder what would have happened if he had bought ten kegs.

Even if you're going to include both cores, it's still 10%, not 20%.

heh. my mistake. looks like 10% it is on the whole.
post #20 of 34
Quote:
Originally posted by photoeditor
VRAM 128 v 256?

Do you think the 256MB of VRAM offers a worthwhile improvement with this card over the 128MB? I'm not much of a gamer, but I am intrigued by the possibilities with Core Image.

Some considerations: the X1600 only has a 128-bit memory bus, not 256-bit (which sometimes keeps cards from really exploiting more VRAM); also -- and this is probably only a consideration for gaming -- it has only five vertex shaders for a 12-pipeline card, so once again a potential bottleneck that could keep more than 128MB of VRAM from being worthwhile. On the other hand, it's a new-generation chip with apparently fast GDDR3 memory.

If anyone can help me out here: I am trying to decide between the 1.67 GHz/128MB VRAM MacBook Pro and the 1.83 GHz/256MB. I think the upgrade to the 1.83 is a bit overpriced, but maybe worth it if the processor and VRAM improvements will add to the usability of the computer in a significant way.

having more ram means never having to say you're sorry =)
post #21 of 34
Thread Starter 
Quote:
Originally posted by iggypopped
having more ram means never having to say you're sorry =)

"You need more RAM" is the only thing my Grandma knows.
post #22 of 34
For the record, 10.4.4 on the new Intel Macs does use hardware-assisted h.264 decoding. No word on whether encoding is enabled yet, but no doubt it will be.
post #23 of 34
Thread Starter 
And now the X1900 is introduced...

Hopefully the switch to Intel will improve the availability of future Mac graphics card upgrades. Being able to use standard PC cards would be awesome.

At any rate, I hope the X1600 or something like it is released for Macs. Either it will push X800 prices down, or it will kick enough ass that it's worth paying for.

I feel dirty paying $400 for the X800.

However, since my G5 has AGP, my future upgrade options are starting to look limited. Good thing the machine is 2.5 years old.
post #24 of 34
Hardware ENCODING isn't going to happen on GPUs. Encoding is so branch-dependent that it'd likely be slower on the GPU than on the CPU. GPUs do a small set of tasks very well, and that's it.

A separate H.264 encoding card for people working with video? Maybe.
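
To illustrate the branch problem, here's a toy model of SIMD-style execution (heavily simplified compared to real GPU hardware):

```python
# GPUs run threads in lockstep groups. When threads in a group take
# different sides of a branch, the hardware executes BOTH paths and
# masks out inactive threads -- so divergent code pays for every path.

def group_cost(branch_taken, cost_if=10, cost_else=10):
    """Cycles for one if/else across a lockstep group of threads."""
    runs_if = any(branch_taken)
    runs_else = not all(branch_taken)
    return cost_if * runs_if + cost_else * runs_else

uniform = [True] * 32                        # all threads agree
divergent = [i % 2 == 0 for i in range(32)]  # threads disagree

print(group_cost(uniform))    # 10: only one path executes
print(group_cost(divergent))  # 20: both paths execute
```

Encoders make per-block decisions constantly, so that penalty adds up fast.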
post #25 of 34
Quote:
Originally posted by gregmightdothat
Hardware ENCODING isn't going to happen on GPUs. Encoding is so branch-dependent that it'd likely be slower on the GPU than on the CPU. GPUs do a small set of tasks very well, and that's it.

A separate H.264 encoding card for people working with video? Maybe.

So it won't happen in hardware, but you want a separate hardware card for it? Can you follow that logic? Because I can't.

ATI has said GPU-assisted encoding is coming and is supported by current hardware, and I am inclined to believe them. If branches were too large a problem, then the NetBurst cores would be dreadful at it.
"When I was a kid, my favourite relative was Uncle Caveman. After school, wed all go play in his cave, and every once and awhile, hed eat one of us. It wasnt until later that I discovered Uncle...
Reply
"When I was a kid, my favourite relative was Uncle Caveman. After school, wed all go play in his cave, and every once and awhile, hed eat one of us. It wasnt until later that I discovered Uncle...
Reply
post #26 of 34
hey, has anyone tried flashing a 6600GT to use in the pci-express powermac g5s? just curious, since 6600s and 6600LEs are used in pci-express powermac g5s, if i am not mistaken.
post #27 of 34
okay, while on the topic, i realised my 6600gt can do shader model 3.0 and HDR. but someone mentioned that it can't do HDR and antialiasing at the same time? or something like that? anyway, got 1753 3dmark06 points with my pc rig. sweet

edit: some discussions here
http://www.beyond3d.com/forum/showthread.php?t=26960
http://www.hardforum.com/showthread....oto=nextoldest

it seems that doing "true" HDR (high dynamic range) lighting (the FP16 kind) + FSAA (full screen antialiasing) at the same time is not possible on the nVidia 6600GT, or even the 7-series cards.

it seems like a complex issue though because i definitely get what looks like HDR and some antialiasing on my 6600GT. but then apparently there's MSAA and SSAA (multisample and supersample antialiasing) and what not. phew. brain in knots right now.

can some l337 haXX0r translate the following into english?

"Nvidia can do already HDR with AA, their ROPs are not limited to AA function. AA can be done in PS as well (Nalu Demo!). So if the devs use this "path" with their FP16 HDR, then the NV40/G70 will have HDR+AA. When a dev uses just the "quick way" thru OpenExr then AA isn´t possible on NV40/G70. I personally don´t think it should be all that hard to fully enable OpenExr with AA, but we will see that when the G71 will be out. I bet the final specs and features will be released, I mean leaked around the end of Feb, if not earlier."
post #28 of 34
Quote:
Originally posted by sunilraman
"Nvidia can do already HDR with AA, their ROPs are not limited to AA function. AA can be done in PS as well (Nalu Demo!). So if the devs use this "path" with their FP16 HDR, then the NV40/G70 will have HDR+AA. When a dev uses just the "quick way" thru OpenExr then AA isn�t possible on NV40/G70. I personally don�t think it should be all that hard to fully enable OpenExr with AA, but we will see that when the G71 will be out. I bet the final specs and features will be released, I mean leaked around the end of Feb, if not earlier."

Lemme see if I can translate. HDR means, obviously, "hard drive," and AA stands for "amino acid." Me, I wasn't personally aware that biochemistry was extending into the data storage industry, but there it is, spelled out. ROP is RighteOus Points.

Everybody knows that PS stands for "post script," and Nalu Demo is the Cuban mathematician/poet who first coined this novel way of extending a letter beyond its logical stopping point. The next line I'm not sure about, but I think that a dev is some kind of personal grooming device, "path" is slang for "exfoliant," and NV40/G70 is a character from Star Wars.

The rest makes perfect sense given these interpretations.
post #29 of 34
Sorry, I was out late last night watching jazz...
post #30 of 34
I'm still trying to work out if watching jazz was a good thing or a bad thing for you
"When I was a kid, my favourite relative was Uncle Caveman. After school, wed all go play in his cave, and every once and awhile, hed eat one of us. It wasnt until later that I discovered Uncle...
Reply
"When I was a kid, my favourite relative was Uncle Caveman. After school, wed all go play in his cave, and every once and awhile, hed eat one of us. It wasnt until later that I discovered Uncle...
Reply
post #31 of 34
Quote:
Originally posted by gregmightdothat
Hardware ENCODING isn't going to happen on GPU's.

Apparently it is going to, at least partially. It hasn't been released yet, but it is stated to be coming (for the X1000-series GPUs). Encoding and transcoding. Of course, having chips that support it and having Mac drivers that actually use the feature are two different things.

Quote:
Originally posted by gregmightdothat

Encoding is so branch dependent that it'd likely be slower on the GPU than the CPU. GPU's do a small amount of tasks very well, and that's it.

It looks like ATI expects to be able to use the GPU to make the process substantially faster by taking things like in-loop deblocking, motion compensation and the inverse transform and shunting them to the GPU. Clever Canadians. It should be interesting; however, it's already late (going by their original launch estimates), so who knows when it'll really arrive.
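
Roughly, that split would look like this (a toy sketch with stand-in function bodies -- the names are hypothetical, not ATI's driver API): branch-heavy decisions stay on the CPU, while the regular, data-parallel reconstruction stages get shunted to the GPU.

```python
# Toy model of the CPU/GPU split: branchy decisions on the CPU,
# uniform per-block math (motion comp, inverse transform, deblocking)
# offloaded. All bodies are stand-ins, not real codec math.
import numpy as np

def cpu_choose_modes(frame, ref):
    return np.zeros_like(frame)        # branch-heavy decisions (stub)

def gpu_motion_comp(ref, modes):
    return ref                         # same op on every block (stub)

def gpu_inverse_transform(residual):
    return residual * 0.5              # stand-in for an IDCT-like pass

def gpu_deblock(img):
    return (img + np.roll(img, 1, 0)) / 2  # crude smoothing stand-in

def reconstruct_frame(frame, ref):
    modes = cpu_choose_modes(frame, ref)         # stays on the CPU
    pred = gpu_motion_comp(ref, modes)           # offloaded
    resid = gpu_inverse_transform(frame - pred)  # offloaded
    return gpu_deblock(pred + resid)             # offloaded

frame = np.random.rand(16, 16)
print(reconstruct_frame(frame, frame).shape)     # (16, 16)
```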
post #32 of 34
duckspeak did you have one too many beers at the jazz thing?

anyway, yeah, hardware-accelerated export to Quicktime h.264 on the mac if you have an x1000+ card would be sweet.

would also be good on the core-duo-powered iHome which we know is coming
post #33 of 34
Quote:
Originally posted by Duckspeak
Lemme see if I can translate. HDR means, obviously, "hard drive," and AA stands for "amino acid." Me, I wasn't personally aware that biochemistry was extending into the data storage industry, but there it is, spelled out. ROP is RighteOus Points.

Everybody knows that PS stands for "post script," and Nalu Demo is the Cuban mathematician/poet who first coined this novel way of extending a letter beyond its logical stopping point. The next line I'm not sure about, but I think that a dev is some kind of personal grooming device, "path" is slang for "exfoliant," and NV40/G70 is a character from Star Wars.

The rest makes perfect sense given these interpretations.

Right on the money, except ROPS stands for Roll Over Protection System. I know 'cause I just put one on my tractor.
post #34 of 34
Thread Starter 
Damn you ATI, hurry up and release better cards! I want to upgrade my G5 and I don't want to settle for the X800! I don't mind paying top dollar for the latest card, but $400 for an X800 is ridiculous! Argh!