
Intel unleashes Mac-bound "Woodcrest" server chip - Page 7

post #241 of 566
Well, to clarify. Firstly, I took numbers for 3DMark05 to get comparisons with the X1600.

The difference between the 7900GT/7900GTX and, say, the 7600GT in 3DMark05 does get bigger the higher the resolution you go, e.g. from 1024x768 up to 1600x1200 or something like that.

I took 3DMark05 numbers at 1024x768 with no antialiasing and no anisotropic filtering, simply as a baseline comparison.

So yeah, the numbers I put out do look weird when comparing the 7900GT and 7900GTX against the 7600GT.

The first thing is what you refer to as the "sucker tax". Yes, gaming enthusiasts are willing to put in the extra cash to get the latest and greatest. A 7600GT seems mid-range to them; if they have the cash they want a 7900GT or 7900GTX to get the best available.

Secondly, there are other aspects outside of 3DMark05 to consider when actually playing games. Running 4x antialiasing, 16x anisotropic filtering and high dynamic range, all at smooth, playable frame rates, is more likely on the 7900GT and 7900GTX. Think of it as greater "headroom" and higher overall visual quality when actually playing games, looking beyond 3DMark05.

The 7600GT is capable of a lot of the latest features, but again, think of the 7900s as having greater "headroom" with all these features at higher resolutions, running smoother.

For example, my 6600GT runs HalfLife2-Episode1 smoothly at 2x antialiasing, 16x anisotropic filtering, and HDR enabled. Water reflections when shining the torch at things start to slow it down a bit, though. Additionally, on the 6600GT it is well established that anything higher than 2x antialiasing, i.e. 4x and above, cuts frame rates drastically, whereas a 6800 probably does not have this "issue".

Going back to 3DMark05 though, the 7950 GX2 offers some incredible value. By 3DMark05 scores, compared to the X1600 as clocked by Apple in the iMac Core Duo, it is about 3.5x faster. Having two GPU cores on one card, you're getting an extra GPU core for about $100+.

Yeah, if you do the math, one 7900-ish core is 2x faster than the iMac Core Duo's GPU, so two 7900-ish cores should be 4x faster. But of course, in 3DMark05 and in realistic game workloads, you're going to see the 7950 GX2 come out 2.5x-3.5x faster than the iMac's GPU.
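To make that back-of-the-envelope math concrete, here's a tiny sketch (Python, purely illustrative -- the baseline score and the scaling efficiencies are assumptions I've plugged in, not benchmark results):

```python
# Illustrative only: a made-up 3DMark05-style baseline for the iMac's X1600
# and assumed dual-GPU scaling efficiencies, not measured numbers.
imac_x1600_score = 4000
single_7900_score = 2.0 * imac_x1600_score   # "one 7900-ish core is ~2x the iMac GPU"

ideal_gx2_score = 2 * single_7900_score       # perfect scaling would be 4x the iMac
print(f"Perfect scaling would be {ideal_gx2_score / imac_x1600_score:.1f}x the iMac GPU")

# Dual-GPU cards never scale perfectly; try a few plausible efficiencies.
for efficiency in (0.5, 0.75, 0.9):
    gx2_score = single_7900_score * (1 + efficiency)
    print(f"{efficiency:.0%} scaling -> {gx2_score / imac_x1600_score:.1f}x the iMac GPU")
# Prints roughly 3.0x, 3.5x and 3.8x; real games land lower still (2.5x-3.5x)
# once CPU limits and per-game scaling kick in.
```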
post #242 of 566
Quote:
Originally posted by ZachPruckowski
Let's look at prices:

$600: 7950GX2
$500: 7900GTX
$400: 7900GT
$300: 7800GT
$200: 7600GT
$150: 7600GS

I think Apple will be looking for a $200-300 card in a $2k-ish dual-Woodcrest, so I'm going to say the low-end will come with a 7600GT

Mid and High Ends could start with a 7800GT or a 7900GT with one of the top two as BTO, and a Quadro BTO.



Fair enough, although I think Apple will be stingy with the lowest-end Mac Pro and put in a 7600GS*. The mid-to-higher end would be a 7600GT or 7800GT, with the 7900GTX, 7950GX2 and Quadro as BTO options. Apple has a tendency to leave out the mainstream mid-range, so I doubt we'll see the 7900GT.

Again though, compared to the 7900s, the 7950GX2 is right now offering some incredible value.

*Look at the lowest-end PowerMac G5: it still comes with a 6600LE - which is pretty crappy by today's standards. Pretty bloody crappy.
post #243 of 566
Quote:
Originally posted by sunilraman
*Look at the lowest-end PowerMac G5: it still comes with a 6600LE - which is pretty crappy by today's standards. Pretty bloody crappy.

Yes, although it's still complete overkill for most of the users that want PowerMacs, i.e. non-gamers.

Why pay $600 for a graphics card if all you're doing is Photoshop and Final Cut?

The only reason that type of user buys a more expensive card is to run two monitors, not to get 5 more fps in Half-Life.
post #244 of 566
Quote:
Originally posted by aegisdesign
The only reason that type of user buys a more expensive card is to run two monitors, not to get 5 more fps in Half-Life.

Doesn't the 6600LE have dual video outs? Apple's page says their version has one dual link and one single link video output.
post #245 of 566
Quote:
Originally posted by aegisdesign
Yes, although it's still complete overkill for most of the users that want PowerMacs, i.e. non-gamers.

Why pay $600 for a graphics card if all you're doing is Photoshop and Final Cut?

The point is that the difference (not as much as $600, though) goes into Apple's pocket, not the customer's.
post #246 of 566
Quote:
Originally posted by JeffDM
Doesn't the 6600LE have dual video outs? Apple's page says their version has one dual link and one single link video output.



My MSI 6600GT 128MB GDDR SDRAM has one dual-link DVI port and one VGA port. I think the 6600s have one dual-link DVI out and one single-link DVI out normally.

The Apple versions make sense; I don't think there are 6600s out there with two dual-link DVI outs, unless you can point me to a specific manufacturer's model.

Yeah, Apple says:
NVIDIA GeForce 6600 LE with 128MB of GDDR SDRAM, one single-link DVI port, and one dual-link DVI port.

NVIDIA GeForce 6600 with 256MB of GDDR SDRAM, one single-link DVI port, and one dual-link DVI port

NVIDIA GeForce 7800 GT with 256MB of GDDR3 SDRAM, one single-link DVI port, and one dual-link DVI port

NVIDIA Quadro FX 4500 with 512MB of GDDR3 SDRAM, two dual-link DVI ports

So with this, for the first three cards you can drive a 30" cinema display AND a 23" cinema display max. Driving two 30" cinema displays requires a Quadro FX 4500 or another 6600 card:

"Any new dual- or quad-core Power Mac G5 supports two Apple Cinema Displays, including dual-link DVI for one 30-inch model. Support for two 30-inch Apple Cinema HD Displays requires two dual-link DVI ports, available in configurations with the NVIDIA Quadro FX 4500 or by installing an additional NVIDIA GeForce 6600 card. Support for more than two displays requires installation of one or more additional NVIDIA GeForce 6600 cards."
post #247 of 566
Oops, it depends on the video card you get from nVidia and the manufacturers.

For example, the AGP 6600GT nVidia REFERENCE card looks like it has two dual-link DVI outs (based on looking at the pins):

http://www.trustedreviews.com/article.aspx?art=850

post #248 of 566
(image from Wikipedia)

post #249 of 566
Quote:
Originally posted by sunilraman
"Any new dual- or quad-core Power Mac G5 supports two Apple Cinema Displays, including dual-link DVI for one 30-inch model. Support for two 30-inch Apple Cinema HD Displays requires two dual-link DVI ports, available in configurations with the NVIDIA Quadro FX 4500 or by installing an additional NVIDIA GeForce 6600 card. Support for more than two displays requires installation of one or more additional NVIDIA GeForce 6600 cards." [/B]

If one of the 6600 models is passively cooled, I'd have no problem doubling up if I wanted dual 30" screens. That Quadro takes two slots of space anyway. Maybe the Quadro's cooling is quiet, but just looking at it makes me wonder how loud that thing screams.
post #250 of 566
Yeah, here's a Galaxy 6600LE with two dual-link DVI outs, plus an s-Video out as well for good measure. (image)

post #251 of 566
Quote:
Originally posted by JeffDM
If one of the 6600 models is passively cooled, I'd have no problem doubling up if I wanted dual 30" screens. That Quadro takes two slots of space anyway. Maybe the Quadro's cooling is quiet, but just looking at it makes me wonder how loud that thing screams.



The Asus Silencer looks good for 6600 quiet operation. Not sure if you can just bang it into a PowerMac though...




Or you could get the 6600 from Apple, for full compatibility, and then bring on the Zalman UltraQuiet heatsink and fan: it would be virtually silent. It does look quite thin, so it should fit in one PCI Express slot.

http://www.zalman.co.kr/eng/product/...x=201&code=013

post #252 of 566
Plus isn't it time you PowerMac G5 people had some NEON in your rigs?

(image)
post #253 of 566
Okay, I'm cluttering this thread up with off-topic video card stuff, so here's a final linky about installing an older Zalman cooler onto a PowerMac G5 6800 card:

http://www.xlr8yourmac.com/Graphics/...lmanVF700.html
post #254 of 566
The GT models aren't that great. They are decent mid-range cards, but that's all you can say. If you want performance from any manufacturer, you need something with at least an "X" in the name.
post #255 of 566
ROFL It's true. The spec sheet of the MacPro as circulated by AppleInsider at the moment shows only an X1800 GTO on the higher end... (http://www.appleinsider.com/article.php?id=1886). I mean, come on, "ATI X1800 GTO" - okay, there is an X in the X1800 but the letters that come after the numbers have NO X IN THEM! OMFG!

But you are right. If the Mac Pro is going ATI then there should be an option for the Radeon X1900 XTX! It has 2 X's in there after the numbers!!
Hmmph. The X1800GTO is the lowest card in the X1800 range, BTW.
post #256 of 566
Quote:
Originally posted by melgross
The GT models aren't that great. They are decent mid-range cards, but that's all you can say. If you want performance from any manufacturer, you need something with at least an "X" in the name.

The GTs are second place, just after the GTX/Ultra variants. The GX2s are a special case. But that's for nVidia. Second place for ATi is XT, with first being the XTX, and GT being pretty low.

Oh, the confusion.

You can entertain yourself by looking at these charts:

http://en.wikipedia.org/wiki/Compari...ocessing_Units
http://en.wikipedia.org/wiki/Compari...ocessing_Units
post #257 of 566
Yeah... Wrapping one's head around GPU naming schemes is fun, fun fun.

nVidia, in descending order:
7900GX2
7900GTX
7900GT
7900GS
7600GT
7600GS
......

ATI, in descending order:
X1900 XTX
X1900 XT
X1900 GT
X1800 XT
X1800 XL
X1800 GTO
.......
post #258 of 566
Quote:
Originally posted by melgross
The GT models aren't that great. They are decent mid-range cards, but that's all you can say. If you want performance from any manufacturer, you need something with at least an "X" in the name.



That said, I've been very happy with my nVidia 6600GT for the past year, and it's got another year of life in it for playing the latest games at medium settings, with some antialiasing and quite a bit of anisotropic filtering.

The GTX-type variants are not worth the money, IMO. For example, going for a 7900GTX is more about bragging rights and feeling good about yourself if you have the cash to blow. Same with the 6800 Ultra when the 6 series was current. Overkill for mainstream, fun and fluid gaming.

I'll have to update myself on benchmarks of the nVidia 7900GT and 7600GT, but there's value for money there, I reckon.
post #259 of 566
I just want to be able to choose the best PC GPU, from whichever manufacturer I choose, at a price that isn't clearly exorbitant.

Right now, raising the 6600GT to a 7800GT costs $350.

On Newegg, a 7800GT costs under $300, period. The 6600GT is about $125.

So effectively, when you configure a Powermac with a 7800GT (and they don't even offer the GTX which is a pity), you are paying almost $500 for a card that costs under $300 off-the-shelf.

I sincerely hope this changes with the Mac Pro.
post #260 of 566
I'm hoping Apple finds a way to implement this that compromises neither EFI features nor the ability to pick any off-the-shelf card.

Perhaps they can convince ATi and/or nVidia to simply provide replacement firmwares for download?
post #261 of 566
Quote:
Originally posted by Chucker
I'm hoping Apple finds a way to implement this that compromises neither EFI features nor the ability to pick any off-the-shelf card.

Perhaps they can convince ATi and/or nVidia to simply provide replacement firmwares for download?

The firmware on the non-EFI cards may be too small.
post #262 of 566
Quote:
Originally posted by JeffDM
That's not making sense though unless the 7600GT isn't capable of those features.

Maybe I'm misstating what I mean. I mean that the reason people buy the $400 or $500 graphics cards is that the extra 5 or 10 percent makes a big difference as to whether you can have all the bells and whistles on, or whether you're just running a "basic" view. If you want to see grass shadows, or in-depth facial features, you need a high-end card.

Note that that generally only applies to current games. Games that are a year or two old will run great on a 7600GT.
post #263 of 566
Quote:
Originally posted by ZachPruckowski
Maybe I'm misstating what I mean. I mean that the reason people buy the $400 or $500 graphics cards is that the extra 5 or 10 percent makes a big difference as to whether you can have all the bells and whistles on, or whether you're just running a "basic" view. If you want to see grass shadows, or in-depth facial features, you need a high-end card.

Note that that generally only applies to current games. Games that are a year or two old will run great on a 7600GT.

Or, turn on all the features and you'd still only be 5-10% slower than a more expensive card with all the features on. A 5-10% slower framerate generally doesn't cross the line between playable and unplayable with the same features running. If it's unplayable on a 7600GT, then an extra 5% FPS wouldn't help.

But, since my post, Sunil said he ran it at low resolution and not all features enabled, so that might change things quite a bit.
post #264 of 566
Quote:
Originally posted by Placebo
I just want to be able to choose the best PC GPU, from whichever manufacturer I choose, at a price that isn't clearly exorbitant.

Right now, raising the 6600GT to a 7800GT costs $350.

On Newegg, a 7800GT costs under $300, period. The 6600GT is about $125.

So effectively, when you configure a Powermac with a 7800GT (and they don't even offer the GTX which is a pity), you are paying almost $500 for a card that costs under $300 off-the-shelf.

I sincerely hope this changes with the Mac Pro.

Well, remember that this is a constant price, from back when the 7800GT ruled the roost (or was 2nd from the top). Apple hasn't lowered prices in the BTO to reflect the change in the graphics card market.
post #265 of 566
Quote:
Originally posted by Placebo
I just want to be able to choose the best PC GPU, from whichever manufacturer I choose, at a price that isn't clearly exorbitant.

Right now, raising the 6600GT to a 7800GT costs $350.......



Actually, there is NO 6600GT. It's just the 6600LE and 6600 that are offered stock with the PowerMac G5. On Newegg, the 6600 is about $95 and the 6600LE is about $75. On the Apple Store, raising the 6600LE to a 7800GT is $400 and raising the 6600 to a 7800GT is $350.

So it looks like you're paying $475 for a 7800GT that's $300 if you start out with the 6600LE config, and $445 for a 7800GT that's $300 if you start out with the 6600 config.

MMM... Don't you just loooove those Apple margins? MMM mmmm...
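Putting that markup math in one place, here's a quick sketch using the approximate Newegg and Apple Store prices quoted above (street prices are rough, as noted in the thread):

```python
# Approximate street and BTO prices as quoted in the posts above.
street_price = {"6600LE": 75, "6600": 95, "7800GT": 300}
upgrade_to_7800gt = {"6600LE": 400, "6600": 350}   # Apple Store BTO upgrade cost

for base_card, upgrade_cost in upgrade_to_7800gt.items():
    effective = street_price[base_card] + upgrade_cost
    premium = effective - street_price["7800GT"]
    print(f"Start from {base_card}: ~${effective} effective for a ~$300 7800GT "
          f"(~${premium} over street price)")
```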
post #266 of 566
Quote:
Originally posted by JeffDM
Or, turn on all the features and you'd still only be 5-10% slower than a more expensive card with all the features on. A 5-10% slower framerate generally doesn't cross the line between playable and unplayable with the same features running. If it's unplayable on a 7600GT, then an extra 5% FPS wouldn't help....But, since my post, Sunil said he ran it at low resolution and not all features enabled, so that might change things quite a bit.



Yeah, it's a real fetishistic thing in the GPU-wars Gaming World. You take the 6-series and 7-series from nVidia, and you take the X1300, X1600, X1800, X1900 cards from ATI.

Across the range, with future, current, and 1-year old games there are various features which improve visual quality. There is also a range of framerates which move from super smooth to unplayable.

3DMark05 and 3DMark06 are, IMO, the most sensible benchmarks for getting a grip on what the cards are capable of. But you would have to run a series of benchmarks across different resolutions and antialiasing and anisotropic filtering settings.

My simple comparisons were just based on trying to see what the options are against the X1600 in the iMac Core Duo and MacBookPro.

Even so, again, with different games you'd have to run different benchmarks depending on the game engine -- usually GPU/hardware enthusiast sites take a sample game from the main popular game engines (id Software, Unreal, Source, FEAR, miscellaneous) and run these benchmarks at different quality settings for each game to see how various cards perform.

Like Placebo says, most people want to get a decent card that isn't exorbitant. A 7600GT from nVidia would generally give you playable, enjoyable gaming for current games at medium-to-high visual quality and features. A 7900GTX would give you very smooth framerates at the maximum quality the game engine can deliver. Anything less than $150 is generally, at this stage, not worth the money unless you're happy playing the latest games at a fairly "basic" visual quality and resolution.

The catch here is that in my experience with the nVidia 6600GT, overclocked from 500MHz to 561MHz with an off-the-shelf replacement heatsink and fan, I get nice "medium" quality settings and playable framerates. That's from a card that was $200+ or so and has come down to $125 in the past year.

So it depends on your budget, what sort of visual quality you like, how much you want to brag to your friends, and how much bang-for-buck you feel you've got. AND, how much time you have to research the various GPUs and delve into the major "hardware enthusiast/ benchmarking" websites.
post #267 of 566
Here is a good discussion on what's playable and what different quality settings do for different games. It doesn't compare GPUs, but actually it shows that gaming with realistic settings and resolutions shows MINIMAL difference between Conroe and AMD's dualcores. Da GPU be the bottleneck. http://enthusiast.hardocp.com/articl...wxLCxobmV3cw==

Nonetheless, TomsHardware has clearly shown Conroe to whip everyone's butt in non-game scenarios. At lower power draw. Performance-per-watt, delivered on Conroe.
http://www.tomshardware.com/2006/07/...out_athlon_64/

Woodcrests will be tasty.
post #268 of 566
Quote:
Originally posted by sunilraman
Here is a good discussion on what's playable and what different quality settings do for different games. It doesn't compare GPUs, but actually it shows that gaming with realistic settings and resolutions shows MINIMAL difference between Conroe and AMD's dualcores. Da GPU be the bottleneck. http://enthusiast.hardocp.com/articl...wxLCxobmV3cw==

That's a rather controversial and sensationalist review, considering the 36-page forum discussion on it.

1. The review language was inflammatory. Most people don't need a multi-page review to know that modern first person shooter style games are GPU limited when using 1 video card. If you have a single video card, and all you do is play first person shooters, you don't need anything more than a $300 CPU.

2. If a multi-video card setup was used, a Core 2 Duo setup would have given a 10 to 20% increase in framerates like all the other reviews with Crossfire had, at least for the games that weren't bottlenecked with a Crossfire setup.

3. It doesn't leave a good impression when I read this:

"The ONLY difference that we experienced is that we did have to lower a couple of settings with the AMD Athlon 64 FX-62 platform compared to the Intel platforms. This was the internal and external shadows. Luckily, the shadow sliders there are notched so it is easy to know exactly what position they are in. With the Intel CPUs, we were able to have this 5 notches up, which is in the middle of the slider for those shadow options. When we tried these same settings on the AMD Athlon 64 FX-62 platform, we found performance to be overall lower than the Intel CPUs and not playable. By moving those sliders down a couple of notches to 3 notches on the slider, performance was now playable."

4. The review reiterates standard upgrading advice. If you already have a state-of-the-art-for-your-dollar system, you don't need to upgrade until you have the money to upgrade or until your system becomes obsolete 1 to 2 years in the future, whichever comes first.

5. "Real-world gaming" isn't an objective term. People have different system setups capable of different performance. Gaming isn't exclusively 3D first person shooter style games, etc.

Quote:
Nonetheless, TomsHardware has clearly shown Conroe to whip everyone's butt in non-game scenarios. At lower power draw. Performance-per-watt, delivered on Conroe.
http://www.tomshardware.com/2006/07/...out_athlon_64/

Woodcrests will be tasty.

No doubt. Considering Apple's market, content creation and multimedia in various forms, Core 2 Duo and Xeon 51xx systems are the only way to go.
post #269 of 566
Quote:
Originally posted by kukito
Intel has a cheap (at least according to them) water cooling solution. It was designed for Extreme Edition systems but I imagine it could be adapted to Woodcrest.

Intel and AMD are both working on "reverse hyperthreading", also known as Core Multiplexing Technology, which is a rumored feature for the Core 2 Duo. In theory it would allow one thread to be split up among the two cores to increase performance.

FYI ...

AMD reverse HyperThreading declared a Myth
post #270 of 566
Quote:
Originally posted by kukito
Intel has a cheap (at least according to them) water cooling solution. It was designed for Extreme Edition systems but I imagine it could be adapted to Woodcrest.



Tomshardware has shown some interesting overclocking of Conroe on air with the stock Intel cooler (http://www.tomshardware.com/2006/07/...out_athlon_64/).

This leads me to believe/hope that Woodcrests, even in the top-of-the-line Mac Pro, will *NOT* be watercooled, for cost and simplicity reasons as well. Hopefully it will be a heatsink-heatpipe combo out to quiet fans.
post #271 of 566
Quote:
Originally posted by THT
That's a rather controversial and sensationalist review, considering the 36-page forum discussion on it.


It's just a matter of the AMD fanbois and Intel kids fightin' it out. From Apple and Mac users' perspective, Conroe and Woodcrest can only be good things, and overall it looks like Apple bet on the right horse.


1. The review language was inflammatory. Most people don't need a multi-page review to know that modern first person shooter style games are GPU limited when using 1 video card. If you have a single video card, and all you do is play first person shooters, you don't need anything more than a $300 CPU.

I must say that it leaned towards trying to vindicate AMD when "normal, playable, usual" game settings are used. Their framerate charts are VERY useful, IMO, in giving people an idea of how the CPU might influence playability at certain visual qualities.


2. If a multi-video card setup was used, a Core 2 Duo setup would have given a 10 to 20% increase in framerates like all the other reviews with Crossfire had, at least for the games that weren't bottlenecked with a Crossfire setup.

Well, yeah, we end up back with the Tomshardware review, which ran games at lower settings to lift the GPU bottleneck. In their case, or in other reviews that use Crossfire, the purpose is to eliminate the GPU factor to see how the CPU might theoretically affect framerates in such setups.


3. It doesn't leave a good impression when I read this:

"The ONLY difference that we experienced is that we did have to lower a couple of settings with the AMD Athlon 64 FX-62 platform compared to the Intel platforms. This was the internal and external shadows. Luckily, the shadow sliders there are notched so it is easy to know exactly what position they are in. With the Intel CPUs, we were able to have this 5 notches up, which is in the middle of the slider for those shadow options. When we tried these same settings on the AMD Athlon 64 FX-62 platform, we found performance to be overall lower than the Intel CPUs and not playable. By moving those sliders down a couple of notches to 3 notches on the slider, performance was now playable.
"

It's very interesting that shadow computations are somehow more CPU-dependent, for the particular game they were discussing.


4. The review reiterates standard upgrading advice. If you already have a state-of-the-art-for-your-dollar system, you don't need to upgrade until you have the money to upgrade or until your system becomes obsolete 1 to 2 years in the future, whichever comes first.

5. "Real-world gaming" isn't an objective term. People have different system setups capable of different performance. Gaming isn't exclusively 3D first person shooter style games, etc.

No doubt. Considering Apple's market, content creation and multimedia in various forms, Core 2 Duo and Xeon 51xx systems are the only way to go.



Agreed :thumbs: At WWDC Stevie J is gonna be up on stage, proudly beaming, "The Intel Transition: We Did It", and then rattle off some huge figure of shipping Universal Binaries. Overall, the Apple suite of content creation apps and other UBs look good for the Mac Pro. Adobe/Macromedia/MS Office and the strategies for dealing with them have been discussed thoroughly. Stevie J pushed hard for the transition, and it happened; it is in place. Quite an achievement, another feather in his cap. Apple has not been bulletproof in the transition, but things are looking okay. The stock price, on the other hand.....
post #272 of 566
Quote:
Originally posted by sunilraman
ROFL It's true. The spec sheet of the MacPro as circulated by AppleInsider at the moment shows only an X1800 GTO on the higher end... (http://www.appleinsider.com/article.php?id=1886). I mean, come on, "ATI X1800 GTO" - okay, there is an X in the X1800 but the letters that come after the numbers have NO X IN THEM! OMFG!

But you are right. If the Mac Pro is going ATI then there should be an option for the Radeon X1900 XTX! It has 2 X's in there after the numbers!!
Hmmph. The X1800GTO is the lowest card in the X1800 range, BTW.

Well, yeah. I meant at the rear, after the number, where the suffix indicates which chip AND card it is. The first X has nothing to do with it. I thought that was understood.
post #273 of 566
Quote:
Originally posted by sunilraman
Yeah... Wrapping one's head around GPU naming schemes is fun, fun fun.

nVidia, in descending order:
7900GX2
7900GTX
7900GT
7900GS
7600GT
7600GS
......

ATI, in descending order:
X1900 XTX
X1900 XT
X1900 GT
X1800 XT
X1800 XL
X1800 GTO
.......

And the X1800s are really the leftovers from the older chip line. I would think that was understood as well, if someone is keeping track.
post #274 of 566
Quote:
Originally posted by sunilraman
Quote:
Originally posted by melgross
The GT models aren't that great. They are decent mid-range cards, but that's all you can say. If you want performance from any manufacturer, you need something with at least an "X" in the name.



That said, I've been very happy with my nVidia 6600GT for the past year, and it's got another year of life in it for playing the latest games at medium settings, with some antialiasing and quite a bit of anisotropic filtering.

The GTX-type variants are not worth the money, IMO. For example, going for a 7900GTX is more about bragging rights and feeling good about yourself if you have the cash to blow. Same with the 6800 Ultra when the 6 series was current. Overkill for mainstream, fun and fluid gaming.

I'll have to update myself on benchmarks of the nVidia 7900GT and 7600GT, but there's value for money there, I reckon.

Yes, value for money. It just means that you have to replace your card more often. I don't find that useful.
post #275 of 566
Quote:
Originally posted by Placebo
I just want to be able to choose the best PC GPU, from whichever manufacturer I choose, at a price that isn't clearly exorbitant.

Right now, raising the 6600GT to a 7800GT costs $350.

On Newegg, a 7800GT costs under $300, period. The 6600GT is about $125.

So effectively, when you configure a Powermac with a 7800GT (and they don't even offer the GTX which is a pity), you are paying almost $500 for a card that costs under $300 off-the-shelf.

I sincerely hope this changes with the Mac Pro.

Yes! I totally agree. This is frustrating, to say the least.

We can't even count on some third party coming out with a much better card for us. Otherwise, we could just go with the cheapest Apple offers, and wait a short while, and then buy some other card.

But, no such luck!
post #276 of 566
Quote:
Originally posted by melgross
Yes, value for money. It just means that you have to replace your card more often. I don't find that useful.



1. You're assuming a certain constant, expected level of income/budgeting. For some people with varying income over time, getting something with decent value for money that they can afford is what they'll do. If they are not confident about their income two years down the line, then they'll assume from the start: right, I'll get this card, it's affordable, it plays the latest games decently, and hey, maybe I'll hang on to it for up to three years. Three and a half if my budget is focused on something else (like a mortgage, and maybe you play games less and less)...

2. Yes, a lower spec card will mean you have to replace your card more often. But for some people the sticker shock of buying the latest and greatest might be too much. Some people are just sensitive to cash flow issues, so paying a premium for the super GTX is too much emotionally for them. Also, with the speed of change, especially in GPU land, there is a certain level of risk that your latest and greatest won't last as long as you'd like. One might reasonably argue that you might spend more buying mid-level GT cards every two years rather than a super GTX every 3.5 years, but the avid gamer would feel more connected to the latest trends and have support for the latest features (e.g. Pixel Shader 3, HDR and the like). Sure, your super GTX would run a lot of games great for 3 years, but in the third year you are very likely to be missing out on whatever the GPU innovations are for that year and onwards, like "double- inverse- shadow- shader- pixel- fluffing- high- intensity- triple- data- 512bit- multispawn- n-dimensional- scalar- rendering" or whatever the F*** comes out down the line.
post #277 of 566
Quote:
Originally posted by melgross
Well, yeah. I meant at the rear, where the number means which chip AND card it is. The first X has nothing to do with it. I thought that was understood.



Well, GPU marketing people like X's in the name. ATI figured out, hey, we can have three X's in "ATI X1900 XTX" which is better than "nVidia 7900GTX" - muah hah hah aha ha ha it only has one X in there...
post #278 of 566
Quote:
Originally posted by melgross
Yes, value for money. It just means that you have to replace your card more often. I don't find that useful.

The problem here is that the extra benefit for the money falls off sharply once you get above the sweet spot, and spending the extra money above the sweet spot doesn't buy enough longevity for the money either. Buying a $200 card now and a $200 card a year from now is likely money better spent than buying a $400 card to last two years.
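For what it's worth, here's a toy way to test that trade-off; the ~30% yearly GPU improvement and the high-end price premium are my own assumptions, and the answer flips depending on what you plug in:

```python
# Toy model: both strategies spend $400 over two years; compare where each
# leaves you at the two-year mark. All numbers are assumptions, not data.
YEARLY_GAIN = 0.30   # assume each new GPU generation is ~30% faster

def standing(age_years):
    """Relative standing of a card versus the then-current generation."""
    return 1.0 / ((1 + YEARLY_GAIN) ** age_years)

standing_200 = standing(1)            # the second $200 card is one year old at year two

for premium in (1.2, 1.4, 1.6):       # how much faster the single $400 card starts out
    standing_400 = premium * standing(2)
    better = ("$200 now + $200 later"
              if standing_200 > standing_400
              else "one $400 card")
    print(f"$400 card starts {premium:.1f}x faster -> better off with {better} "
          f"({standing_200:.2f} vs {standing_400:.2f})")
```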
post #279 of 566
Quote:
Originally posted by sunilraman
Quote:
Originally posted by melgross
Yes, value for money. It just means that you have to replace your card more often. I don't find that useful.



1. You're assuming a certain constant, expected level of income/budgeting. For some people with varying income over time, getting something with decent value for money that they can afford is what they'll do. If they are not confident about their income two years down the line, then they'll assume from the start: right, I'll get this card, it's affordable, it plays the latest games decently, and hey, maybe I'll hang on to it for up to three years. Three and a half if my budget is focused on something else (like a mortgage, and maybe you play games less and less)...

That's a different issue. One buys what one can afford.

But financial issues aside, buying cheaper boards more often is more of a bother, and you NEVER get top performance that way. By buying a top board, you can keep it longer, and get better performance than any middle board for at least half of its life.

Quote:
2. Yes, a lower spec card will mean you have to replace your card more often. But for some people the sticker shock of buying the latest and greatest might be too much. Some people are just sensitive to cash flow issues, so paying a premium for the super GTX is too much emotionally for them. Also, with the speed of change, especially in GPU land, there is a certain level of risk that your latest and greatest won't last as long as you'd like. One might reasonably argue that you might spend more buying mid-level GT cards every two years rather than a super GTX every 3.5 years, but the avid gamer would feel more connected to the latest trends and have support for the latest features (e.g. Pixel Shader 3, HDR and the like). Sure, your super GTX would run a lot of games great for 3 years, but in the third year you are very likely to be missing out on whatever the GPU innovations are for that year and onwards, like "double- inverse- shadow- shader- pixel- fluffing- high- intensity- triple- data- 512bit- multispawn- n-dimensional- scalar- rendering" or whatever the F*** comes out down the line.

Lower model boards often don't support the latest features until the top board is replaced by the next generation. Then the old top board becomes the new middle board. And so on.
post #280 of 566
Quote:
Originally posted by JeffDM
The problem here is that the extra benefit for the money falls off sharply once you get above the sweet spot, and spending the extra money above the sweet spot doesn't buy enough longevity for the money either. Buying a $200 card now and a $200 card a year from now is likely money better spent than buying a $400 card to last two years.

I don't agree with that. A top card is much better than the "sweet spot". That's like a "best buy". Never quite that good, but cheap enough for the masses.

My answer to Sunil, above, pertains.