Intel unleashes Mac-bound "Woodcrest" server chip


Comments

  • Reply 261 of 565
    Quote:

    Originally posted by JeffDM

    That doesn't make sense, though, unless the 7600GT isn't capable of those features.



    Maybe I'm misstating what I mean. I mean that the reason people buy the $400 or $500 graphics cards is because that extra 5 or 10 percent makes a big difference as to whether you can have all the bells and whistles on, or whether you're just running a "basic" view. If you want to see grass shadows, or in depth facial features, you need a high-end card.



    Note that that generally only applies to current games. Games that are a year or two old will run great on a 7600GT.
  • Reply 262 of 565
    jeffdm Posts: 12,953, member
    Quote:

    Originally posted by ZachPruckowski

    Maybe I'm misstating what I mean. I mean that the reason people buy the $400 or $500 graphics cards is because that extra 5 or 10 percent makes a big difference as to whether you can have all the bells and whistles on, or whether you're just running a "basic" view. If you want to see grass shadows, or in depth facial features, you need a high-end card.



    Note that that generally only applies to current games. Games that are a year or two old will run great on a 7600GT.




    Or, turn on all the features and you'd still only be 5-10% slower than a more expensive card with all the features on. A 5-10% slower framerate generally doesn't cross the line between playable and unplayable with the same features running. If it's unplayable on a 7600GT, then an extra 5% FPS wouldn't help.



    But, since my post, Sunil said he ran it at low resolution and without all features enabled, so that might change things quite a bit.
  • Reply 263 of 565
    Quote:

    Originally posted by Placebo

    I just want to be able to choose the best PC GPU, from whichever manufacturer I choose, at a price that isn't clearly exorbitant.



    Right now, raising the 6600GT to a 7800GT costs $350.



    On Newegg, a 7800GT costs under $300, period. The 6600GT is about $125.



    So effectively, when you configure a Powermac with a 7800GT (and they don't even offer the GTX which is a pity), you are paying almost $500 for a card that costs under $300 off-the-shelf.



    I sincerely hope this changes with the Mac Pro.




    Well, remember that this price has stayed the same since back when the 7800GT ruled the roost (or was second from the top). Apple hasn't lowered its BTO prices to reflect the change in the graphics card market.
  • Reply 264 of 565
    sunilraman Posts: 8,133, member
    Quote:

    Originally posted by Placebo

    I just want to be able to choose the best PC GPU, from whichever manufacturer I choose, at a price that isn't clearly exorbitant.



    Right now, raising the 6600GT to a 7800GT costs $350.......






    Actually, there is NO 6600GT. It's just the 6600LE and 6600 that are offered stock with the PowerMac G5. On Newegg, the 6600 is about $95 and the 6600LE is about $75. On the Apple Store, raising the 6600LE to a 7800GT is $400, and raising the 6600 to a 7800GT is $350.



    So it looks like you're paying $475 for a $300 7800GT if you start out with the 6600LE config, and $445 for a $300 7800GT if you start out with the 6600 config.



    MMM... Don't you just loooove those Apple margins? MMM mmmm...
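
A minimal sketch of that markup arithmetic, using the rough Newegg and Apple Store prices quoted above. The dictionary values and helper name are illustrative, not official figures.

# Rough sketch of the effective cost of Apple's BTO 7800GT upgrade,
# using the approximate 2006 prices quoted in the post above.
NEWEGG_PRICE = {"6600LE": 75, "6600": 95, "7800GT": 300}
APPLE_UPGRADE_TO_7800GT = {"6600LE": 400, "6600": 350}  # BTO upgrade fee

def effective_7800gt_price(base_card: str) -> int:
    """Value of the stock card you give up plus the BTO upgrade fee."""
    return NEWEGG_PRICE[base_card] + APPLE_UPGRADE_TO_7800GT[base_card]

for base in ("6600LE", "6600"):
    paid = effective_7800gt_price(base)
    premium = paid - NEWEGG_PRICE["7800GT"]
    print(f"From the {base} config: ~${paid} for a ~$300 card (~${premium} premium)")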
  • Reply 265 of 565
    sunilraman Posts: 8,133, member
    Quote:

    Originally posted by JeffDM

    Or, turn on all the features and you'd still only be 5-10% slower than a more expensive card with all the features on. A 5-10% slower framerate generally doesn't cross the line between playable and unplayable with the same features running. If it's unplayable on a 7600GT, then an extra 5% FPS wouldn't help....But, since my post, Sunil said he ran it at low resolution and without all features enabled, so that might change things quite a bit.






    Yeah, it's a real fetishistic thing in the GPU-wars Gaming World. You take the 6-series and 7-series from nVidia, and you take the X1300, X1600, X1800, X1900 cards from ATI.



    Across the range, with future, current, and 1-year old games there are various features which improve visual quality. There is also a range of framerates which move from super smooth to unplayable.



    3DMark05 and 3DMark06 are IMO the most sensible benchmarks for getting a grip on what the cards are capable of. But you would have to run a series of benchmarks across different resolutions and anti-aliasing and anisotropic filtering settings.



    My simple comparisons were just based on trying to see what the options are against the X1600 in the iMac Core Duo and MacBook Pro.



    Even so, again, with different games you'd have to run different benchmarks depending on the game engine. Usually GPU/hardware enthusiast sites take a sample game from each of the main popular game engines (id Software, Unreal, Source, FEAR, miscellaneous) and run benchmarks at different quality settings for each game to see how various cards perform.



    Like Placebo says, most people want to get a decent card that isn't exorbitant. An nVidia 7600GT would generally give you playable, enjoyable gaming in current games at medium-to-high visual quality and features. A 7900GTX would give you very smooth framerates at the maximum quality the game engine can deliver. Anything under $150 is generally not worth the money at this stage unless you're content to play the latest games at a fairly "basic" visual quality and resolution.



    The catch here is that in my experience with the nVidia 6600GT, overclocked from 500MHz to 561MHz with an off-the-shelf replacement heatsink and fan, I get nice "medium" quality settings and playable framerates. And that's from a card that was $200+ or so and has come down to $125 in the past year.



    So it depends on your budget, what sort of visual quality you like, how much you want to brag to your friends, and how much bang-for-buck you feel you've got. AND, how much time you have to research the various GPUs and delve into the major "hardware enthusiast/ benchmarking" websites.
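
A minimal sketch of the kind of benchmark matrix described above: one sample title per popular engine, swept across resolution, anti-aliasing, and anisotropic filtering settings. The game list, settings grid, and run_benchmark hook are illustrative placeholders, not a real harness.

# Illustrative benchmark sweep: one sample game per popular engine,
# run across a grid of resolution / AA / AF settings.
from itertools import product

GAMES = ["Doom 3 (id Tech)", "UT2004 (Unreal)", "HL2 (Source)", "F.E.A.R."]
RESOLUTIONS = [(1024, 768), (1280, 1024), (1600, 1200)]
AA_SAMPLES = [0, 2, 4]     # antialiasing samples
AF_LEVELS = [0, 8, 16]     # anisotropic filtering level

def run_benchmark(game, resolution, aa, af):
    # Placeholder: a real harness would launch the game's timedemo at
    # these settings and return the average FPS it measured.
    return 0.0

def sweep():
    results = {}
    for game, res, aa, af in product(GAMES, RESOLUTIONS, AA_SAMPLES, AF_LEVELS):
        results[(game, res, aa, af)] = run_benchmark(game, res, aa, af)
    return results

if __name__ == "__main__":
    print(f"{len(sweep())} benchmark runs in the full matrix")  # 108 runs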
  • Reply 266 of 565
    sunilraman Posts: 8,133, member
    Here is a good discussion on what's playable and what different quality settings do for different games. It doesn't compare GPUs, but it does show that gaming at realistic settings and resolutions gives MINIMAL difference between Conroe and AMD's dual-cores. Da GPU be the bottleneck. http://enthusiast.hardocp.com/articl...wxLCxobmV3cw==



    Nonetheless, TomsHardware has clearly shown Conroe to whip everyone's butt in non-game scenarios. At lower power draw. Performance-per-watt, delivered on Conroe.

    http://www.tomshardware.com/2006/07/...out_athlon_64/



    Woodcrests will be tasty.
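
The "GPU is the bottleneck" point can be put as a toy model: the delivered framerate is roughly the slower of what the CPU and the GPU can each sustain, so a faster CPU only shows up once the GPU stops being the limit. A sketch with made-up numbers, purely for illustration:

# Toy bottleneck model: the frame rate you see is capped by whichever
# of the CPU or GPU is slower at its share of the work.
def delivered_fps(cpu_fps_limit: float, gpu_fps_limit: float) -> float:
    return min(cpu_fps_limit, gpu_fps_limit)

# At realistic settings the GPU is the wall, so a faster CPU barely matters:
print(delivered_fps(cpu_fps_limit=140, gpu_fps_limit=45))   # 45
print(delivered_fps(cpu_fps_limit=170, gpu_fps_limit=45))   # still 45

# Lower the resolution/quality (the GPU limit rises) and the CPU gap appears,
# which is roughly what low-settings CPU reviews are set up to show:
print(delivered_fps(cpu_fps_limit=140, gpu_fps_limit=200))  # 140
print(delivered_fps(cpu_fps_limit=170, gpu_fps_limit=200))  # 170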
  • Reply 267 of 565
    tht Posts: 5,616, member
    Quote:

    Originally posted by sunilraman

    Here is a good discussion on what's playable and what different quality settings do for different games. It doesn't compare GPUs, but it does show that gaming at realistic settings and resolutions gives MINIMAL difference between Conroe and AMD's dual-cores. Da GPU be the bottleneck. http://enthusiast.hardocp.com/articl...wxLCxobmV3cw==





    That's a rather controversial and sensationalist review, considering the 36-page forum discussion on it.



    1. The review language was inflammatory. Most people don't need a multi-page review to know that modern first-person-shooter-style games are GPU-limited when using one video card. If you have a single video card, and all you do is play first-person shooters, you don't need anything more than a $300 CPU.



    2. If a multi-video-card setup had been used, a Core 2 Duo setup would have given a 10 to 20% increase in framerates, like all the other Crossfire reviews showed, at least for the games that weren't still bottlenecked with a Crossfire setup.



    3. It doesn't leave a good impression when I read this:



    "The ONLY difference that we experienced is that we did have to lower a couple of settings with the AMD Athlon 64 FX-62 platform compared to the Intel platforms. This was the internal and external shadows. Luckily, the shadow sliders there are ?notched? so it is easy to know exactly what position they are in. With the Intel CPUs, we were able to have this 5 notches up, which is in the middle of the slider for those shadow options. When we tried these same settings on the AMD Athlon 64 FX-62 platform, we found performance to be overall lower than the Intel CPUs and not playable. By moving those sliders down a couple of notches to 3 notches on the slider, performance was now playable."



    4. The review reiterates standard upgrading advice. If you already have a state-of-the-art-for-your-dollar system, you don't need to upgrade until you have the money to upgrade or until your system becomes obsolete one to two years in the future, whichever comes first.



    5. "Real-world gaming" isn't an objective term. People have different system setups capable of different performance. Gaming isn't exclusively 3D first person shooter style games, etc.



    Quote:

    Nonetheless, TomsHardware has clearly shown Conroe to whip everyone's butt in non-game scenarios. At lower power draw. Performance-per-watt, delivered on Conroe.

    http://www.tomshardware.com/2006/07/...out_athlon_64/



    Woodcrests will be tasty.




    No doubt. Considering Apple's market, content creation and multimedia in various forms, Core 2 Duo and Xeon 51xx systems are the only way to go.
  • Reply 268 of 565
    mwswami Posts: 166, member
    Quote:

    Originally posted by kukito

    Intel has a cheap (at least according to them) water cooling solution. It was designed for Extreme Edition systems but I imagine it could be adapted to Woodcrest.



    Intel and AMD are both working on "reverse hyperthreading", also known as Core Multiplexing Technology, which is a rumored feature for the Core 2 Duo. In theory it would allow one thread to be split up between the two cores to increase performance.




    FYI ...



    AMD reverse HyperThreading declared a Myth
  • Reply 269 of 565
    sunilraman Posts: 8,133, member
    Quote:

    Originally posted by kukito

    Intel has a cheap (at least according to them) water cooling solution. It was designed for Extreme Edition systems but I imagine it could be adapted to Woodcrest.






    TomsHardware has shown some interesting overclocking of Conroe on air with the stock Intel cooler (http://www.tomshardware.com/2006/07/...out_athlon_64/)



    This leads me to believe/hope that Woodcrests, even in the top-of-the-line Mac Pro, will *NOT* be watercooled, for cost and simplicity reasons as well. Hopefully it will be a heatsink-heatpipe combo feeding quiet fans.
  • Reply 270 of 565
    sunilraman Posts: 8,133, member
    Quote:

    Originally posted by THT

    That's a rather controversial and sensationalist review, considering the 36-page forum discussion on it.




    It's just a matter of the AMD fanbois and Intel kids fightin' it out. From Apple and Mac users' perspective, Conroe and Woodcrest can only be good things, and overall it looks like Apple bet on the right horse.





    1. The review language was inflammatory. Most people don't need a multi-page review to know that modern first-person-shooter-style games are GPU-limited when using one video card. If you have a single video card, and all you do is play first-person shooters, you don't need anything more than a $300 CPU.



    I must say that it leaned towards trying to vindicate AMD when "normal, playable, usual" game settings are used. Their framerate charts are VERY useful, IMO, in giving people an idea of how the CPU might influence playability at certain visual qualities.





    2. If a multi-video-card setup had been used, a Core 2 Duo setup would have given a 10 to 20% increase in framerates, like all the other Crossfire reviews showed, at least for the games that weren't still bottlenecked with a Crossfire setup.



    Well, yeah, we end up back with the TomsHardware review, which ran games at lower settings to lift the GPU bottleneck. In their case, or in other reviews that use Crossfire, the purpose is to eliminate the GPU factor to see how the CPU might theoretically affect framerates in such setups.





    3. It doesn't leave a good impression when I read this:



    "The ONLY difference that we experienced is that we did have to lower a couple of settings with the AMD Athlon 64 FX-62 platform compared to the Intel platforms. This was the internal and external shadows. Luckily, the shadow sliders there are ?notched? so it is easy to know exactly what position they are in. With the Intel CPUs, we were able to have this 5 notches up, which is in the middle of the slider for those shadow options. When we tried these same settings on the AMD Athlon 64 FX-62 platform, we found performance to be overall lower than the Intel CPUs and not playable. By moving those sliders down a couple of notches to 3 notches on the slider, performance was now playable.
    "



    It's very interesting that shadow computations are somehow more CPU-dependent, at least for the particular game they were discussing.





    4. The review reiterates standard upgrading advice. If you already have a state-of-the-art-for-your-dollar system, you don't need to upgrade until you have the money to upgrade or until your system becomes obsolete one to two years in the future, whichever comes first.



    5. "Real-world gaming" isn't an objective term. People have different system setups capable of different performance. Gaming isn't exclusively 3D first person shooter style games, etc.



    No doubt. Considering Apple's market, content creation and multimedia in various forms, Core 2 Duo and Xeon 51xx systems are the only way to go.






    Agreed :thumbs: At WWDC Stevie J is gonna be up on stage, proudly beaming, "The Intel Transition: We Did It", and then rattle off some huge figure of shipping Universal Binaries. Overall, the Apple suite of content-creation and other UBs looks good for the Mac Pro. Adobe/Macromedia/MS Office have been discussed thoroughly, as have strategies for dealing with them. Stevie J pushed hard for the transition, and it happened; it is in place. Quite an achievement, another feather in his cap. Apple has not been bulletproof in the transition, but things are looking okay. The stock price, on the other hand.....
  • Reply 271 of 565
    melgross Posts: 33,600, member
    Quote:

    Originally posted by sunilraman

    ROFL It's true. The spec sheet of the Mac Pro as circulated by AppleInsider at the moment shows only an X1800 GTO on the higher end... (http://www.appleinsider.com/article.php?id=1886). I mean, come on, "ATI X1800 GTO" - okay, there is an X in the X1800, but the letters that come after the numbers have NO X IN THEM! OMFG!



    But you are right. If the Mac Pro is going ATI then there should be an option for the Radeon X1900 XTX! It has 2 X's in there after the numbers!!

    Hmmph. The X1800GTO is the lowest card in the X1800 range, BTW.




    Well, yeah. I meant at the rear, where the number means which chip AND card it is. The first X has nothing to do with it. I thought that was understood.
  • Reply 272 of 565
    melgross Posts: 33,600, member
    Quote:

    Originally posted by sunilraman

    Yeah... Wrapping one's head around GPU naming schemes is fun, fun fun.



    nVidia, in descending order:

    7900GX2

    7900GTX

    7900GT

    7900GS

    7600GT

    7600GS

    ......



    ATI, in descending order:

    X1900 XTX

    X1900 XT

    X1900 GT

    X1800 XT

    X1800 XL

    X1800 GTO

    .......




    And the X1800s are really the leftovers from the older chip line. I would think that was understood as well, if someone is keeping track.
  • Reply 273 of 565
    melgross Posts: 33,600, member
    Quote:

    Originally posted by sunilraman

    Quote:

    Originally posted by melgross

    The GT models aren't that great. They are decent mid-range cards, but that's all you can say. If you want performance from any manufacturer, you need something with at least an "X" in the name.






    That said, I've been very happy with my nVidia 6600GT for the past year, and it's got another year of life in it for playing the latest games at medium settings, with some antialiasing and quite a bit of anisotropic filtering.



    The GTX-type variants are not worth the money, IMO. For example, going for a 7900GTX is more for bragging rights and feeling good about yourself if you have the cash to blow. Same with the 6800 Ultra when the 6 series was happening. Overkill for mainstream, fun, and fluid gaming.



    I'll have to update myself on benchmarks of the nVidia 7900GT and 7600GT, but there's value for money there, I reckon.



    Yes, value for money. It just means that you have to replace your card more often. I don't find that useful.
  • Reply 274 of 565
    melgross Posts: 33,600, member
    Quote:

    Originally posted by Placebo

    I just want to be able to choose the best PC GPU, from whichever manufacturer I choose, at a price that isn't clearly exorbitant.



    Right now, raising the 6600GT to a 7800GT costs $350.



    On Newegg, a 7800GT costs under $300, period. The 6600GT is about $125.



    So effectively, when you configure a Powermac with a 7800GT (and they don't even offer the GTX which is a pity), you are paying almost $500 for a card that costs under $300 off-the-shelf.



    I sincerely hope this changes with the Mac Pro.




    Yes! I totally agree. This is frustrating, to say the least.



    We can't even count on some third party coming out with a much better card for us. Otherwise, we could just go with the cheapest Apple offers, and wait a short while, and then buy some other card.



    But, no such luck!
  • Reply 275 of 565
    sunilraman Posts: 8,133, member
    Quote:

    Originally posted by melgross

    Yes, value for money. It just means that you have to replace your card more often. I don't find that useful.






    1. You're assuming a certain constant, expected level of income/budgeting. For some people with varying income over time, getting something affordable with decent value for money is what they'll do. If they aren't confident about their income two years down the line, then they'll assume from the start: right, I'll get this card, it's affordable, it plays the latest games decently, and hey, maybe I'll hang on to it for up to three years. Three and a half if my budget is focused on something else (like a mortgage, and maybe you play games less and less)...



    2. Yes, a lower-spec card will mean you have to replace your card more often. But for some people the sticker shock of buying the latest and greatest might be too much. Some people are just sensitive to cash-flow issues, so paying a premium for the super GTX is too much emotionally for them. Also, with the speed of change, especially in GPU land, there is a certain level of risk that your latest and greatest won't last as long as you'd like. One might reasonably argue that you'd spend more buying mid-level GT cards every two years than a super GTX every 3.5 years, but the avid gamer would feel more connected to the latest trends and have support for the latest features (e.g. Pixel Shader 3, HDR and the like). Sure, your super GTX would run a lot of games great for three years, but in the third year you are very likely to be missing out on whatever the GPU innovations are for that year and onwards, like "double- inverse- shadow- shader- pixel- fluffing- high- intensity- triple- data- 512bit- multispawn- n-dimensional- scalar- rendering" or whatever the F*** comes out down the line.
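
A back-of-the-envelope sketch of that trade-off. The prices and lifespans below are assumptions for illustration only, not real card data.

# Rough cost-per-year comparison of the two upgrade strategies above.
def cost_per_year(price: float, years_kept: float) -> float:
    return price / years_kept

mid_range = cost_per_year(price=200, years_kept=2.0)   # e.g. a 7600GT-class card
high_end = cost_per_year(price=500, years_kept=3.5)    # e.g. a 7900GTX-class card

print(f"Mid-range card every 2 years:  ~${mid_range:.0f}/year")   # ~$100/year
print(f"High-end card every 3.5 years: ~${high_end:.0f}/year")    # ~$143/year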
  • Reply 276 of 565
    sunilraman Posts: 8,133, member
    Quote:

    Originally posted by melgross

    Well, yeah. I meant at the rear, where the number means which chip AND card it is. The first X has nothing to do with it. I thought that was understood.






    Well, GPU marketing people like X's in the name. ATI figured out, hey, we can have three X's in "ATI X1900 XTX" which is better than "nVidia 7900GTX" - muah hah hah aha ha ha it only has one X in there...
  • Reply 277 of 565
    jeffdm Posts: 12,953, member
    Quote:

    Originally posted by melgross

    Yes, value for money. It just means that you have to replace your card more often. I don't find that useful.



    The problem here is that the extra benefit for the money drops off sharply once you get above the sweet spot, and spending the extra money above the sweet spot doesn't buy enough longevity for the money either. Buying a $200 card now and a $200 card a year from now is likely money better spent than buying a $400 card to last two years.
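
A minimal sketch of that argument, with invented relative-performance numbers just to make the shape of it concrete. The key assumption is that prices fall fast enough that next year's $200 card roughly matches today's $400 card.

# Sketch of the "buy the sweet spot twice" argument. Performance values
# are relative units invented for illustration.
perf_400_today = 100      # today's $400 card
perf_200_today = 85       # today's $200 "sweet spot" card
perf_200_next_year = 100  # assumed to catch up to today's $400 card

# Two-year performance profile for the same $400 total spend:
keep_expensive_card = [perf_400_today, perf_400_today]       # years 1 and 2
buy_sweet_spot_twice = [perf_200_today, perf_200_next_year]  # years 1 and 2

print("One $400 card: ", keep_expensive_card)
print("Two $200 cards:", buy_sweet_spot_twice)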
  • Reply 278 of 565
    melgross Posts: 33,600, member
    Quote:

    Originally posted by sunilraman

    Quote:

    Originally posted by melgross

    Yes, value for money. It just means that you have to replace your card more often. I don't find that useful.






    1. You're assuming a certain constant, expected level of income/budgeting. For some people with varying income over time, getting something affordable with decent value for money is what they'll do. If they aren't confident about their income two years down the line, then they'll assume from the start: right, I'll get this card, it's affordable, it plays the latest games decently, and hey, maybe I'll hang on to it for up to three years. Three and a half if my budget is focused on something else (like a mortgage, and maybe you play games less and less)...



    That's a different issue. One buys what one can afford.



    But financial issues aside, buying cheaper boards more often is more of a bother, and you NEVER get top performance that way. By buying a top board, you can keep it longer, and get better performance than any middle board for at least half of its life.



    Quote:

    2. Yes, a lower-spec card will mean you have to replace your card more often. But for some people the sticker shock of buying the latest and greatest might be too much. Some people are just sensitive to cash-flow issues, so paying a premium for the super GTX is too much emotionally for them. Also, with the speed of change, especially in GPU land, there is a certain level of risk that your latest and greatest won't last as long as you'd like. One might reasonably argue that you'd spend more buying mid-level GT cards every two years than a super GTX every 3.5 years, but the avid gamer would feel more connected to the latest trends and have support for the latest features (e.g. Pixel Shader 3, HDR and the like). Sure, your super GTX would run a lot of games great for three years, but in the third year you are very likely to be missing out on whatever the GPU innovations are for that year and onwards, like "double- inverse- shadow- shader- pixel- fluffing- high- intensity- triple- data- 512bit- multispawn- n-dimensional- scalar- rendering" or whatever the F*** comes out down the line.



    Lower model boards often don't support the latest features until the top board is replaced by the next generation. Then the old top board becomes the new middle board. And so on.
  • Reply 279 of 565
    melgross Posts: 33,600, member
    Quote:

    Originally posted by JeffDM

    The problem here is that the extra benefit for the money drops off sharply once you get above the sweet spot, and spending the extra money above the sweet spot doesn't buy enough longevity for the money either. Buying a $200 card now and a $200 card a year from now is likely money better spent than buying a $400 card to last two years.



    I don't agree with that. A top card is much better than the "sweet spot". That's like a "best buy". Never quite that good, but cheap enough for the masses.



    My answer to Sunil, above, pertains.
  • Reply 280 of 565
    jeffdm Posts: 12,953, member
    Quote:

    Originally posted by melgross

    I don't agree with that. A top card is much better than the "sweet spot". That's like a "best buy". Never quite that good, but cheap enough for the masses.



    My answer to Sunil, above, pertains.




    You don't have to agree with it; I've never observed my statement to be false. I've only rarely seen a situation where the top-of-the-line unit among electronics components gets 50% better performance than a unit of the same brand that's half the price. Add to that that the prices of graphics cards drop like a rock: next year's $200 card could easily outperform today's $400 card.