Intel unleashes Mac-bound "Woodcrest" server chip


Comments

  • Reply 241 of 565
    sunilraman Posts: 8,133 member
    Quote:

    Originally posted by ZachPruckowski

    Let's look at prices:



    $600: 7950GX2

    $500: 7900GTX

    $400: 7900GT

    $300: 7800GT

    $200: 7600GT

    $150: 7600GS



    I think Apple will be looking for a $200-300 card in a $2k-ish dual-Woodcrest, so I'm going to say the low-end will come with a 7600GT.



    The mid and high ends could start with a 7800GT or a 7900GT, with one of the top two as BTO, and a Quadro BTO.






    Fair enough, although I think Apple will skimp on the lowest-end Mac Pro and put in a 7600GS*. The mid-to-high end would be a 7600GT or 7800GT, with the 7900GTX, 7950GX2 and Quadro as BTO options. Apple has a tendency to leave out mainstream mid-range cards, so I doubt we'll see the 7900GT.



    Again, though, compared to the 7900s, the 7950GX2 is offering some incredible value right now.



    *Look at the lowest-end PowerMac G5: It still comes with a 6600LE - which is pretty crappy by today's standards. Pretty bloody crappy.
  • Reply 242 of 565
    aegisdesign Posts: 2,914 member
    Quote:

    Originally posted by sunilraman

    *Look at the lowest-end PowerMac G5: It still comes with a 6600LE - which is pretty crappy by today's standards. Pretty bloody crappy.



    Yes, although it's still complete overkill for most of the users who want PowerMacs, i.e. non-gamers.



    Why pay $600 for a graphics card if all you're doing is Photoshop and Final Cut?



    The only reason that type of user buys a more expensive card is to run two monitors, not to get 5 more fps in Half-Life.
  • Reply 243 of 565
    jeffdm Posts: 12,953 member
    Quote:

    Originally posted by aegisdesign

    The only reason that type of user buys a more expensive card is to run two monitors, not to get 5 more fps in Half-Life.



    Doesn't the 6600LE have dual video outs? Apple's page says their version has one dual link and one single link video output.
  • Reply 244 of 565
    pb Posts: 4,255 member
    Quote:

    Originally posted by aegisdesign

    Yes, although it's still complete overkill for most of the users who want PowerMacs, i.e. non-gamers.



    Why pay $600 for a graphics card if all you're doing is Photoshop and Final Cut?





    The point is that the difference (not as much as $600, though) goes not into the customer's pocket but into Apple's.
  • Reply 245 of 565
    sunilraman Posts: 8,133 member
    Quote:

    Originally posted by JeffDM

    Doesn't the 6600LE have dual video outs? Apple's page says their version has one dual link and one single link video output.






    My MSI 6600GT 128MB GDDR SDRAM has one dual-link DVI port and one VGA port. I think the 6600s have one dual-link DVI out and one single-link DVI out normally.



    The Apple versions make sense; I don't think there are 6600s out there with two dual-link DVI outs, unless you can point me to a specific manufacturer's model.



    Yeah, Apple says:

    NVIDIA GeForce 6600 LE with 128MB of GDDR SDRAM, one single-link DVI port, and one dual-link DVI port.



    NVIDIA GeForce 6600 with 256MB of GDDR SDRAM, one single-link DVI port, and one dual-link DVI port



    NVIDIA GeForce 7800 GT with 256MB of GDDR3 SDRAM, one single-link DVI port, and one dual-link DVI port



    NVIDIA Quadro FX 4500 with 512MB of GDDR3 SDRAM, two dual-link DVI ports.



    So with this, for the first three cards you can drive a 30" cinema display AND a 23" cinema display max. Driving two 30" cinema displays requires a Quadro FX 4500 or another 6600 card:



    "Any new dual- or quad-core Power Mac G5 supports two Apple Cinema Displays, including dual-link DVI for one 30-inch model. Support for two 30-inch Apple Cinema HD Displays requires two dual-link DVI ports, available in configurations with the NVIDIA Quadro FX 4500 or by installing an additional NVIDIA GeForce 6600 card. Support for more than two displays requires installation of one or more additional NVIDIA GeForce 6600 cards."
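Apple's rule quoted above boils down to counting dual-link DVI ports. Here's a throwaway sketch of that logic: the port counts come from the spec strings in this post, while the helper function and its names are purely hypothetical.

```python
# Hypothetical sketch of the display-support rule quoted above: each 30"
# Cinema HD display needs its own dual-link DVI port, while smaller
# displays can use either a single-link port or a spare dual-link port.
# Port counts per card are taken from the Apple spec strings in this post.

CARDS = {
    # name: (dual_link_ports, single_link_ports)
    "GeForce 6600 LE": (1, 1),
    "GeForce 6600":    (1, 1),
    "GeForce 7800 GT": (1, 1),
    "Quadro FX 4500":  (2, 0),
}

def supports(card, displays_30in, displays_other):
    """True if `card` can drive the given mix of displays."""
    dual, single = CARDS[card]
    if displays_30in > dual:          # every 30" needs a dual-link port
        return False
    spare_dual = dual - displays_30in
    return displays_other <= single + spare_dual

# A 30" plus a 23" works on any of the first three cards...
assert supports("GeForce 6600 LE", 1, 1)
# ...but two 30" displays need the Quadro (or a second 6600 card).
assert not supports("GeForce 7800 GT", 2, 0)
assert supports("Quadro FX 4500", 2, 0)
```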
  • Reply 246 of 565
    sunilraman Posts: 8,133 member
    Oops, it depends on the video card you get from nVidia and the manufacturers.



    For example, the AGP 6600GT nVidia REFERENCE card looks like it has two dual-link DVI outs (based on looking at the pins):



    http://www.trustedreviews.com/article.aspx?art=850



  • Reply 247 of 565
    sunilraman Posts: 8,133 member
    (image from Wikipedia)



  • Reply 248 of 565
    jeffdm Posts: 12,953 member
    Quote:

    Originally posted by sunilraman

    "Any new dual- or quad-core Power Mac G5 supports two Apple Cinema Displays, including dual-link DVI for one 30-inch model. Support for two 30-inch Apple Cinema HD Displays requires two dual-link DVI ports, available in configurations with the NVIDIA Quadro FX 4500 or by installing an additional NVIDIA GeForce 6600 card. Support for more than two displays requires installation of one or more additional NVIDIA GeForce 6600 cards."



    If one of the 6600 models is passively cooled, I'd have no problem doubling up if I wanted dual 30" screens. That Quadro takes two slots of space anyway. Maybe the Quadro's cooling is quiet, but just looking at it makes me wonder how loud that thing screams.
  • Reply 249 of 565
    sunilraman Posts: 8,133 member
    Yeah, here's a Galaxy 6600LE with two dual-link DVI outs, plus an S-Video out as well for good measure. (image)



  • Reply 250 of 565
    sunilraman Posts: 8,133 member
    Quote:

    Originally posted by JeffDM

    If one of the 6600 models is passively cooled, I'd have no problem doubling up if I wanted dual 30" screens. That Quadro takes two slots of space anyway. Maybe the Quadro's cooling is quiet, but just looking at it makes me wonder how loud that thing screams.






    The Asus Silencer looks good for 6600 quiet operation. Not sure if you can just bang it into a PowerMac though...









    Or you could get the 6600 from Apple, for full compatibility, and then bring on the Zalman UltraQuiet heatsink and fan: it would be virtually silent. It looks quite thin, so it should fit in one PCI Express slot.



    http://www.zalman.co.kr/eng/product/...x=201&code=013



  • Reply 251 of 565
    sunilraman Posts: 8,133 member
    Plus isn't it time you PowerMac G5 people had some NEON in your rigs?



    (image)

  • Reply 252 of 565
    sunilraman Posts: 8,133 member
    Okay, I'm cluttering this thread up with off-topic video card stuff, so here's a final linky about installing an older Zalman cooler onto a PowerMac G5 6800 card:



    http://www.xlr8yourmac.com/Graphics/...lmanVF700.html
  • Reply 253 of 565
    melgross Posts: 33,600 member
    The GT models aren't that great. They are decent mid-range cards, but that's all you can say. If you want performance from any manufacturer, you need something with at least an "X" in the name.
  • Reply 254 of 565
    sunilraman Posts: 8,133 member
    ROFL It's true. The spec sheet of the Mac Pro as circulated by AppleInsider at the moment shows only an X1800 GTO on the higher end... (http://www.appleinsider.com/article.php?id=1886). I mean, come on, "ATI X1800 GTO" - okay, there is an X in the X1800, but the letters that come after the numbers have NO X IN THEM! OMFG!



    But you are right. If the Mac Pro is going ATI then there should be an option for the Radeon X1900 XTX! It has two X's in there after the numbers!!

    Hmmph. The X1800GTO is the lowest card in the X1800 range, BTW.
  • Reply 255 of 565
    zandros Posts: 537 member
    Quote:

    Originally posted by melgross

    The GT models aren't that great. They are decent mid-range cards, but that's all you can say. If you want performance from any manufacturer, you need something with at least an "X" in the name.



    The GTs are second place, just after the GTX/Ultra variants. The GX2s are a special case. But that's for nVidia. Second place for ATi is XT, with first being the XTX, and GT being pretty low.



    Oh, the confusion.



    You can entertain yourself by looking at these charts:



    http://en.wikipedia.org/wiki/Compari...ocessing_Units

    http://en.wikipedia.org/wiki/Compari...ocessing_Units
  • Reply 256 of 565
    sunilraman Posts: 8,133 member
    Yeah... Wrapping one's head around GPU naming schemes is fun, fun, fun.



    nVidia, in descending order:

    7950GX2

    7900GTX

    7900GT

    7900GS

    7600GT

    7600GS

    ......



    ATI, in descending order:

    X1900 XTX

    X1900 XT

    X1900 GT

    X1800 XT

    X1800 XL

    X1800 GTO

    .......
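The two ladders above can be written down as lookup tables so the suffixes compare programmatically. This is just a sketch of the orderings listed in this post (the relative placing of ATI's GT versus XL across model numbers is my assumption); the `tier` helper is hypothetical.

```python
# Suffix ladders as listed in this post, in ascending order of speed,
# so a higher index means a faster tier within the same model number.
NVIDIA_SUFFIXES = ["GS", "GT", "GTX", "GX2"]
ATI_SUFFIXES    = ["GTO", "XL", "GT", "XT", "XTX"]  # GT-vs-XL order assumed

def tier(vendor, suffix):
    """Rank of a suffix on its vendor's ladder (bigger is better)."""
    ladder = NVIDIA_SUFFIXES if vendor == "nvidia" else ATI_SUFFIXES
    return ladder.index(suffix)

# Within the same model number, GTX outranks GT for nVidia...
assert tier("nvidia", "GTX") > tier("nvidia", "GT")
# ...while for ATI the XTX sits on top and GT is pretty low.
assert tier("ati", "XTX") > tier("ati", "XT") > tier("ati", "GT")
```

Note the ladders only order suffixes within a model number; as the list shows, an X1900 GT still beats an X1800 XT despite the "lower" suffix.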
  • Reply 257 of 565
    sunilraman Posts: 8,133 member
    Quote:

    Originally posted by melgross

    The GT models aren't that great. They are decent mid-range cards, but that's all you can say. If you want performance from any manufacturer, you need something with at least an "X" in the name.






    That said, I've been very happy with my nVidia 6600GT for the past year, and it's got another year of life in it for playing the latest games at medium settings, with some antialiasing and quite a bit of anisotropic filtering.



    The GTX-type variants are not worth the money, IMO. For example, going for a 7900GTX is more about bragging rights and feeling good about yourself if you have the cash to blow. Same with the 6800 Ultra when the 6 series was current. Overkill for mainstream, fun and fluid gaming.



    I'll have to update myself on benchmarks of the nVidia 7900GT and 7600GT, but there's value for money there, I reckon.
  • Reply 258 of 565
    placebo Posts: 5,767 member
    I just want to be able to choose the best PC GPU, from whichever manufacturer, at a price that isn't clearly exorbitant.



    Right now, upgrading the 6600GT to a 7800GT costs $350.



    On Newegg, a 7800GT costs under $300, period. The 6600GT is about $125.



    So effectively, when you configure a PowerMac with a 7800GT (and they don't even offer the GTX, which is a pity), you are paying almost $500 in value for a card that costs under $300 off the shelf.
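For what it's worth, the arithmetic works out like this. A throwaway sketch using the Newegg figures quoted above; the variable names are mine.

```python
# Rough BTO markup arithmetic using the street prices quoted above.
stock_card_street = 125   # 6600GT on Newegg, roughly
upgrade_fee       = 350   # Apple's 6600GT -> 7800GT BTO charge
upgrade_street    = 300   # a 7800GT on Newegg costs under this

# Value you hand over: the stock card you give up plus the upgrade fee.
effective_cost = stock_card_street + upgrade_fee
markup_over_street = effective_cost - upgrade_street

print(effective_cost)      # 475, i.e. "almost $500"
print(markup_over_street)  # at least 175 over buying the card yourself
```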



    I sincerely hope this changes with the Mac Pro.
  • Reply 259 of 565
    chucker Posts: 5,089 member
    I'm hoping Apple finds a way to implement this that neither compromises EFI features nor the ability to pick any off-the-shelf card.



    Perhaps they can convince ATi and/or nVidia to simply provide replacement firmwares for download?
  • Reply 260 of 565
    Quote:

    Originally posted by Chucker

    I'm hoping Apple finds a way to implement this that neither compromises EFI features nor the ability to pick any off-the-shelf card.



    Perhaps they can convince ATi and/or nVidia to simply provide replacement firmwares for download?




    The firmware ROM on non-EFI cards may be too small.