Apple to sell scaled-down version of Nvidia 6800 DDL

Posted in Current Mac Hardware · edited January 2014
Nvidia's 6800 GT graphics card will soon be available from Apple's online store, providing an alternative for customers seeking a new 30-inch HD Apple Cinema Display.



Apple Computer will begin selling a scaled-down version of the high-end Nvidia 6800 Ultra DDL graphics card in order to better meet demand for the product, according to a MacNN report.



One of Nvidia's 6800 GPUs is required to power Apple's new 2560x1600-pixel 30-inch HD Apple Cinema Display, which made its debut in July. But the graphics card company has been unable to produce stable versions of the 6800 Ultra card, delaying orders for Apple's 30-inch display by up to three months.
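For context on why this display needs one of these particular cards: a quick back-of-the-envelope check, assuming a 60 Hz refresh rate and roughly 9% blanking overhead (in the spirit of CVT reduced blanking; the exact timing is an assumption here), shows the 30-inch panel's pixel clock far exceeds what a single DVI link can carry, which is why a dual-link DDL card is required.

```python
# Why the 30" Cinema Display needs dual-link DVI: the required pixel
# clock exceeds the single-link TMDS limit. The 9% blanking overhead
# is an assumed approximation of reduced-blanking timings.

ACTIVE_W, ACTIVE_H, REFRESH_HZ = 2560, 1600, 60
SINGLE_LINK_MAX_MHZ = 165.0      # single-link DVI pixel clock ceiling
BLANKING_OVERHEAD = 1.09         # assumed reduced-blanking overhead

# Pixel clock = active pixels per frame * refresh rate * blanking factor
pixel_clock_mhz = ACTIVE_W * ACTIVE_H * REFRESH_HZ * BLANKING_OVERHEAD / 1e6
links_needed = 1 if pixel_clock_mhz <= SINGLE_LINK_MAX_MHZ else 2

print(f"required pixel clock: {pixel_clock_mhz:.0f} MHz")  # ~268 MHz
print(f"DVI links needed: {links_needed}")                 # 2 -> dual-link
```

At roughly 268 MHz the display needs well over the 165 MHz a single link provides, hence the dual-link (DDL) requirement.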



"Both products feature 256MB of GDDR3 memory and 16-pipe superscalar architecture; however, the Nvidia 6800 GT features a slightly lower bandwidth of 32GB/sec for delivery of 525 million vertices and 5.6 billion textured pixels per second. The Ultra version features 35.2GB/sec throughput for 600 million vertices and 6.4 billion textured pixels per second," the report says.
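The two quoted bandwidth figures are consistent with both cards using a 256-bit GDDR3 bus at different memory clocks. Assuming the commonly cited reference rates of 1000 MT/s effective for the GT and 1100 MT/s for the Ultra (an assumption, not stated in the report), the arithmetic checks out:

```python
# Sanity-check the report's memory-bandwidth numbers.
# Assumption: 256-bit bus; GT at 1000 MT/s effective, Ultra at 1100 MT/s.

BUS_WIDTH_BITS = 256

def bandwidth_gb_s(effective_mt_s: float) -> float:
    """Bandwidth = transfers/sec * bytes per transfer, in GB/s (10^9 bytes)."""
    return effective_mt_s * 1e6 * (BUS_WIDTH_BITS / 8) / 1e9

print(f"6800 GT:    {bandwidth_gb_s(1000):.1f} GB/s")  # 32.0, matches report
print(f"6800 Ultra: {bandwidth_gb_s(1100):.1f} GB/s")  # 35.2, matches report
```

So the GT's lower quoted bandwidth is simply a 10% lower memory clock on the same bus width.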



Like the 6800 Ultra, the 6800 GT will block the adjacent PCI slot, reducing the number of available PCI or PCI-X slots from three to two on the Power Mac G5.



A message to Apple customers says the NVIDIA GeForce 6800 GT DDL graphics card will be available for order in early November. The company had previously informed customers that it hoped to ship orders for the 6800 Ultra and 30-inch display "on or before October 25th" (today).

Comments

  • Reply 1 of 23
    Does anyone else see the loss of a PCI slot as being as bad a thing as I do? I mean, if you had five or so and blocked one, big deal. But cutting three to two seems like a bad idea.
  • Reply 2 of 23
louzer Posts: 1,054
    Quote:

    Originally posted by benzene

    Does anyone else see the loss of a PCI slot as being as bad a thing as I do? I mean, if you had five or so and blocked one, big deal. But cutting three to two seems like a bad idea.



Well, it's only a 'bad' thing if you need all three. Or you're one of those "I have three free slots, and if I install this card, I'll only have two. Two is less than three. So it must be a bad thing" people, regardless of the fact that you've never put a card in your computer, ever.



My big question is: how well (or how slowly, or how differently) do these new cards run the 30" display?
  • Reply 3 of 23
    Quote:

    Originally posted by Louzer

Well, it's only a 'bad' thing if you need all three. Or you're one of those "I have three free slots, and if I install this card, I'll only have two. Two is less than three. So it must be a bad thing" people, regardless of the fact that you've never put a card in your computer, ever.



My big question is: how well (or how slowly, or how differently) do these new cards run the 30" display?




    Probably the same in 2D, since they both have 256MB memory. The only difference will be in 3D performance, because of the reduced clock speeds.
  • Reply 4 of 23
adamrao Posts: 175
    Other than to provide supply, what is the real purpose of getting the 6800 GT? Now that the Radeon X800 is available for the Mac, why would you go with NVidia anyway? Is there a particular reason for it? Unless it's price... but if you're running a dual-processor PowerMac G5 with a 30" Cinema Display, I highly doubt that price is an obstacle.



    The ATI card doesn't take up any extra space like the NVidia cards do, runs the 30" and ATI drivers are updated at the same pace as NVidia's with each Mac OS X update. I guess I just don't see the advantage to buying the NVidia card now.



    With the obvious delays of the 6800 DDL, the GT is there to get them out the door and have people running their 30" displays before people start dropping their orders and such, but IMO, the ATI card is the way to go if you're going to be running one of those monsters. Beautiful monsters, that is.
  • Reply 5 of 23
placebo Posts: 5,767
    Quote:

    Originally posted by adamrao

    Other than to provide supply, what is the real purpose of getting the 6800 GT? Now that the Radeon X800 is available for the Mac, why would you go with NVidia anyway? Is there a particular reason for it? Unless it's price... but if you're running a dual-processor PowerMac G5 with a 30" Cinema Display, I highly doubt that price is an obstacle.



    The ATI card doesn't take up any extra space like the NVidia cards do, runs the 30" and ATI drivers are updated at the same pace as NVidia's with each Mac OS X update. I guess I just don't see the advantage to buying the NVidia card now.



    With the obvious delays of the 6800 DDL, the GT is there to get them out the door and have people running their 30" displays before people start dropping their orders and such, but IMO, the ATI card is the way to go if you're going to be running one of those monsters. Beautiful monsters, that is.




    Because maybe many people (Graphic designers in particular) aren't into games, and don't want anything more than the minimum card to drive their displays? Also, the ATI won't allow for dual 30 inch displays; just one 30" and one ADC/Normal DVI display.



    Although yes, the ATI card will most likely have a better price : performance ratio and free up another slot.
  • Reply 6 of 23
eugene Posts: 8,254
    Placebo, other than baseless conjecture on your part, why would you have any reason to believe an x800 couldn't power two 30" Apple Cinema HD displays?
  • Reply 7 of 23
slughead Posts: 1,169
    Quote:

    Originally posted by benzene

    Does anyone else see the loss of a PCI slot as being as bad a thing as I do? I mean, if you had five or so and blocked one, big deal. But cutting three to two seems like a bad idea.



    I'm really annoyed about this.



    I could've really used that 3rd slot for a hard drive controller for use with that 3rd party thing.



    So my $3000+ computer is limited to:

• 2 PCI cards

• 2 hard drives (sans the 3rd party upgrade which I can't use)



    Thank God it has optical audio otherwise I'd be in a real bind.
  • Reply 8 of 23
slughead Posts: 1,169
    Quote:

    Originally posted by Placebo

    Because maybe many people (Graphic designers in particular) aren't into games, and don't want anything more than the minimum card to drive their displays? Also, the ATI won't allow for dual 30 inch displays; just one 30" and one ADC/Normal DVI display.



    Although yes, the ATI card will most likely have a better price : performance ratio and free up another slot.




    The x800 WILL power the 30" display.



    The x800 WILL probably be cheaper.



    The x800 WILL be faster than the 6800 (U or GT) for non OGL 3d tasks.



    The x800 WILL be faster at core image.



    However, the x800 will not be out for probably 6 months, the 6800 is out now.
  • Reply 9 of 23
louzer Posts: 1,054
    Quote:

    Originally posted by Eugene

    Placebo, other than baseless conjecture on your part, why would you have any reason to believe an x800 couldn't power two 30" Apple Cinema HD displays?



    As stated on xlr8yourmac.com:

    Quote:

    The X800 shown had both one dual link DVI port and an ADC port, where the 6800 Ultra has two dual link DVI ports but no ADC port. (So it would require an external DVI to ADC display adapter w/P.S., although the 6800 can drive two 30in Cinemas at once.)



    http://www.xlr8yourmac.com/archives/oct04/101804.html



    Since it doesn't have two dual-link DVI ports, it can't run two 30" displays.



Note that this was from a 'tech preview' and not official. They haven't even offered specs (so how someone can say it'll be faster is beyond me), nor said when it might be available. So I wouldn't get your panties all bunched up over this (and remember that, on the PC, this is mainly shown as a PCI-Express card, as opposed to AGP, so PC specs aren't going to compare directly unless it's on the AGP version of the cards).
  • Reply 10 of 23
louzer Posts: 1,054
    Quote:

    Originally posted by slughead

    The x800 WILL power the 30" display.



    The x800 WILL probably be cheaper.



    The x800 WILL be faster than the 6800 (U or GT) for non OGL 3d tasks.



    The x800 WILL be faster at core image.



    However, the x800 will not be out for probably 6 months, the 6800 is out now.




    Where did you find out the specs here on the x800, esp. Core Image speed, price, and non-3d tasks?
  • Reply 11 of 23
slughead Posts: 1,169
    Quote:

    Originally posted by Louzer

    Where did you find out the specs here on the x800, esp. Core Image speed, price, and non-3d tasks?



    The card has much more power than the 6800 based on PC benchmarks. There has never been a discrepancy on PC Vs Mac card performance that was big enough to unbalance this kind of supremacy. If both cards are proportionally slower than their PC counterparts (and they will be), then the X800 will be faster.



    http://www.tomshardware.com/



    The price was a "probably" because I don't know 100%. However, I do know that ATi would never sell the thing for $600 :X
  • Reply 12 of 23
placebo Posts: 5,767
    Quote:

    Originally posted by Eugene

    Placebo, other than baseless conjecture on your part, why would you have any reason to believe an x800 couldn't power two 30" Apple Cinema HD displays?



    'Cuz I'm in the loop, and you're not.
  • Reply 13 of 23
aurora Posts: 1,142
    Quote:

    Originally posted by slughead

    The card has much more power than the 6800 based on PC benchmarks. There has never been a discrepancy on PC Vs Mac card performance that was big enough to unbalance this kind of supremacy. If both cards are proportionally slower than their PC counterparts (and they will be), then the X800 will be faster.



    http://www.tomshardware.com/



    The price was a "probably" because I don't know 100%. However, I do know that ATi would never sell the thing for $600 :X




Read a little closer, or look at Doom 3 benches. The X800 is not faster than the 6800; in fact, the opposite. There are also many reviews saying the 6800 GT is a better choice than the X800, which doesn't support DirectX 9.0c. Let's not spin the facts. Also, the 6800 GT is a hundred bucks cheaper than the 6800 Ultra and can be clocked to the same performance. In Apple's world you'll pay $100 more for the same card, whether GT or Ultra, and that really sucks. The 6800 GT is the best buy for performance vs. dollar.
  • Reply 14 of 23
    What's missing from this article is whether Apple told customers they would get a better price on the scaled-down 6800.
  • Reply 15 of 23
mattb Posts: 59
    Quote:

    Originally posted by slughead



    The x800 WILL be faster than the 6800 (U or GT) for non OGL 3d tasks.




    What non OpenGL tasks exactly? As far as I am aware, everything 3D on MacOS X is OpenGL including Quickdraw Extreme.



    Quote:

    The x800 WILL be faster at core image.



    I assume this would only be the case if Core Image didn't use OpenGL extensions. So far I've not seen any confirmation either way. I hope you are right since I plan to buy the x800 but I wouldn't place any bets on the x800 being faster.



    Quote:

    However, the x800 will not be out for probably 6 months, the 6800 is out now.



    If they wait that long they might as well not bother. Both these cards are well into their life now and surely new versions of each core will be released either late this year or early next.
  • Reply 16 of 23
placebo Posts: 5,767
    Quote:

    Originally posted by nebula

    What's missing from this article is whether Apple told customers they would get a better price on the scaled-down 6800.



    It'll be cheaper, without a doubt.
  • Reply 17 of 23
slughead Posts: 1,169
    Quote:

    Originally posted by Aurora

Read a little closer, or look at Doom 3 benches. The X800 is not faster than the 6800; in fact, the opposite. There are also many reviews saying the 6800 GT is a better choice than the X800, which doesn't support DirectX 9.0c. Let's not spin the facts. Also, the 6800 GT is a hundred bucks cheaper than the 6800 Ultra and can be clocked to the same performance. In Apple's world you'll pay $100 more for the same card, whether GT or Ultra, and that really sucks. The 6800 GT is the best buy for performance vs. dollar.



Like I said, for non-OGL 3D tasks the X800 is faster. Doom 3 is a pretty big OGL 3D task.



    The 6800 GT is a best buy, that's why I own one for my PC. But if I really, really wanted performance for the games I play most often, I'd get an X800.



    http://graphics.tomshardware.com/gra...004/index.html



    Check out Farcry, BF Vietnam, and race driver 2, as well as the benchmarks if you want. The rest are all OGL-3D.
  • Reply 18 of 23
mattb Posts: 59
    Quote:

    Originally posted by slughead

Like I said, for non-OGL 3D tasks the X800 is faster. Doom 3 is a pretty big OGL 3D task.



    The 6800 GT is a best buy, that's why I own one for my PC. But if I really, really wanted performance for the games I play most often, I'd get an X800.



    http://graphics.tomshardware.com/gra...004/index.html



    Check out Farcry, BF Vietnam, and race driver 2, as well as the benchmarks if you want. The rest are all OGL-3D.




Those games you mention are all DirectX games. MacOS X does not have DirectX at all, and if these games were ported to the Mac, they would have to be changed to use OpenGL instead. Because of that, these benchmarks don't really tell us anything useful about the Mac version of the x800.



    Unless I am mistaken (someone please correct me if I am), the only hardware 3D rendering API in use on the Mac today is OpenGL and therefore OpenGL performance is the only thing that matters for a Mac video card. Unfortunately the x800 seems a little lacking in that area although how much of that is drivers, I don't know. Personally I'm hoping it is just drivers and that performance will be improved.
  • Reply 19 of 23
    Quote:

    Originally posted by MattB

Those games you mention are all DirectX games. MacOS X does not have DirectX at all, and if these games were ported to the Mac, they would have to be changed to use OpenGL instead. Because of that, these benchmarks don't really tell us anything useful about the Mac version of the x800.



    Unless I am mistaken (someone please correct me if I am), the only hardware 3D rendering API in use on the Mac today is OpenGL and therefore OpenGL performance is the only thing that matters for a Mac video card. Unfortunately the x800 seems a little lacking in that area although how much of that is drivers, I don't know. Personally I'm hoping it is just drivers and that performance will be improved.




    I remember somebody posting here that Apple writes their own OpenGL implementation, so just because NVidia can tweak their PC OpenGL drivers to be faster at Doom III, doesn't necessarily mean that it will be faster on a Mac. In fact, seeing how the cards are roughly equal in DirectX, it seems to me that the biggest difference will be the drivers. OTOH, I seem to recall Alias having problems working well on ATI cards.
  • Reply 20 of 23
mattb Posts: 59
    Quote:

    Originally posted by nguyenhm16

    I remember somebody posting here that Apple writes their own OpenGL implementation, so just because NVidia can tweak their PC OpenGL drivers to be faster at Doom III, doesn't necessarily mean that it will be faster on a Mac. In fact, seeing how the cards are roughly equal in DirectX, it seems to me that the biggest difference will be the drivers. OTOH, I seem to recall Alias having problems working well on ATI cards.



    I think it's true that Apple writes their own OpenGL implementation, but from what others here have said, ATI and NVidia give Apple the source code of their hardware level drivers to build the MacOS X drivers from. That would mean that these are largely the same as those used on Windows with whatever modifications Apple chooses to make.