Apple introduces Power Mac G5 Quad & Power Mac G5 Dual


Comments

  • Reply 21 of 176
    pb Posts: 4,248 member
    Quote:

    Originally posted by Algol

    I mean what is the point of PCIe when the cards don't even need it?



    I don't know exactly how Quartz2D Extreme works. If it needs data from the GPU back to the CPU, then the bidirectional PCIe bus would make a big difference in graphics performance.
  • Reply 22 of 176
    melgross Posts: 33,334 member
    Quote:

    Originally posted by PB

    I don't know exactly how Quartz2D Extreme works. If it needs data from the GPU back to the CPU, then the bidirectional PCIe bus would make a big difference in graphics performance.



    The other thing, of course, is that PCI-X is obsolete. Apple is just about the only one left using it, and even they aren't using it anymore.



    There may be low end cards supplied with the machine, but this isn't an iMac, you don't have to buy those cards. Get the 7800 when they list it instead. You can wait the short time.



    More powerful cards will never work well on the AGP bus of the old Power Macs. It supplies only 2GB/s to the card; PCI Express supplies 4GB/s.



    Look at the old G4s: the 4x AGP bus had only 1GB/s, and the 2x had 500MB/s.



    Newer generations of cards can't work on a bus that doesn't have the bandwidth, or they will be starved. The same thing happens if the CPU can't fill the video bus, so a balance is needed.
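    A rough sanity check of the bus figures above (nominal peak rates as quoted in the post, not measured throughput):

    ```python
    # Nominal peak graphics-bus bandwidth in GB/s, per the figures above.
    # These are spec-sheet rates, not real-world benchmarks.
    buses = {
        "AGP 2x": 0.5,    # ~500 MB/s
        "AGP 4x": 1.0,    # ~1 GB/s (older G4 Power Macs)
        "AGP 8x": 2.0,    # ~2 GB/s (last AGP Power Macs)
        "PCIe x16": 4.0,  # ~4 GB/s per direction, and it's bidirectional
    }

    baseline = buses["AGP 8x"]
    for name, gbps in buses.items():
        print(f"{name}: {gbps} GB/s ({gbps / baseline:.1f}x AGP 8x)")
    ```

    Each generation roughly doubles the previous one, which is the point being made: a card designed around a 4GB/s link will be starved on a 1-2GB/s bus.
    
    
    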



    There won't be other cards available either, such as SATA cards.
  • Reply 23 of 176
    Quote:

    Should they have supplied the 6600GT instead? Maybe, but it could have been out of their price target.



    Agreed.



    And nope.



    The 6600 GT is about £100. For Apple? Cheaper. And it's about twice as fast as the suck-egg 6600 vanilla.



    Same with x600 on iMac. The x600 is an improvement. But they could have reached for the x800.



    Apple seems to want to skimp on GPUs. On the Power Macs, why do they ship such lame GPUs on flagship machines? It's pathetic.



    Really, for the price?



    Still, as Melgross says, it's common knowledge that the 7800 GT kicks the snot out of a 6600. For 3d and games? 7800GT.



    It gives nigh on the performance of the Quadro listed on Apple's site. Seems to push out more vertices, too...



    The Quad Power Mac gives a 60% speedup in Lightwave benches.



    That's the biggest leap since Apple went from G4 to G5.



    It's time for Lemon Bon Bon to buy. I'll be getting one of these and sticking 4 to 8 gigs of ram in there.



    I won't order until the 7800GT is listed. Hopefully, I'll have mine by X-mas!



    Ah, what the heck, I'll get a 30 inch LCD as well....at that price...too tempting...



    In the light of the original dual G5 2 gig surprise a few years back? Is this Quad awesome? It should have been dual dual-core 3 gig. But who cares. It's about as good as we're going to get. We got 4 CPUs. A Quadro. PCI Express. 16 gig RAM support. A terabyte of HD capacity. I mean...what more can you ask for, really? It's a hundred or so more. Big deal.



    The wait is over.



    Lemon Bon Bon



    PS. Onlooker can stop whining now. He has his pro-card. Congrats Apple...wut took yer so long?
  • Reply 24 of 176
    I'm thinking of upgrading my dual 2.5 (2x250, 6 gig, AirPort, Bluetooth, 6800 Ultra) machine. The Quad looks to be awesome; however, I believe the 7800 to be better for Motion than the Quad's card. What does everyone else think?
  • Reply 25 of 176
    jeffdm Posts: 12,951 member
    Quote:

    Originally posted by bjewett



    4 - Anyone know if most users need ECC memory? I have heard gripes about large Apple cluster systems because they did not have ECC, so I think it is good to have as an option. For our typical single-desktop user ... is this a good idea?




    It's not really necessary, but I think it helps. With the computers I've owned, the ones with ECC RAM have *never* had stability problems when running Windows 2000 or NT.
  • Reply 26 of 176
    melgross Posts: 33,334 member
    Quote:

    Originally posted by Lemon Bon Bon

    Agreed.



    And nope.



    The 6600 GT is about £100. For Apple? Cheaper. And it's about twice as fast as the suck-egg 6600 vanilla.



    Same with x600 on iMac. The x600 is an improvement. But they could have reached for the x800.



    Apple seems to want to skimp on GPUs. On the Power Macs, why do they ship such lame GPUs on flagship machines? It's pathetic.



    Really, for the price?



    Still, as Melgross says, it's common knowledge that the 7800 GT kicks the snot out of a 6600. For 3d and games? 7800GT.



    It gives nigh on the performance of the Quadro listed on Apple's site. Seems to push out more vertices, too...



    The Quad Power Mac gives a 60% speedup in Lightwave benches.



    That's the biggest leap since Apple went from G4 to G5.



    It's time for Lemon Bon Bon to buy. I'll be getting one of these and sticking 4 to 8 gigs of ram in there.



    I won't order until the 7800GT is listed. Hopefully, I'll have mine by X-mas!



    Ah, what the heck, I'll get a 30 inch LCD as well....at that price...too tempting...



    In the light of the original dual G5 2 gig surprise a few years back? Is this Quad awesome? It should have been dual dual-core 3 gig. But who cares. It's about as good as we're going to get. We got 4 CPUs. A Quadro. PCI Express. 16 gig RAM support. A terabyte of HD capacity. I mean...what more can you ask for, really? It's a hundred or so more. Big deal.



    The wait is over.



    Lemon Bon Bon



    PS. Onlooker can stop whining now. He has his pro-card. Congrats Apple...wut took yer so long?




    Vertices and triangles are very different beasts. A vertex is a much more easily calculated element. A triangle, which pro graphics cards are rated on, takes much more computing power.



    Also game cards such as the 7800 don't do well at all in a pro app. A $600 game card will always lose to a $600 pro card in a pro 3D app.
  • Reply 27 of 176
    noirdesir Posts: 1,027 member
    Quote:

    Originally posted by imiloa

    Given that the 2.5 Quad is only 40-70% faster than the old 2.7 Dual, and assuming Apple pushed the test limits for their marketing performance numbers, it seems likely a 970MP is at most 75% faster than a single 970 at the same clock speed.





    No, I think it means that the four-processor (core) machine with 1MB of L2 cache, under (probably) ideal conditions, is about twice as fast as a similarly clocked dual-processor machine (4×2.5 ≈ 1.88×2×2.7, LinPack).



    (Insert Apple's performance claims for the other apps into this formula to see how effectively they run on quad machines.)
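    The parenthetical arithmetic is easy to check (a sketch; the 1.88× LinPack figure and clock speeds are the poster's numbers, not official ones):

    ```python
    # Implied scaling: 4 cores @ 2.5 GHz vs 2 cores @ 2.7 GHz,
    # with the quad measured at ~1.88x the dual on LinPack (per the post).
    quad_aggregate = 4 * 2.5   # 10.0 GHz of aggregate clock
    dual_aggregate = 2 * 2.7   # 5.4 GHz of aggregate clock
    measured_speedup = 1.88

    clock_ratio = quad_aggregate / dual_aggregate  # raw clock advantage, ~1.85x
    efficiency = measured_speedup / clock_ratio    # ~1.0 means near-perfect scaling
    print(f"clock ratio: {clock_ratio:.2f}, scaling efficiency: {efficiency:.2f}")
    ```

    An efficiency near 1.0 supports the reading that, per clock, the quad scales almost perfectly over the dual under ideal (LinPack) conditions.
    
    
    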
  • Reply 28 of 176
    Quote:

    Vertices and triangles are very different beasts. A vertex is a much more easily calculated element. A triangle, which pro graphics cards are rated on, takes much more computing power.



    I thought as much.



    But they didn't show the 'triangle' count for the 7800GT.



    There isn't a pro card for the same price as a 7800GT on the Mac.



    The Quadro costs £1100. Ouch.



    Vs...I'm guessing, £300 for the 7800.



    In the gaming bench? Not much in it. There wasn't a lightwave bench for the two cards. But I question if the Quadro is £800 faster/better. But it's nice to have a Pro card for 'triangle/Anti-aliasing' grunt/finesse.



    Lemon Bon Bon
  • Reply 29 of 176
    neilw Posts: 77 member
    My beef is that they should have 2.3, 2.5, and dual 2.5 as the three models. I don't understand putting the large gap between 2.3 and dual 2.5...
  • Reply 30 of 176
    algol Posts: 833 member
    Quote:

    Originally posted by melgross

    Let me explain a little about what Apple is thinking when they decide which boards to supply.



    First of all, Apple's machines are expensive compared to PC cost/performance. Adding a more expensive card to the base configuration would raise the price even further.



    Second, you have to know the mix of customers for these pro machines.



    The users doing 2D such as photo, graphics, audio, and publishing are perfectly served by the boards supplied. None need a fast 3D board. Most Powermacs are sold to this customer base. Putting a more expensive board in would just cause them to feel as though they were paying for something they don't need.



    The second and third groups are the gamers, for whom a fast 3D board is something they will pay for, and have the option to, and the 3D pros who, until now, had only game boards to select from. A poor choice for them.



    Should they have supplied the 6600GT instead? Maybe, but it could have been out of their price target.




    Dude, you don't get my point. My point is that they don't supply anything instead. The 15000000 billion dollar card doesn't count. Why can't Apple supply a decent card for 200 more or so? I don't need a 500 or 1000 or 1500 dollar GPU. However, there is no point in upgrading if I am going to get stuck with a card that is worse than the one I have, and a processor that may or may not be much faster. DDR2 really isn't a selling point for me.
  • Reply 31 of 176
    melgross Posts: 33,334 member
    Quote:

    Originally posted by Lemon Bon Bon

    I thought as much.



    But they didn't show the 'triangle' count for the 7800GT.



    There isn't a pro card for the same price as a 7800GT on the Mac.



    The Quadro costs £1100. Ouch.



    Vs...I'm guessing, £300 for the 7800.



    In the gaming bench? Not much in it. There wasn't a lightwave bench for the two cards. But I question if the Quadro is £800 faster/better. But it's nice to have a Pro card for 'triangle/Anti-aliasing' grunt/finesse.



    Lemon Bon Bon




    For games, it's good, simply because it's a top workstation card. But a $1,000 card would be bested by the 7800 or the ATI X1800.
  • Reply 32 of 176
    melgross Posts: 33,334 member
    Quote:

    Originally posted by Algol

    Dude, you don't get my point. My point is that they don't supply anything instead. The 15000000 billion dollar card doesn't count. Why can't Apple supply a decent card for 200 more or so? I don't need a 500 or 1000 or 1500 dollar GPU. However, there is no point in upgrading if I am going to get stuck with a card that is worse than the one I have, and a processor that may or may not be much faster. DDR2 really isn't a selling point for me.



    I get your point. Apple is supplying the 7800, which will go for a few hundred more. For some reason it isn't being shown in the store as yet, but they mention it several times on the site. You don't even need the 9800 if all you do is 2D. Those are the ones who will be happy with the 6600 series. I won't go from a 9800 to a card slightly better; I will go to the 7800 or equivalent. Apple knows that.



    Look, let's face it, Apple just doesn't sell enough PM's for board makers to rush in supporting it.



    These machines won't be available for 3 or 4 weeks according to the store. Possibly by that time ATI will announce cards.



    I don't see the point in worrying now. We've waited so long that we can wait a bit longer. I'm not buying the quad until January. Let Apple get the kinks out. This is a totally new design for them. By January we may very well see more cards from Apple as well as third parties. I can't imagine that ATI will stand aside.
  • Reply 33 of 176
    Love the markup on RAM. 16GB for $9500....that's funny. Quick price search found it for $3200. Do they really expect people to not check and just overpay by 200%?
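    The markup math checks out (prices as quoted in the post):

    ```python
    # RAM markup: Apple's 16GB price vs a quick street-price search
    # (figures from the post above).
    apple_price = 9500
    street_price = 3200
    markup = (apple_price - street_price) / street_price
    print(f"markup: {markup:.0%}")  # just under 200%
    ```

    $9,500 against $3,200 is a markup of roughly 197%, hence "overpay by 200%".
    
    
    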
  • Reply 34 of 176
    algol Posts: 833 member
    Quote:

    Originally posted by melgross

    I get your point. Apple is supplying the 7800, which will go for a few hundred more. For some reason it isn't being shown in the store as yet, but they mention it several times on the site. You don't even need the 9800 if all you do is 2D. Those are the ones who will be happy with the 6600 series. I won't go from a 9800 to a card slightly better; I will go to the 7800 or equivalent. Apple knows that.



    Look, let's face it, Apple just doesn't sell enough PM's for board makers to rush in supporting it.



    These machines won't be available for 3 or 4 weeks according to the store. Possibly by that time ATI will announce cards.



    I don't see the point in worrying now. We've waited so long that we can wait a bit longer. I'm not buying the quad until January. Let Apple get the kinks out. This is a totally new design for them. By January we may very well see more cards from Apple as well as third parties. I can't imagine that ATI will stand aside.




    Well we'll see if it is just a few hundred more. I'm thinking it'll be 400 at least.
  • Reply 35 of 176
    geobe Posts: 235 member
    If the overall CPU capabilities have now been bumped up 60-90%, does it mean that iChat could possibly support more video chat windows? Would it be possible that Apple gave it enough juice to support at least 5 video windows now?



    That would be true board room style conferencing.





    Let's all vote for SJ to be the next person on The Apprentice after Martha Stewart.



    He would be awesome. If the rumors are true, he wouldn't have to change a single management style.
  • Reply 36 of 176
    sunilraman Posts: 8,133 member
    Quote:

    Originally posted by Lemon Bon Bon

    I thought as much.



    But they didn't show the 'triangle' count for the 7800GT.

    ....



    Lemon Bon Bon






    The Nvidia 6600, 6800, and 7800 series are predominantly for games, not pro 3D creation. So no triangle count there, just vertices.
  • Reply 37 of 176
    booga Posts: 1,081 member
    Quote:

    Originally posted by geobe

    If the overall CPU capabilities have now been bumped up 60-90%, does it mean that iChat could possibly support more video chat windows? Would it be possible that Apple gave it enough juice to support at least 5 video windows now?





    On a dual-2.0GHz G5 on a cable modem, a 3 person conference (me and two other people) gets significantly "fuzzier". I don't think they really have the technology to do the existing 4 person that they offer today, let alone more.
  • Reply 38 of 176
    sunilraman Posts: 8,133 member
    Quote:

    Originally posted by KidRed

    What's the 7800? Damn it, I just ordered the 2.3 with the 6600!! How much better is the 7800 and how much more does it cost? Wondering if I should try and get my order changed?



    DUDE, DO NOT GET THE 6600. i have a 6600GT, which is better than the 6600, and which i bought a few months ago for US$170.



    you want the 7800GT. note: it's not the same as the 7800GTX, but i hear the 7800GT gives a nice balance of heat/power/performance/etc., and it's an amazing fucking beast of a graphics card. if you're after games and opengl performance, 7800GT. if you are actually doing a lot of pro 3d creation, then yeah, get the quadro.



    wait a bit, i'm sure apple will have the 7800gt available soonish. or just call them up and see what they can do with your order.



    DO NOT GET THE 6600.
  • Reply 39 of 176
    algol Posts: 833 member
    Quote:

    Originally posted by sunilraman

    DUDE, DO NOT GET THE 6600. i have a 6600GT, which is better than the 6600, and which i bought a few months ago for US$170.



    you want the 7800GT. note: it's not the same as the 7800GTX, but i hear the 7800GT gives a nice balance of heat/power/performance/etc., and it's an amazing fucking beast of a graphics card. if you're after games and opengl performance, 7800GT. if you are actually doing a lot of pro 3d creation, then yeah, get the quadro.



    wait a bit, i'm sure apple will have the 7800gt available soonish. or just call them up and see what they can do with your order.



    DO NOT GET THE 6600.




    /signed
  • Reply 40 of 176
    sjk Posts: 603 member
    Quote:

    Originally posted by Jeff Leigh

    Love the markup on RAM. 16GB for $9500....that's funny. Quick price search found it for $3200.



    So, does that mean we can find 2GB DIMMs for the new iMac for ~$200, too?



    Quote:

    Do they really expect people to not check and just overpay by 200%?



    They probably add anyone who pays the Apple tax to a "special customer" (a.k.a. sucker) list.