What will be the new specs for the next PM line?


Comments

  • Reply 81 of 281
    emig647 Posts: 2,455, member
    Quote:

    Originally posted by onlooker

    I can understand that, and why lemon insano is freakin', but wasn't Apple the first to use OpenGL outside of SGI? I think they should have some serious experience with OpenGL. We're also forgetting the guys from Raycer Graphics. Those guys were supposed to be the OpenGL devs from heaven. Didn't they design the Quartz engine?



    I think that ATI thing is an ATI issue with Macs, but the Cinebench beta is an awful test.




    Why don't you tell us about a better test, onlooker.
  • Reply 82 of 281
    I'm not happy about this at all.



    IF the 6800 Ultra supports full Pro 3D drivers then surely Apple would mention that, and it would be marketed as a Quadro or 'pro' card.



    or...



    It would be quite easy to re-badge and activate via OS X as a Quadro card and driver.



    Apple can't be 2,300 OpenGL points behind the equivalent PC card. Hell, Nvidia cards are better at OpenGL than ATI cards, aren't they? Doesn't Nvidia have ex-SGI staff pumping out their excellent GL drivers?



    Surely Apple could draw on that to have excellent GL implementation..?



    It's not the CPU. Dual 2.5s see off everything(!) in rendering bar ridiculously overclocked Xeons and quad Xeons...and the G5 is often giving away up to 1.4 gig(!) on each CPU.



    It's not the physics test either. Here the dual 2.5 gig and 1.25 gig bus thrash the opposition AGAIN!



    But how, HOW can the 'same' card score barely half the points?



    1. Bad Apple GL drivers?

    (When people have said Apple has solid, if not outstanding, GL support?)

    2. Panther is slower than Windows XP at OpenGL

    (do we seriously believe that, considering how snail-like Explorer is versus Safari?)



    Software is WHAT Apple does. They are just about the best. Even IF they are slower, they are always stylish, and there's no way Apple is half the speed of the opposition in anything..? So I don't see the above, in theory, being the bugbear.



    3. Cinebench. Beta. PC optimised.

    (Okay. But can that explain a Mac graphics card, essentially the same as the PC card, scoring half the points? Especially when the CPU and bandwidth do excellently in the other tests...UNLESS, UNLESS the GL acceleration is the last bit of the bench that is not optimised...can anybody get the email address of the Maxon/Cinebench test authors so I can email them about this? I feel so strongly about this. This issue of Mac graphics cards being slower than their PC counterparts has p*********** me off for years...)

    4. Games. It's been noted, consistently, that Mac graphics cards do not perform as well as the PC equiv' on 'hot' games.

    (Well, for a long while, the 'PC ports' have been CPU bound on the Mac, with the lame G4 trying to power the ATI cards onward. BUT, SURELY with the G5, at 2.5 gig (easily the equivalent of a 3-3.2 gig Pentium 4!), and there's two of them, on 1.25 gig bandwidth, using an industry-standard AGP 8x pipe...SURELY this is now a NON-ISSUE?!?!)



    What is causing the problem?



    IF it IS a software problem, related to Apple providing a full GL implementation which slows Apple's GL down or makes it appear slower...then...erm...this IS a 3D test and Apple's GL should shine in this very bench....



    ...perhaps they should split the GL calls into simple implementation for games and precise calls for 3D pro modelling apps.



    However, the ATI X800 XT card presumably has simple GL, i.e. not full FireGL or Quadro drivers? So why, WHY does it do so well in a 3D GL stress test? Modern consumer cards DO have fuller GL implementations than in the past, when cards such as the Voodoo had 'mini GL' for games. But times have moved on since then. The gap has closed significantly. The consumer cards are now ridiculously powered and have fuller GL drivers.



    So much so that Nvidia once proudly posted benches on their site showing how one of their GeForce cards blew a Wildcat out of the water. That was a few years back, though.



    Are there any benches on Quadro vs 6800 Ultra?

    Mac vs PC ATI 9800 XT?



    Accelerate Your Mac? Do they have any dirt to dish on the Mac's dirty secret?



    Nothing to do with fast writes not being supported on the Mac? I thought the announcement of the Nvidia GeForce 2 MX cleared the fast-writes hurdle way back...



    Still scratching my head...I can't see what it is...



    In games, Macs used to be CPU bound.

    Are G5 rigs STILL CPU bound? Do G5s now achieve parity with the same card in games?



    If not, we know it's not just the Cinebench beta that is fishy.



    And nobody said it was an Endian issue.



    Where's the bottleneck?



    Lemon Bon Bon \
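For scale, the "100% slower" complaint above can be sanity-checked with the thread's own rough numbers (the Mac's best card projected at ~1,700 Cinebench OpenGL points, said to be ~2,300 points behind the equivalent PC card). This is back-of-envelope arithmetic on those quoted figures, not measured data:

```python
# Back-of-envelope check on the Cinebench OpenGL gap described above.
# Both inputs are the thread's own rough figures, not benchmark results.
mac_score = 1700          # projected Mac score quoted in the thread
gap = 2300                # "2,300 points behind" claim
pc_score = mac_score + gap

ratio = pc_score / mac_score          # how many times faster the PC card scores
mac_fraction = mac_score / pc_score   # what share of the PC score the Mac reaches

print(f"PC score ~ {pc_score}")
print(f"PC/Mac ratio ~ {ratio:.2f}x")
print(f"Mac at ~ {mac_fraction:.0%} of the PC result")
```

So "over 100% slower" really means the Mac card reaches well under half the PC score, which is the same claim stated less confusingly.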
  • Reply 83 of 281
    hmurchison Posts: 12,431, member
    I'm thinking a bit that Apple is focusing their efforts on getting OpenGL 2.0 into Tiger. The current OpenGL isn't bad, but it's still not as speedy as it could be. OpenGL is important to Apple, particularly the new extensions and shader support. Tiger should focus on the implementation more so than Jaguar or Panther did, IMO.



    I'm almost wondering if Apple does indeed have plans to add the Quadro line but is simply going to wait until they can support PCI Express. If I were Apple, I'd do this. It wouldn't hurt sales of PCI Express Power Macs, and I could center my driver code around PCI Express, not that it requires a lot of changes.



    Rumors had it that Apple and 3Dlabs were talking about the Wildcats, so we know Apple is indeed looking for bigger iron. Well, Apple will have more incentive after they buy Luxology.



    It's amazing, but we're still talking first-generation hardware with today's Macs. I'm ready to see the successor and what surprising stuff may show up. I have faith that Apple is going to beef up the hardware soon. They made a nice splash at SIGGRAPH and will look to extend that goodwill in the future.
  • Reply 84 of 281
    Good post.



    My thinking is that Tiger is an opportunity to beef up their software and hardware specs for GL.



    OpenGL 2. A faster implementation. Part of Tiger. Antares.



    Cards. Quadro. GeForce whatevers also by then. 512 meg cards may be a reality by then.



    And what of a stackable SLI option for Mac Nvidia customers?



    IF Apple is going to play with the workstation crowd...then surely 3D is the natural next step, especially with workstations in use at a closely affiliated Steve Jobs company...



    Wouldn't they (Pixar) want Quadros too?



    As for Luxology?



    Why not buy both them and Newtek?



    Newtek are deeply ingrained in broadcast graphics.



    It's the final piece of Apple's workstation line of software?



    That or XSI version 4?



    And Maya was just spun off solo for a reported $26 million? Surely not? Apple would have snapped 'em up at that price...?



    Either way, Mac graphics cards have got to get better than those GL benches. It's, surprisingly, the weakest link in an otherwise great workstation.



    Antares can't come soon enough. I want my 3 gig plus Mac.



    And I want it now.



    Next spring seems a long time to wait. And they'd better get quick to it with a Wildcat or Quadro...



    Lemon Bon Bon
  • Reply 85 of 281
    hmurchison Posts: 12,431, member
    Accel-KKR purchased Alias for $56 million.



    Perhaps a bit spendy for Apple. Most of Alias' tools are PC. I'm sure Apple doesn't have a problem with a few PC apps, but having too many means more work for them. Luxology wouldn't have this problem: just OS X and Windows, and two products max. Luxology and Newtek would be like inviting the Hatfields and the McCoys to the same party... ouch.



    Apple needs some 3D brainpower in the company. They get that cheaply with Luxology and Stuart Ferguson, Allen Hastings and Brad Peebler. By OS X 10.5 Apple needs a Core3D API for developers to start integrating 3D into applications.



    Power Macs with PCI Express should support Nvidia SLI. All you need is a 16x PCIe slot and an 8x. The space and power requirements seem to be the bigger potential issues.



    I think the next-gen Power Macs will have either two optical drives or two additional drive bays. My money is on the two drive bays.



    I'd like to see dual gigabit ports as well, with trunking. Gigabit is now ready for consumer computers, so the next step for workstations is trunked gigabit and multiple ports.



    And last but not least, how about a fanless power supply to cut noise down drastically?
  • Reply 86 of 281
    rhumgod Posts: 1,289, member
    Quote:

    Originally posted by hmurchison

    I'd like to see dual gigabit ports as well with trunking. Gigabit is now ready for consumer computers so the next step for workstations is trunked gigabit and multiple ports.



    I can see dual gigabit, but trunking is a switch-level thing. Why incorporate that into the computer?
  • Reply 87 of 281
    hmurchison Posts: 12,431, member
    Quote:

    Originally posted by Rhumgod

    I can see dual gigabit, but trunking is a switch level thing. Why incorporate that into the computer?



    It saves using a PCI slot later to add trunking features, and creates product differentiation between the consumer lines, which should have gigabit standard IMO at $1k and above, and the workstations, which should have dual gigabit, enabling trunking with the right switch and software (Tiger Server). I'm looking long term at the probability of fast switches and $499 Xsan clients linked via multiple trunked gigabit connections. Cheaper than Fibre Channel either way you look at it.



    Granted, it is a bit frivolous, but sometimes the high end needs to be impractical in ways.
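The "trunking" hmurchison is describing is link aggregation (802.3ad/LACP). As a rough sketch of what the host side looks like, here is the Linux bonding-driver style of the era; interface names and addresses are illustrative, not a tested config. (Mac OS X Server later exposed the same idea as a "bond" interface.)

```shell
# Aggregate two gigabit ports into one logical "trunked" link.
# mode=802.3ad selects LACP; miimon=100 checks link state every 100 ms.
modprobe bonding mode=802.3ad miimon=100

# Bring up the aggregate interface with one IP for both physical ports.
ifconfig bond0 192.168.1.10 netmask 255.255.255.0 up

# Enslave both gigabit NICs to the bond.
ifenslave bond0 eth0 eth1
```

The switch end must be configured for LACP on the same two ports, which is why trunking gets called "a switch-level thing" earlier in the thread: the host alone cannot aggregate.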
  • Reply 88 of 281
    onlooker Posts: 5,252, member
    Quote:

    Core3D API



    Dude, now you're talking my language! That is the best freaking idea I've ever heard.





    Quote:

    Originally posted by hmurchison





    Apple needs some 3D brain power in the company. They get that cheaply with Luxology and Stuart Ferguson, Allen Hastings and Brad Peebler. By OSX 10.5 Apple needs a Core3D API for developers to start integrating 3D into applications.



    Powermacs with PCI Express should support Nvidia SLI. All you need is 16x PCIe and 8x. The space issue and power requirements seem to be bigger potential issues.



    I think the nextgen Powermacs will have either two optical drives or two additional drive bays. My money is on the two drive bays.



    I'd like to see dual gigabit ports as well with trunking. Gigabit is now ready for consumer computers so the next step for workstations is trunked gigabit and multiple ports.



    And last but not least how about a fanless Powersupply to cut noise down drastically.




    I just love this post.





    Apple buying Luxology would be a wet dream come true. And that has actually happened to me before.
  • Reply 89 of 281
    emig647 Posts: 2,455, member
    Lemon Bon Bon,



    I saw you comparing the Xeon to G5s earlier. People keep doing this and I do NOT know why. The top PC procs are the Opteron 150 and 200. The 150 smoked the Xeon 3.6GHz in all tests done by AnandTech. It averaged about 2 times the speed. The Opteron 150 is 200-300 dollars cheaper than the Xeon 3.6 (when it comes out) and it's faster. You can buy quad-proc mobos for the Opteron as well. I just wanted to point out that you should be comparing the G5 with the Opteron (also a 64-bit proc) instead of the Xeon (a 32-bit proc). And yes, the Opteron is ... don't say it, emig ... gulp ... faster than the G5 in all tests.



    IBM is doing a good job in my opinion. It is on par with the Xeon, but unfortunately AMD has us by the nuts.
  • Reply 90 of 281
    onlooker Posts: 5,252, member
    Quote:

    Originally posted by emig647

    Lemon Bon Bon,



    I saw you comparing the Xeon to G5s earlier. People keep doing this and I do NOT know why. The top PC procs are the Opteron 150 and 200. The 150 smoked the Xeon 3.6GHz in all tests done by AnandTech. It averaged about 2 times the speed. The Opteron 150 is 200-300 dollars cheaper than the Xeon 3.6 (when it comes out) and it's faster. You can buy quad-proc mobos for the Opteron as well. I just wanted to point out that you should be comparing the G5 with the Opteron (also a 64-bit proc) instead of the Xeon (a 32-bit proc). And yes, the Opteron is ... don't say it, emig ... gulp ... faster than the G5 in all tests.



    IBM is doing a good job in my opinion. It is on par with the Xeon, but unfortunately AMD has us by the nuts.




    Xeon still renders faster though.
  • Reply 91 of 281
    emig647 Posts: 2,455, member
    Renders faster in what program? In every test I have seen, the Opteron 150 smashes the Xeon.



    Example:

    http://www.anandtech.com/linux/showdoc.aspx?i=2163&p=4



    Looks to me like the Opteron 150 is significantly ahead!



    Where's your proof?



    Look at all of those benches. The Xeon doesn't come close to the Opteron in any of those tests... you can't convince me without proof that the Xeon renders faster.
  • Reply 92 of 281
    pb Posts: 4,255, member
    Quote:

    Originally posted by Lemon Bon Bon

    2,000(!) OpenGL points and more behind an X800 XT card on the PC.



    The Mac's best card is projected at 1,700 on a DUAL 2.5 gig?



    WHAT THE HELL IS GOING ON!?!?!?





    Apart from all the factors already discussed here, maybe this discussion would serve as a hint of what else could be happening.
  • Reply 93 of 281
    emig647 Posts: 2,455, member
    PB,



    ROFLMAO!!!!!!



    I'm sure that has a pretty big impact. And as you noticed, a lot of those machines in Cinebench were overclocked... if someone has the wisdom to overclock a PC CPU, I'm sure they are smart enough to use Coolbits or PowerStrip and overclock their GPU also.



    I'm kind of insulted because I bought a 9600 XT. If they did that to the 9600 XT, I'm sure they did it to the 9800 XT too. I wish someone could verify those clock speeds. Really, the 6800 Ultra is the only card that was worth upgrading to. If I were a gamer or a diehard graphics nut I would have got one and waited. Oh well. That's what PCs are for.
  • Reply 94 of 281
    amorph Posts: 7,112, member
    Quote:

    Originally posted by Lemon Bon Bon

    I'm not happy about this at all.



    IF the 6800 Ultra supports full Pro 3D drivers then surely Apple would mention that, and it would be marketed as a Quadro or 'pro' card.




    It's up to NVIDIA to say whether it's a GeForce or a Quadro. If they don't ship Apple the firmware, test and approve the driver + OpenGL + card + hardware with the major applications, etc., they're not going to call it a Quadro. If Apple tries, they'll jump down Apple's throat.



    Quote:

    Apple can't be 2,300 OpenGL points behind the equivalent PC card. Hell, Nvidia cards are better at OpenGL than ATI cards, aren't they? Doesn't Nvidia have ex-SGI staff pumping out their excellent GL drivers?



    Apple's NVIDIA drivers are ported PC code (and they started out really shaky, unlike NVIDIA's rock-solid PC drivers), and NVIDIA's SGI veterans are probably working on the full OpenGL implementation, which Apple doesn't use because they have their own.



    Quote:

    1. Bad Apple GL drivers?

    (When people have said Apple has solid, if not outstanding, GL support?)




    This is certainly possible. OpenGL is absolutely immense, so optimizing it is no mean feat. In the meantime, as I've been saying, completeness and accuracy trump speed in the pro space.



    As for Cinebench, I'd just write that off and look at the performance of 3D apps in actual use. I have a sneaking suspicion that it's not unlike cross-platform Premiere benchmarking...



    Quote:

    4. Games. It's been noted, consistently, that Mac graphics cards do not perform as well as the PC equiv' on 'hot' games. (Well, for a long while, the 'PC ports' have been CPU bound on the Mac, with the lame G4 trying to power the ATI cards onward. BUT, SURELY with the G5, at 2.5 gig (easily the equivalent of a 3-3.2 gig Pentium 4!), and there's two of them, on 1.25 gig bandwidth, using an industry-standard AGP 8x pipe...SURELY this is now a NON-ISSUE?!?!)



    Un-optimized code — or worse, code optimized for a completely different platform — can squander any potential hardware advantage completely. Given that the Mac games market is not generally seen as lucrative by the larger publishers, they don't spend a lot of money on the port, and so you get code optimized for a Pentium 4 running on something it was never designed to run on, and maybe you get lucky and it's not a big deal, and maybe you get hammered. A lot of the tricks for speeding up PPCs involve cache hinting instructions and other processor- and ISA-specific tricks; the PC game code might use the P4's analogs, and the port might strip them out and replace them with nothing, for want of time to do proper profiling and optimizing and (if necessary) refactoring.



    Quote:

    However, the ATI x800 XT card presumably has simple GL? ie not full Fire or Quadro drivers? So why, WHY does it do so well in a 3D GL stress test?



    Partly because Carmack has been breathing down their necks, and partly because DirectX is getting remarkably full-fledged itself, to the point where the OpenGL ARB has basically released 80% of OpenGL 2.0 already (as extensions) just to keep up.



    It used to be true that pro cards were completely different architectures. Now it isn't. It used to be true that they were faster. Now it isn't. Now it's true that their OpenGL implementations are more complete. Eventually, that won't be true either. The pro 3D card market is entering the same death spiral that the pro workstation market ($10k+ UNIX desktops) has been in. Nevertheless, for now, it's still true that you are guaranteed a complete OpenGL with the pro cards.



    Otherwise, we wouldn't be having this discussion, would we? onlooker and others would be happily working on their high-end consumer GPUs while Quadro owners looked on in envy.



    Quote:

    Are there any benches on Quadro vs 6800 Ultra?

    Mac vs PC ATI 9800 XT?




    If the bench doesn't note differences in precision, and also any tests that one card can run and the other can't, then it's not particularly meaningful. If it uses an app like Maya that intentionally withholds densely optimized OpenGL code from consumer cards, it's even less meaningful, because the 6800 (e.g.) will be twiddling its thumbs.
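The gating amorph describes (an app like Maya withholding its densely optimized OpenGL path from consumer cards) boils down to checking what the driver advertises. A toy sketch: in a real app the two strings would come from glGetString(GL_RENDERER) and glGetString(GL_EXTENSIONS); here they are hard-coded, and the pro-card allow-list is purely illustrative.

```python
# Illustrative sketch: gate an optimized render path on the driver's
# advertised renderer string and extension list. Names are examples only.

REQUIRED = {"GL_ARB_vertex_buffer_object", "GL_ARB_fragment_program"}
PRO_RENDERERS = ("Quadro", "FireGL", "Wildcat")  # hypothetical allow-list

def use_optimized_path(renderer: str, extensions: str) -> bool:
    """Enable the fast path only for recognized pro cards that also
    advertise every required extension."""
    exts = set(extensions.split())
    is_pro = any(name in renderer for name in PRO_RENDERERS)
    return is_pro and REQUIRED.issubset(exts)

consumer = use_optimized_path(
    "GeForce 6800 Ultra",
    "GL_ARB_vertex_buffer_object GL_ARB_fragment_program")
pro = use_optimized_path(
    "Quadro FX 4000",
    "GL_ARB_vertex_buffer_object GL_ARB_fragment_program")
print(consumer, pro)  # prints "False True"
```

Note the consumer card is refused even though it advertises the same extensions, which is exactly why a 6800 can sit "twiddling its thumbs" in such an app while the benchmark says nothing about raw hardware speed.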
  • Reply 95 of 281
    I have it on good info that it will be a



    Dual G5 4GHz

    2-4GB 667MHz DDR2

    8x Dual Layer DVD-RW

    Silent Water Cooling

    ATI X800 Pro standard

  • Reply 96 of 281
    hmurchison Posts: 12,431, member
    Quote:

    I have it on good info that it will be a



    WTF???? No 10G Ethernet! No buy!
  • Reply 97 of 281
    Quote:

    I have it on good info that it will be a



    Dual G5 4GHz

    2-4GB 667MHz DDR2

    8x Dual Layer DVD-RW

    Silent Water Cooling

    ATi X800pro standard



    When?



    WHEN!?



    Lemon Bon Bon \
  • Reply 98 of 281
    davegee Posts: 2,765, member
    Quote:

    Originally posted by industryMan

    I have it on good info that it will be a



    Dual G5 4GHz

    2-4GB 667MHz DDR2

    8x Dual Layer DVD-RW

    Silent Water Cooling

    ATi X800pro standard





    You forgot to add dual bonded Ultra Wide Band and 16x HD-DVD quad layer blue laser.
  • Reply 99 of 281
    @homenow Posts: 998, member
    Quote:

    Originally posted by Lemon Bon Bon

    When?



    WHEN!?



    Lemon Bon Bon \




    2006... Come on, IBM hasn't even shipped a 3 GHz part, and only recently started shipping 2.5s after a bit of a delay.
  • Reply 100 of 281
    onlooker Posts: 5,252, member
    Yeah, but when is Tiger due anyway? Last I recall, SJ said it was like a year and a day away.