What will be the new specs for the next PM line?


Comments

  • Reply 201 of 281
emig647 Posts: 2,455 member
It does sound nice... too bad it's far away
  • Reply 202 of 281
rickag Posts: 1,626 member
    Quote:

    Originally posted by emig647

It does sound nice... too bad it's far away



According to James @ Chipworks, IBM is qualifying low-k dielectrics and will be in production soon.



    Quote from the article

    "They have been reported to be qualifying low-k during the last few months, and we are expecting to see low-k product soon..."



As I understand it, low-k won't necessarily allow for large increases in clock speed, but it lowers crosstalk between metal traces, reduces some electron leakage, and allows running the processors at lower voltages. So, a speed bump plus lower wattage requirements = ? And soon, according to James @ Chipworks.
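For anyone who wants the back-of-the-envelope version, here's a rough sketch of why low-k helps, using the standard first-order RC wire-delay model (the constants below are textbook approximations, not figures from the Chipworks article):

```latex
% Wire delay is proportional to the RC product of the metal traces,
% and line-to-line capacitance scales with the dielectric constant k:
\[
  t_{\text{delay}} \propto R\,C, \qquad C = \frac{k\,\varepsilon_0\,A}{d}
\]
% Moving from conventional SiO2 (k of about 3.9) to a low-k material
% (k of about 2.7) cuts interconnect capacitance by roughly 30%:
\[
  \frac{C_{\text{low-}k}}{C_{\text{SiO}_2}} \approx \frac{2.7}{3.9} \approx 0.69
\]
% Less C means less crosstalk, less delay, and less dynamic switching
% power (P = C V^2 f), which is the speed-bump-plus-lower-wattage
% combination described above.
```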



If I have made some errors, will some of the more knowledgeable AppleInsiders please correct them?
  • Reply 203 of 281
onlooker Posts: 5,252 member
I saw the Chipworks article too, but not being a CPU-technical-minded carbon-based life form, I really didn't know what it would mean for future PPC production. It all sounded promising, but they didn't mention what the #'s were before they started using the SSDOI (Strained Silicon Directly On Insulator) technique, or how much debris was present before they were using it.

I was hoping this was the hurdle IBM needed to leap, and that it was the problem the PPC has had since the Moto G4's speed and scaling problems.



That would be the dream of a lifetime.
  • Reply 204 of 281
    Quote:

    Originally posted by Lemon Bon Bon

    I'm not happy about this at all.



IF the 6800 Ultra supports full Pro 3D drivers then surely Apple would mention that, and it would be marketed as a Quadro or 'pro' card.



    or...



    It would be quite easy to re-badge and activate via OS X as a Quadro card and driver.



Apple can't be 2300 Open GL points behind the PC card equivalent. Hell, Nvidia cards are better at Open GL than ATI cards, aren't they? Don't Nvidia have ex-SGI staff pumping out their excellent GL drivers?



    Surely Apple could draw on that to have excellent GL implementation..?



It's not the CPU. Dual 2.5s see off everything(!) in rendering bar ridiculously overclocked Xeons and quad Xeons...and the G5 is often giving away up to 1.4 gig(!) on each CPU.



    It's not the physics test either. Here the dual 2.5 gig and 1.25 gig bus thrape the opposition AGAIN!



    But how, HOW can the 'same' card be over 100% slower?



    1. Bad Apple GL drivers?

    (When people have said Apple have a solid, if not outstanding, GL support?)

    2. Panther is slower than Windows XP at Open GL

(do we seriously believe that, considering how snail-like Explorer is versus Safari?)



Software is WHAT Apple does. They are just about the best. Even IF they are slower, they are always stylish, and there's no way Apple is over 100% slower than the opposition in anything..? So I don't see the above, in theory, being the bugbear.



    3. Cinebench. Beta. PC optimised.

(Okay. But can that explain the over 100% crippling of a Mac graphics card which is essentially the same as the PC card? Especially when the CPU and bandwidth do excellently well in the other tests...UNLESS, UNLESS the GL acceleration is the last bit of the bench that is not optimised...can anybody get the email address of the Maxon/Cinebench test authors so I can email them about this? I feel so strongly about this. This issue of Mac graphics cards being slower than their PC counterparts has p*********** me off for years...)

4. Games. It's been noted, consistently, that Mac graphics cards do not perform as well as the PC equiv' on 'hot' games.

(Well, for a long while, the 'PC ports' have been CPU bound on the Mac, with the lame G4 trying to power the ATI cards onward. BUT, SURELY with the G5, at 2.5 gig (easily the equivalent of a 3-3.2 gig Pentium 4!), and there's two of them, on 1.25 gig bandwidth, using an industry standard AGP 8x pipe...SURELY this is now a NON-ISSUE..?!?!)



    What is causing the problem?



IF it IS a software problem related to Apple giving a full GL implementation, which slows Apple's GL down or makes it appear slower...then...erm...this IS a 3D test and Apple's GL should shine in this very bench....



    ...perhaps they should split the GL calls into simple implementation for games and precise calls for 3D pro modelling apps.



However, the ATI X800 XT card presumably has simple GL? i.e. not full FireGL or Quadro drivers? So why, WHY does it do so well in a 3D GL stress test? Modern consumer cards DO have fuller GL implementations than in the past, when cards such as the Voodoo had 'mini GL' for games. But times have moved on since then. The gap has closed significantly. The consumer cards are now ridiculously powered and have fuller GL drivers.



So much so that Nvidia once proudly posted benches on their site a while back showing how one of their GeForce cards blew a Wildcat out of the water. That was a few years back, though.



    Are there any benches on Quadro vs 6800 Ultra?

Mac vs PC ATI 9800 XTs?



    Accelerate your Mac? Do they have any dirt to dish on the Mac's dirty secret?



Nothing to do with fast writes not being supported on the Mac? I thought the announcement of the Nvidia GeForce 2 MX hurdled the issue of fast writes way back...



    Still scratching my head...I can't see what it is...



    In games, Macs used to be CPU bound.

    Are G5 rigs STILL cpu bound? Do G5s now achieve parity with the same card in games?



    If not, we know it's not just the Cinebench beta that is fishy.



    And nobody said it was an Endian issue.



    Where's the bottleneck?



Lemon Bon Bon




    I may be answering your questions a bit late, but there is a simple explanation.

In the test you were referring to (I think it's the one on http://www.3dfluff.com/mash/cbsingle.php, 8th line), the G5 was running Mac OS 10.3.4, which didn't include any specific driver for the 6800 Ultra.



By the way, Cinebench is not reliable for comparing graphics cards. Even between PCs, the same graphics card can produce different results.



    (English is not my first language)



    Bye



  • Reply 205 of 281
onlooker Posts: 5,252 member
    Quote:

    Originally posted by french macuser

    I may be answering your questions a bit late, but there is a simple explanation.

In the test you were referring to (I think it's the one on http://www.3dfluff.com/mash/cbsingle.php, 8th line), the G5 was running Mac OS 10.3.4, which didn't include any specific driver for the 6800 Ultra.



By the way, Cinebench is not reliable for comparing graphics cards. Even between PCs, the same graphics card can produce different results.



    (English is not my first language)



    Bye







No, it's not too late. We have been wondering what was up with that test forever. Thanks. That explains a lot.



We all thought that test was total FUD.
  • Reply 206 of 281
I don't know why, but I could not edit my first post. I just saw that the link I gave doesn't work.

Here is the correct link for the benchmark page: http://www.3dfluff.com/mash/cbsingle.php
  • Reply 207 of 281
    Macosrumors has a mini-report that Apple have drafted a special team to work on implementing Open GL 2.0 to make Tiger THE GL OS.



    I hope so.







    As for GL in Apple's systems.



Cinebench support indicated to me that Apple's GL version needs work.



    I'm figuring that an overhaul will come with 'Tiger'. Big time.



    And they need to. Beta Cinebench or not. Lagging by over 100% on the same card is not on.



    Not acceptable for the Mac platform where Apple charges you more for the same PC card.



    I know there are factors at play. But over 100% behind? Poor. I know when Apple do finally address the issues...they'll do it big time. They always do. 'Tiger' GL should blow us away.






The physics and rendering benches are quite the coup.



    Hanging with quad cpu Xeons and overclocked Xeons.



Giving away 500 MHz per CPU and still hanging with Xeons etc.



    The G5...can't wait to see it breach the 3 gig barrier and kick some ass.



Low-k business. Hmmm. Do we smell PowerBook G5? Or merely better yields of the existing 970FX CPU?



    Antares. I think IBM/Apple should go breakneck for that.



Not just 3 gig, but up to 3.4 would be nice.



    9 months and counting.



    Antares. MP. Stick 4 gigs of ram in him. He's cooked. Let's do it!



    Lemon Bon Bon
  • Reply 208 of 281
Will Apple bother bumping the Power Mac to 2.8 in Jan 05?



    2, 2.5 and 2.8 gig duals? Immediate shipping?



Suppose it would be a nice stopgap with PCI Express until...



Would look better than the strange CPU grades in the current range. ONLY one new speed grade.



    Antares arrives six months after? 2.8, 3 and 3.2 duals?



    Lemon Bon Bon
  • Reply 209 of 281
    F***** the ultra.



SLI the 6800 GT!!! BTO at the Apple Store? Wishing.



    Nah, the Ultra is okay. Just need to see it yield.



    Nice to see PC land having supply probs on the 6800 Ultra.



    How's it feel to have shipping probs, eh PC land?



    You like dat?



    Feel good, eh?



    Lemon Bon Bon
  • Reply 210 of 281
emig647 Posts: 2,455 member
    I agree...



I have a feeling Tiger will finally give us the OGL power we need. Whether Cinebench is beta or not doesn't mean that much. We've all seen that our GL scores are slower no matter what the app. OGL code is OGL code. Unless our libraries are completely different? Somehow I doubt it.
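One quick way to sanity-check the "OGL code is OGL code" point is to ask the context which driver is actually answering the calls. Here's a minimal sketch using plain GLUT (the window title and build lines are mine, just for illustration):

```c
/* driver_check.c - print which OpenGL vendor/renderer/driver version
 * is servicing the current context.
 * Mac OS X:  gcc driver_check.c -framework GLUT -framework OpenGL
 * elsewhere: gcc driver_check.c -lglut -lGL */
#include <stdio.h>
#ifdef __APPLE__
#include <GLUT/glut.h>
#else
#include <GL/glut.h>
#endif

int main(int argc, char **argv)
{
    glutInit(&argc, argv);
    /* glGetString is only valid once a GL context exists, so create a
     * throwaway window first. */
    glutCreateWindow("driver check");

    printf("GL_VENDOR:   %s\n", glGetString(GL_VENDOR));
    printf("GL_RENDERER: %s\n", glGetString(GL_RENDERER));
    printf("GL_VERSION:  %s\n", glGetString(GL_VERSION));
    return 0;
}
```

If 10.3.4 really shipped without a 6800 Ultra driver, GL_RENDERER on that machine would have reported a generic or software rasterizer rather than the real chip, which would settle the argument one way or the other.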



    Anyways, I really think our cards will perform better soon. Just gonna take a little time.
  • Reply 211 of 281
    onlookeronlooker Posts: 5,252member
    Quote:

    Originally posted by Lemon Bon Bon

    Beta Cinebench or not. Lagging by over 100% on the same card is not on.



    Not acceptable for the Mac platform where Apple charges you more for the same PC card.



    I know there are factors at play. But over 100% behind? Poor. I know when



    Lemon Bon Bon






Lemon, did you miss this? >>>>> The G5 was running Mac OS 10.3.4, which didn't include any specific driver for the 6800 Ultra.



You can't just throw a card in a PC without the drivers for it and expect it to perform.
  • Reply 212 of 281
emig647 Posts: 2,455 member
    Just because it was running 10.3.4 doesn't mean it didn't have the drivers on it.



I actually got to use that same EXACT machine at WWDC. It was on the 2nd floor where they were showcasing the new LCDs. It was the leftmost machine. I even got to talk to the guy that was showcasing them. The machine had the drivers... maybe they were beta at that point, I'm not sure.



Anyways... I did get to play Unreal 2004 and Close Combat on a 30" LCD. This was the dual 2.5 with the 6800 Ultra. It performed flawlessly. It was extremely sexy. But again, this was games and not intense 3D apps.



    Sadly I wasn't there when they did the Cinebench test on it. But it was THAT machine.
  • Reply 213 of 281
    Quote:

Anyways... I did get to play Unreal 2004 and Close Combat on a 30" LCD. This was the dual 2.5 with the 6800 Ultra. It performed flawlessly. It was extremely sexy. But again, this was games and not intense 3D apps.



    Was it running at 2500x1600 res?







    Or something more modest?



Still, the fact that you saw a 30-inch beast in the flesh has my envy (I'm not jealous, though...)



    Say, did it look really big compared to say...a 20 inch? I'm thinking about getting one when Apple release the Antares beast.



    Quote:

Lemon, did you miss this? >>>>> The G5 was running Mac OS 10.3.4, which didn't include any specific driver for the 6800 Ultra.



    Yeah. I caught that. It didn't fully register. Apple shouldn't be releasing pish poor drivers for a £450 graphics card...in a £2100 machine which is needed to run a £2500 monitor. Get my drift?



    I think/hope that all this is a moot point.



    Check out www.vr-zone.com (a good tech' news site...)



Open GL 2.0 has just gone live. But it will need Nvidia and ATI to support it, and, hopefully, Apple will see fit to include it in 'Tiger'.
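For what it's worth, the headline feature of OpenGL 2.0 is the GLSL shading language moving into the core API. A minimal sketch of what "supporting it" looks like from the app side, using the 2.0 entry points (assumes a current GL 2.0 context and headers that expose these functions, e.g. via an extension loader on older systems; the orange-pixel shader is just a placeholder):

```c
/* Compile a trivial GLSL fragment shader with the OpenGL 2.0 core API. */
#include <stdio.h>
#include <GL/gl.h>

static const char *frag_src =
    "void main() { gl_FragColor = vec4(1.0, 0.5, 0.0, 1.0); }";

/* Returns the shader handle, or 0 if the driver rejects the source. */
GLuint compile_fragment_shader(void)
{
    GLint ok = GL_FALSE;
    GLuint shader = glCreateShader(GL_FRAGMENT_SHADER);

    glShaderSource(shader, 1, &frag_src, NULL);
    glCompileShader(shader);
    glGetShaderiv(shader, GL_COMPILE_STATUS, &ok);
    if (ok != GL_TRUE) {
        char log[512];
        glGetShaderInfoLog(shader, sizeof log, NULL, log);
        fprintf(stderr, "GLSL compile failed: %s\n", log);
        glDeleteShader(shader);
        return 0;
    }
    return shader;
}
```

Whether those calls go fast or slow is then entirely down to the vendor's compiler and driver, which is exactly where Apple, Nvidia and ATI would have to do the work.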



    This should restore parity with Direct X, M$'s API. (which always looked 'cardboardy' in my view...)



    Hopefully, Apple will be THE GL environment then.



Apple should be paying more attention to GL if they are making the graphics card the 'co-processor' that the OS and other APIs like Core Image/Video offload onto.



It's in their interests to work hard and get it kicking the ass of the x86 equivalent.



So...next gen' cards (512 meg and SLI, I hope...), dual core and rock-solid 'Tiger'/Open GL 2 support from Nvidia/ATI should add up to one kick-ass 3D machine.



Who knows, Onlooker may even forgive the lack of a 'Pro' card by that point. I know I will. Antares and a next-year consumer card should kick ass.



    I wanna build cities...



    Lemon Bon Bon
  • Reply 214 of 281
• 1) I don't exactly understand why Apple and IBM do not make the graphics portion of the motherboard "ready to use" with any PC cards from the likes of Gainward, PNY, and so on. Other than the firmware and drivers, what is the big difference? (See the sketch after this list.)

• 2) Would it be that hard to get PNY (or anyone) to supply a card with some basic cross-platform functionality, and ship the firmware and drivers on a disc to be installed by the user?

• 3) Now, from what I've read, Nvidia GPUs are all Mac- and PC-capable, but ATI has some specific differences. (I'm not sure if that is still true.) Is the way ATI cards and GPUs are built the reason Apple cannot use standard PC cards?
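On the firmware point: the practical difference between a "PC card" and a "Mac card" is which kind of expansion ROM image it carries. PC cards ship an x86 BIOS image, while Open Firmware Macs want an FCode image. The image type is a single byte in the ROM's PCI Data Structure, so you can tell what a card is from a ROM dump. A minimal sketch (the filename is a placeholder; this only inspects a dump, it doesn't flash anything):

```c
/* romtype.c - classify a PCI expansion ROM dump as x86 BIOS vs
 * Open Firmware FCode. Per the PCI spec: each image starts with the
 * signature 0x55 0xAA, the little-endian word at offset 0x18 points
 * to the "PCIR" data structure, and the byte at PCIR offset 0x14 is
 * the code type (0 = x86, 1 = Open Firmware, 2 = PA-RISC, 3 = EFI). */
#include <stdio.h>
#include <string.h>

int main(int argc, char **argv)
{
    unsigned char rom[65536];
    FILE *f = fopen(argc > 1 ? argv[1] : "card.rom", "rb");
    if (!f) { perror("open"); return 1; }
    size_t n = fread(rom, 1, sizeof rom, f);
    fclose(f);

    if (n < 0x1A || rom[0] != 0x55 || rom[1] != 0xAA) {
        fprintf(stderr, "not a PCI expansion ROM image\n");
        return 1;
    }
    size_t pcir = rom[0x18] | (rom[0x19] << 8);   /* pointer to PCIR */
    if (pcir + 0x18 > n || memcmp(rom + pcir, "PCIR", 4) != 0) {
        fprintf(stderr, "PCI data structure not found\n");
        return 1;
    }
    switch (rom[pcir + 0x14]) {                   /* code type byte */
    case 0:  puts("x86 BIOS image (PC card)");                break;
    case 1:  puts("Open Firmware FCode image (Mac-capable)"); break;
    default: printf("other code type: %u\n", rom[pcir + 0x14]);
    }
    return 0;
}
```

Dual-ROM "Mac edition" cards simply carry both image types in the same ROM (the spec allows chained images), which is why flashing a PC card with a Mac ROM, as described in the next reply, can work at all.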

No matter what anybody says, I still think being able to use PC graphics cards in Macs would help their image and their sales.
  • Reply 215 of 281
emig647 Posts: 2,455 member
I have to agree with you. Personally I think it has something to do with control and/or stability. I know for a fact (because I witnessed this) that you can take an ATI 8500 PC card and flash it, and it will work in your Mac. AGP... etc.



Do I have the balls to go out and buy a 9800 XT and flash it to see if it will work in my Mac? NO.



    THOUGH!!! I suppose you could buy a refurbished version from Newegg...



As far as Nvidia goes, last I remember you could flash a GeForce 2 MX and use it... I don't know what the deal is now.



But I agree, it would definitely help sales for Macs to be able to do such a thing. It might be a little more difficult with Pro cards, though...
  • Reply 216 of 281
emig647 Posts: 2,455 member
    Quote:

    Originally posted by Lemon Bon Bon

    Was it running at 2500x1600 res?







    Or something more modest?



Still, the fact that you saw a 30-inch beast in the flesh has my envy (I'm not jealous, though...)



    Say, did it look really big compared to say...a 20 inch? I'm thinking about getting one when Apple release the Antares beast.




It was A LOT bigger. I'm not 100% sure on the resolution, but I'm confident it was lower than the 2500 resolution. The icons weren't that small, though they could have bumped up the size in the Finder prefs.



Personally I can't justify the price difference between the 23" and 30"...but then again I'm not a video guru. Hell, I'm still using CRTs... besides them burning my eyes out, I love my dual 19" P95f+ ViewSonics.



It was really, really cool. I couldn't even describe how cool it was to my friend (the one I was staying with while I was down there).



    But I won't ever forget it
  • Reply 217 of 281
smalM Posts: 677 member
    Quote:

    Originally posted by french macuser

    http://www.3dfluff.com/mash/cbsingle.php



It seems the GeForce 6800 and Radeon 9800 are crippled to 4 pipelines.

    Look at the numbers for the Radeon 9600!
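For context, here's the back-of-the-envelope arithmetic behind that suspicion, using the commonly quoted specs for these cards (treat the exact figures as approximate):

```latex
% Theoretical pixel fill rate = number of pixel pipelines x core clock:
\[
  \text{fill rate} = N_{\text{pipes}} \times f_{\text{core}}
\]
% Radeon 9800 XT:     8 pipes  x 412 MHz, roughly 3.3 Gpixel/s
% Radeon 9600 XT:     4 pipes  x 500 MHz, roughly 2.0 Gpixel/s
% GeForce 6800 Ultra: 16 pipes x 400 MHz, roughly 6.4 Gpixel/s
% A 9800 somehow limited to 4 pipes would manage only about
% 4 x 412 MHz = 1.6 Gpixel/s, i.e. it would score *below* a 9600 XT,
% which is exactly the pattern in the 3dfluff numbers.
```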
  • Reply 218 of 281
addison Posts: 1,185 member
    Quote:

    Originally posted by emig647

But I agree, it would definitely help sales for Macs to be able to do such a thing. It might be a little more difficult with Pro cards, though...



I agree 100%. This is an unnecessary barrier to the platform. If it were just a driver issue, many more cards would be available. The fact that the cards need custom firmware is nonsense.
  • Reply 219 of 281
    Quote:

    Originally posted by smalM

It seems the GeForce 6800 and Radeon 9800 are crippled to 4 pipelines.

    Look at the numbers for the Radeon 9600!






    Are you saying that they have the ability to utilize more pipes but have been artificially limited?
  • Reply 220 of 281
emig647 Posts: 2,455 member
If the 9800 XT and 9600 XT and 6800 Ultra don't have the correct # of pipes... Apple is looking at a HUGE lawsuit. There are certain features that separate the 9x00 XT from the 9x00 Pro cards: clock speed and rendering pipes. If you don't have both, you don't have an XT card, and that is some serious false advertising. This is why there is a 9800 SE card: ATI produced the chips as 9800s, but some of the pipes didn't work during testing. So they tried 4 pipes instead of 8 and it worked, and they pass it off as an SE edition. Basically it's a 9600.



    I seriously think someone needs to do some more research on this.