What will be the new specs for the next PM line?


Comments

  • Reply 241 of 281
    emig647emig647 Posts: 2,455member
    Quote:

    Originally posted by onlooker

    Wow! You must be either rich, work in 3D, or a thief.



    We have rendering farms.
  • Reply 242 of 281
    telomartelomar Posts: 1,804member
    I posted this elsewhere, but since people appear to be debating Apple's video cards I might point out that from Xbench results it appears there's a limitation on OGL. Certainly the new iMac can get 140+ fps, but a dual 2.5 GHz machine with a Radeon 9600 XT also only gets ~145 fps. Seems to me something's fishy in Apple's OGL implementation and there's nothing wrong with the cards.
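
    A quick back-of-the-envelope sketch of the reasoning above (figures approximate, taken from the posts in this thread): if two very different machines land within a few fps of each other, the near-identical frame time points at a shared software ceiling rather than the cards.

```python
# Illustrative numbers from the Xbench figures quoted above:
# very different hardware, nearly the same frames per second.
results = {
    "New iMac": 140.0,                         # fps, approximate
    "Dual 2.5 GHz G5 (Radeon 9600 XT)": 145.0,
}

for machine, fps in results.items():
    frame_ms = 1000.0 / fps  # time spent per frame, in milliseconds
    print(f"{machine}: {fps:.0f} fps -> ~{frame_ms:.1f} ms/frame")

# Both sit near ~7 ms/frame despite very different GPUs, so the cap
# looks like fixed per-frame overhead in the OGL stack, not the cards.
gap = 1000.0 / 140.0 - 1000.0 / 145.0
print(f"frame-time gap: {gap:.2f} ms")
```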
  • Reply 243 of 281
    emig647emig647 Posts: 2,455member
    I have to agree with you..... though I hope you're wrong.



    Apple needs to fix this ASAP before word really gets out there. You heard it from Steve Jobs' own mouth at the WWDC keynote... "The focus has moved to the GPU, because GPU speed increases have blown Moore's law out of the water...".



    To me a GPU is far more important nowadays than a CPU.



    I mean hell... if I were to build another machine right now... I'd spend about $350 on a graphics card (6800 GT) and about $200 on a CPU... *shrugs*. Of course I could get much more expensive with the CPU, but there isn't any reason to. To me there isn't enough of a performance increase to spend another $500+ on a processor.
  • Reply 244 of 281
    I think you're underestimating the usefulness of your CPU. That is what is driving your system. But regardless, Apple needs to get these things straightened out on the double.
  • Reply 245 of 281
    The CPU is still important but I agree with Emig647 that if I was personally building a system I'd go heavy on the GPU rather than the CPU.



    From top to bottom you're probably looking at a CPU delta of about 30% in speed, but an AMD/Intel low-end chip is just over $100 for a decent chip and their top of the line is usually $600+. You'd probably do well to get a $200-300 CPU and a $200-300 GPU. Nice and balanced.
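
    The budget math above can be sketched in a few lines (illustrative figures only, taken from the post: ~30% speed delta against a roughly 6x price spread):

```python
# Rough price/performance comparison using the post's figures:
# low-end chip just over $100, top of the line $600+, ~30% faster.
low_end = {"price": 100, "relative_speed": 1.00}
top_end = {"price": 600, "relative_speed": 1.30}  # ~30% faster

def perf_per_dollar(chip):
    return chip["relative_speed"] / chip["price"]

print(f"low end: {perf_per_dollar(low_end):.4f} speed/$")
print(f"top end: {perf_per_dollar(top_end):.4f} speed/$")

# The low-end chip delivers several times more speed per dollar, which
# is why splitting the budget between CPU and GPU beats maxing the CPU.
ratio = perf_per_dollar(low_end) / perf_per_dollar(top_end)
print(f"value ratio: {ratio:.1f}x")
```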
  • Reply 246 of 281
    emig647emig647 Posts: 2,455member
    I didn't give my statement much reasoning...



    In the future most OSes will put a heavy load on the GPU for eye candy and new GUI interfaces. Just imagine what you could accomplish with a 3D GUI OS... It's almost hard to grasp because it's thinking outside the box of the last 20 years. We've been stuck on 2D for so long. Now that everyone (hopefully KDE 3.4 will use OGL too) is using OGL for the GUI of the OS... things can get really interesting.



    Either way, just looking at Tiger with CoreImage and CoreVideo (is it CoreVideo... heh, I was there, I should know), everything is going to be using the GPU so much in the future. It is going to carry more and more of the load as we go on.



    I wouldn't be shocked if in 5 years Mac OS was 3d. Oooooooh could you imagine the possibilities?



    Anyways... yeah, a CPU is extremely important... I just think it's not as important (being top speed) as it used to be. I'll take a mediocre CPU and a high-end graphics card any day.
  • Reply 247 of 281
    emig647emig647 Posts: 2,455member
    We should get a POLL going for guesses on when new machines will be released......



    I guess June again... unless IBM can make some major changes soon.
  • Reply 248 of 281
    emig647, what is the point of having a 3D environment on a 2D screen? I mean, it doesn't make much sense. You, me and everyone else here could think of some really cool stuff you could do! Yet for the common user it isn't that big of a deal. Now don't get me wrong, I also see more 3D aspects coming to the GUI, but most people think of a computer like an office, or a desk. Most of the stuff you do is on paper, and paper is flat.



    The other thing about 3 dimensions is you interact with it. So even if you have a 3d screen it just isn't the same. You can only take it so far, you know.
  • Reply 249 of 281
    onlookeronlooker Posts: 5,252member
    I'd rather not start an argument over CPU vs. GPU and what is more important, but I have to agree partly with Tasovar, and also say that throwing the theoretical load all on the GPU and disregarding the importance of the CPU is not the best answer, or solution, I've ever heard. It sounds like a way to cop out, no matter what Apple's next big smoke-and-mirrors keynote is going to demonstrate.
  • Reply 250 of 281
    emig647emig647 Posts: 2,455member
    All I have to say is CoreImage and CoreVideo...



    Now get 5 applications using them at once...



    Hence... importance of GPU. I'm not saying giving up on CPU... but GPU is extremely important now.



    Either way... my point on the OS is... we won't be browsing in 2D forever... there will be a 3D OS sooner rather than later... It will make things easier to navigate... you'll see.



    Either way (2D or 3D) the graphics card is still used a lot in OS stuff... Quartz Extreme, anyone? Let's compare today's OS with an OS from 4 years ago... all you needed was video RAM... now you need rendering done on the fly... need to apply different textures... etc. M$ is doing this too.
  • Reply 251 of 281
    If you look at the videos of Sun's Looking Glass, you can see one approach to using 3D to create a desktop. I personally am not impressed. I get the same versatility by never maximizing windows and leaving a little corner of each showing so I can select it... and with the F9 key, I don't even need to do that. Don't see much point to window transparency either; just makes things harder to read.



    What I'd like to see is fewer widgets on the windows, so that the useful screen space would be increased. Kind of like the Dashboard gadgets that have the controls on the back.



    BTW, one of my gripes about Windows is how applications can "grab focus" and force themselves into the foreground. You're trying to type into a window, and blam! this other window suddenly becomes the window with focus, and you have to reselect the original window, and just as you start typing again, blam! it happens again. I HATE that. That is as user-unfriendly as computers get. Mac OS X doesn't do that.
  • Reply 252 of 281
    Quote:

    Originally posted by emig647

    All I have to say is CoreImage and CoreVideo...



    Now get 5 applications using them at once...



    Hence... importance of GPU. I'm not saying giving up on CPU... but GPU is extremely important now.



    Either way... my point on the OS is... we won't be browsing in 2D forever... there will be a 3D OS sooner rather than later... It will make things easier to navigate... you'll see.



    Either way (2D or 3D) the graphics card is still used a lot in OS stuff... Quartz Extreme, anyone? Let's compare today's OS with an OS from 4 years ago... all you needed was video RAM... now you need rendering done on the fly... need to apply different textures... etc. M$ is doing this too.




    Yes, but core image, core video, Quartz Extreme... you're starting to sound like the smoke-and-mirrors keynote I was talking about. Even if you start leveraging the GPU for everything it can be used for, it still doesn't compromise the importance of the CPU. The CPU still runs everything else, and is still the big number cruncher that makes supercomputing what it is. I don't think supercomputers even have graphics cards for actual data reconstruction of simulations, unless it's just for the display. Don't get too caught up in this dream that the GPU will bail the Mac out of trouble. The GPU shouldn't need to.



    Plus, no matter how fast it is now, in 20 years we are going to be looking at these processors like they were some kind of joke. It's never going to be fast enough, and that is why you don't let up on the importance of the CPU. It's not something that can just be abandoned.
  • Reply 253 of 281
    emig647emig647 Posts: 2,455member
    You're putting words into my mouth onlooker. You need a balanced system. That means everything from CPU to RAM to GPU to Storage to Bus.



    CPU is extremely important. But my point is GPU is growing in popularity... damn onlooker... I thought you'd agree with me that the GPU is more important nowadays... I guess anything I say you just like to disagree with... WTF?



    I'm not saying give up on CPU... but I am saying put some more importance on GPU to take a little stress off of the CPU. CPU doesn't need to handle EVERYTHING... including graphics processing.



    You're right, supercomputers really don't have graphics cards... but that's not their focus. Only a few computers in a cluster need to have graphics cards... just for the OS... but most of the computers in a cluster don't need to have a user browse their OS via a GUI. Terminal all the way.



    The 3d browsing would be for us. Anything 3d will be for "us"... the only other thing that graphics cards will be used for when it comes to serious computing is rendering farms.... 3d and video...



    CPU needs to be focused on more than it is now... but GPU definitely needs some damn growth... what we have now isn't acceptable... maybe we have the hardware... maybe it's the software that needs work... daily it becomes more apparent that it's pointing to software... anyways... I'm drunk... ttyl ONLOOKER.
  • Reply 254 of 281
    What's the point?



    Because you can?



    AMD chip. 3.5 gig. GPU Nvidia GT 6800.



    Both cost about the same.



    That makes a balanced system.



    You get almost nothing in extra performance by going for the 3.8 at a 100% cost premium with a 3% performance up on the cpu.



    Point is, for any gpu these days...we're probably cpu bound at the moment. So 3-3.8 gig cpu will do for the latest gpus judging from the reviews.



    The GPU is carrying most of the load. It's at the point where a wider set of cpu grades will do the job.



    OS, Games, graphics 3D...it's at the point now where gpu is heavily emphasized.



    Think of it as the 'co-processor'. Like those graphic chips on the old Amigas.



    The cpu is still important but becoming perhaps the glorified message boy of the modern computer.



    When it comes to graphics? The gpu is becoming king in a big way. And that's the way the wind is blowing.



    I don't think it is going to matter what cpu grade you have at a given point. Until dual core comes to push the gpu harder again.



    With dual core and SLI gpu? Wow. Think of the power supply needs of this little 'balanced' system.



    3D on the desktop. Inevitable in my own mind. We'll see. Just look at the drop shadows on 'X'. 'Relief' graphics on buttons. It's kinda subtle here in a 2D+ kind of way. Don't think Apple won't do it. I can happily see a document scrunched up in superfluous 3d and landing in a wobbling trashcan all in glor'fied 3d. Or a 'warp 9' style star streak effect as you launch Safari version 7. Apple adopted GL...after the debacle of Quickdraw3d. So don't think they aren't aware of the importance of 3D. (Ultra crapics in the iMac aside...)



    Almost every game is in 3D now. Personally, I think the best games ever were in 2D on the C64. Just me. I still think game makers are grappling with 3D. What angle to show action. Camera. Lighting. Size relativity. erhh. Realism. Refinement. An ongoing battle. But games like Doom 3 and Half Life 2 and Stalker... show these hurdles are being flown over...



    On the Mac, the cpu still ain't fast enough yet. The low-end Mac cpus aren't fast enough to do justice to many of the gpus in the mainstream PC market. Add to that Apple's 'solid' implementation of GL and, well, you've got problems. 'Good enough' for non-demanding games and a few OS 'eye-candy' parlour tricks perhaps. I'd like to see Apple's cpu line hit 3 gig as soon as possible. Less worried about the 4 gig mark on the Wintel side. Clearly, Intel cpus seem to have flatlined on the performance return. AMD too. Looking at the benches... the difference on rendering say Max, Lightwave, Cinema et al. seems to be in seconds. Hardly worth the extra £200 at all. One thing stepping up from 1.6 gig to 2.5, but another entirely going from 3.5 to 3.8.



    I noticed on the 2.5 G5 rendering benches vs the Xeons earlier in this thread? The G5 seemed to be giving up 500mhz per chip. So, clearly a 3 gig G5, say nothing of a dual core 3 gig G5, will wipe the floor with Intel/AMD chips if IBM can get them out before AMD/Wintel get their stuff out the gate. And I hope Apple put their GL issues to bed with Tiger and GL 2.



    GPU processing is here. It's here to stay. Expect to see more of it. Soon.



    ...and back to the thread. Expect ANTARES, a 6800 Ultra with 512 megs of ram to get us there much sooner.



    Lemon Bon Bon
  • Reply 255 of 281
    emig647emig647 Posts: 2,455member
    Quote:

    Originally posted by Lemon Bon Bon

    You get almost nothing in extra performance by going for the 3.8 at a 100% cost premium with a 3% performance up on the cpu.



    YEP



    Quote:

    The GPU is carrying most of the load. It's at the point where a wider set of cpu grades will do the job.



    OS, Games, graphics 3D...it's at the point now where gpu is heavily emphasized.



    Think of it as the 'co-processor'. Like those graphic chips on the old Amigas. The cpu is still important but becoming perhaps the glorified message boy of the modern computer.





    EXACTLY



    Quote:



    3D on the desktop. Inevitable in my own mind. We'll see. Just look at the drop shadows on 'X'. 'Relief' graphics on buttons. It's kinda subtle here in a 2D+ kind of way. Don't think Apple won't do it. I can happily see a document scrunched up in superfluous 3d and landing in a wobbling trashcan all in glor'fied 3d. Or a 'warp 9' style star streak effect as you launch Safari version 7. Apple adopted GL...after the debacle of Quickdraw3d. So don't think they aren't aware of the importance of 3D. (Ultra crapics in the iMac aside...)




    Onlooker will disagree with you and call this useless eye candy. But let's not stop there... the possibilities are unlimited. Check this idea out. Let's say you have a folder with 5 text files. The folder is transparent and you can see a preview of EACH FILE in the folder before you even open it. Ok ok... what about folders with 10000 files... then you show a preview of the 5 most recently accessed files. *shrugs* OOHHHHHH, how about a file cabinet... a drawer opens up with all of your files... the tops of the folders are marked... with the damn files (paper) in them... so they are almost non-existent!!! That's a damn good idea and I don't care what anyone says. Much better than looking at a detailed list of 10000000000 files in a damn folder. Thank you, thank you, I'll be here all week. Watch, we'll see that in 5 years.



    Quote:

    I noticed on the 2.5 G5 rendering benches vs the Xeons earlier in this thread? The G5 seemed to be giving up 500mhz per chip. So, clearly a 3 gig G5, say nothing of a dual core 3 gig G5, will wipe the floor with Intel/AMD chips if IBM can get them out before AMD/Wintel get their stuff out the gate. And I hope Apple put their GL issues to bed with Tiger and GL 2.



    GPU processing is here. It's here to stay. Expect to see more of it. Soon.



    ...and back to the thread. Expect ANTARES, a 6800 Ultra with 512 megs of ram to get us there much sooner.



    Lemon Bon Bon




    Personally I don't see a whole lot of performance gain going from 2.5 to 3.0... if it is the 970fx they get to 3 GHz with. There isn't a huge gain from 2.0 -> 2.5 (a 20-22.5% performance gain according to Apple internal tests (no, I can't back that up without really screwing over some peeps @ Apple)).
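
    The scaling math behind that claim is worth spelling out (a sketch using the poster's unverified figures): a 2.0 -> 2.5 GHz jump is a 25% clock bump, so a 20-22.5% gain means performance scales at roughly 80-90% of clock.

```python
# Clock-scaling arithmetic for the figures quoted above (unverified):
# 2.0 -> 2.5 GHz is a 25% clock increase, said to yield 20-22.5% more speed.
clock_gain = 2.5 / 2.0 - 1.0            # 0.25, i.e. 25%
perf_gain_low, perf_gain_high = 0.20, 0.225

efficiency = perf_gain_high / clock_gain  # best case: ~90% of clock
print(f"clock gain: {clock_gain:.0%}, scaling efficiency: {efficiency:.0%}")

# If that ~90% efficiency holds, a 2.5 -> 3.0 GHz bump (+20% clock)
# would land at roughly 18% more performance -- another modest step.
projected = (3.0 / 2.5 - 1.0) * efficiency
print(f"projected 2.5 -> 3.0 gain: {projected:.0%}")
```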



    I do hope Apple gets to 3.0+ though. Just to say we made it over that hump in the race. But keep in mind, everyone... AMD hasn't made it there yet either. AMD is top in speed though. The 250 and 150 are great processors! But IBM can beat them!! Once IBM / Freescale is matched in performance... and they move to dual-core... um, wow!?



    My prediction is this for top of the line (I refuse to comment on mid and low because apple doesn't know until the high end is decided on):



    dual 3.0+ GHz w/ 1MB L2 cache (anything lower won't be acceptable to Jobs unless the update cycle is in Jan).

    1.5 GHz FSB

    6800 Ultra

    512MB RAM, PC3500 (at least).

    200GB SATA

    $100-200 price reduction YAYYYY.



    I hope Tiger isn't a disappointment to most people. I hope Apple realizes their OGL performance problem... I can't imagine ALIAS not letting them know!
  • Reply 256 of 281
    emig647emig647 Posts: 2,455member
    Another thing I wanted to mention last night in my drunken rage...



    Onlooker: you need to watch the WWDC keynote again... You have no idea how insane CoreImage and CoreVideo are!!



    I mean, CoreImage does what Photoshop does with filters... ONLY IN REAL TIME!! No more waiting 5 seconds for a glass filter to render... CoreVideo... same thing. Apps are going to use this more and more... from Illustrator -> Photoshop -> Motion -> FCP -> who knows wtf you can do with the possibilities. I will GLADLY send you my WWDC slides if you don't believe me.



    This new technology opened up some HUGE doors to what can be done with computers.
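
    The reason those real-time filters map so well to the GPU is that most image filters are per-pixel: every output pixel is computed independently, so a GPU can run thousands of them in parallel. A toy pure-Python sketch of the idea (not the actual Core Image API, which compiles GLSL-like kernels for the GPU):

```python
# Toy per-pixel "kernel" illustrating why Core Image-style filters are
# data-parallel: each output pixel depends only on its own input pixel.

def brightness_kernel(pixel, amount):
    """Per-pixel kernel: add brightness to an (r, g, b) tuple, clamped at 255."""
    return tuple(min(255, c + amount) for c in pixel)

def apply_filter(image, kernel, *args):
    # Every pixel is independent -> embarrassingly parallel; a GPU would
    # run this kernel over all pixels at once instead of looping.
    return [[kernel(px, *args) for px in row] for row in image]

image = [[(10, 20, 30), (250, 250, 250)],
         [(0, 0, 0), (100, 150, 200)]]

out = apply_filter(image, brightness_kernel, 20)
print(out[0])  # [(30, 40, 50), (255, 255, 255)]
```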
  • Reply 257 of 281
    Quote:

    Personally I don't see a whole lot of performance gain going from 2.5 to 3.0... if it is the 970fx they get to 3 GHz with. There isn't a huge gain from 2.0 -> 2.5 (a 20-22.5% performance gain according to Apple internal tests (no, I can't back that up without really screwing over some peeps @ Apple)).



    I do hope apple gets to 3.0+ though. Just to say we made it over that hump in the race. But keep in mind everyone... AMD hasn't made it there yet either. AMD is top in speed though. The 250 and 150 are great processors! But IBM can beat them!! Once IBM / freescale is matched in performance... and they move to dual-core ... um wow!?





    Antares isn't just about '3 gig +', it's also about 'dual core'.



    So we'll hopefully see more than '22%'.







    Quote:

    I mean, CoreImage does what Photoshop does with filters... ONLY IN REAL TIME!! No more waiting 5 seconds for a glass filter to render... CoreVideo... same thing. Apps are going to use this more and more... from Illustrator -> Photoshop -> Motion -> FCP -> who knows wtf you can do with the possibilities.



    Seeing as Apple did 'Funhouse' in a week with one programmer...I will damn Adobe to eternity if they don't include 'Core Image' in PS9.



    If it doesn't make it..., 'politics'.



    I'd love to see Apple buy 'Painter' and marry it with 'Core Image' 'Funhouse'. You'd be on the way to a serious PS killer.



    Lemon Bon Bon
  • Reply 258 of 281
    emig647emig647 Posts: 2,455member
    Quote:

    Originally posted by Lemon Bon Bon

    Antares isn't just about '3 gig +', it's also about 'dual core'.



    So we'll hopefully see more than '22%'.





    I was saying if they stuck with 970fx... I hope they move to dual core next round.... I'll be a happy little boy.



    Quote:



    Seeing as Apple did 'Funhouse' in a week with one programmer...I will damn Adobe to eternity if they don't include 'Core Image' in PS9.



    If it doesn't make it..., 'politics'.



    I'd love to see Apple buy 'Painter' and marry it with 'Core Image' 'Funhouse'. You'd be on the way to a serious PS killer.



    Lemon Bon Bon




    Damn where is the drool face
  • Reply 259 of 281
    onlookeronlooker Posts: 5,252member
    Quote:

    Originally posted by emig647

    You're putting words into my mouth onlooker. You need a balanced system. That means everything from CPU to RAM to GPU to Storage to Bus.



    CPU is extremely important. But my point is GPU is growing in popularity... damn onlooker... I thought you'd agree with me that the GPU is more important nowadays... I guess anything I say you just like to disagree with... WTF?



    I'm not saying give up on CPU... but I am saying put some more importance on GPU to take a little stress off of the CPU. CPU doesn't need to handle EVERYTHING... including graphics processing.



    You're right, supercomputers really don't have graphics cards... but that's not their focus. Only a few computers in a cluster need to have graphics cards... just for the OS... but most of the computers in a cluster don't need to have a user browse their OS via a GUI. Terminal all the way.



    The 3d browsing would be for us. Anything 3d will be for "us"... the only other thing that graphics cards will be used for when it comes to serious computing is rendering farms.... 3d and video...



    CPU needs to be focused on more than it is now... but GPU definitely needs some damn growth... what we have now isn't acceptable... maybe we have the hardware... maybe it's the software that needs work... daily it becomes more apparent that it's pointing to software... anyways... I'm drunk... ttyl ONLOOKER.




    I've only read this post so far, so if I missed something I apologize right away.



    I was misreading what you were saying. But I may not have been clear as to what I'm getting at. What I was getting at but never said is: it's the whole damn graphics system on the Mac right now that is failing. If all Apple does is merely optimize the way the GPU is used, without doing all the other things like getting the drivers to use all the pipes and fixing all the previously noted faults before utilizing the GPU, I still consider it a major failure. Especially for what Apple charges for a PowerMac. That top PowerMac should be the freakin' bomb.



    So to sum that up: if Apple were to take what they have "now" and run everything through the GPU, using Quartz Extreme, core image, and core video, it would be a smoke-and-mirrors show. I should have been more clear about that from the get-go, but I was also frustrated with the CPU's performance and got off track.
  • Reply 260 of 281
    emig647emig647 Posts: 2,455member
    Yes, they definitely need to fix the GPU stuff... I was kind of hinting at that with the "Tiger" sentence.



    They need to fix it before they get in over their heads. In my opinion their software and ideas are far superior to what any hardware can handle right now. I have a feeling that with CoreImage and CoreVideo... they will have no choice but to optimize OGL in Tiger. When I get my 2nd hard drive I'm going to run Cinebench on the WWDC preview of Tiger... see if it's any faster. Wouldn't that be nice?



    I should have my hands on a newer copy soon... we'll see if there are any improvements.