Apple intros new Mac Pro with "Nehalem" Xeon processors


Comments

  • Reply 401 of 506
    melgross Posts: 33,580 member
    Some tests here show why these machines ARE worth their price tags in performance, especially if you need high performance in many pro video editing apps, when these apps are properly written, as many of them are. Note the performance of the 2.26 GHz 8-core model as compared to the 3.2 GHz and 2.8 GHz older 8-core models.



    For pro video editing, an 8-core 2.26 GHz model is better in most tests than the older 8-core 3.2 GHz model.



    http://www.barefeats.com/nehal03.html



    We should be seeing a lot more tests shortly.
  • Reply 402 of 506
    melgross Posts: 33,580 member
    Quote:
    Originally Posted by irondoll View Post


    Saw this question over on macrumors and it seemed pertinent here, perhaps someone knows the answer:







    Sounds like some problem on the 2.26ghz nehalems to me...



    Either the testing contains errors - not the first time this has happened - or the amount of memory was very different. Memory is usually less important for video editing, but when all 8 cores are being used, that can change. The 2.26 is no different from the rest other than the speed it's running at.



    The last question is whether the numbers supplied in that post from MR are correct in themselves.
  • Reply 403 of 506
    Quote:
    Originally Posted by melgross View Post


    Some tests here show why these machines ARE worth their price tags in performance, especially if you need high performance in many pro video editing apps, when these apps are properly written, as many of them are. Note the performance of the 2.26 GHz 8-core model as compared to the 3.2 GHz and 2.8 GHz older 8-core models.



    For pro video editing, an 8-core 2.26 GHz model is better in most tests than the older 8-core 3.2 GHz model.



    http://www.barefeats.com/nehal03.html



    We should be seeing a lot more tests shortly.



    Well, that barefeats page is recycling the same Cinebench data (which is a 3D rendering test, not a video test) from the macrumors thread. I'm curious why he didn't show the single-threaded graphs (they would reveal the very poor 2.26 point). I think we have to assume that Cinebench data is wrong in some way, considering the spurious result of the low-end octo-core Nehalem (note: it is beaten by a MacBook Pro 2.4 GHz Core 2 Duo!):



    http://tesselator.gpmod.com/Images/_...10_Numbers.jpg



    I suppose we can break down the Geekbench numbers for single-threaded tests and do a similar GHz normalisation to see if the 2.26 is still abnormally under-performing there too...



    EDIT: perhaps it is differences in graphics cards, as Cinebench, I assume, taxes OpenGL too. We do know that the previous-generation 8800 GT should stomp on the new Nvidia 120, which could explain the Harpertown dominance over the 2.26 Nehalem. It still doesn't explain why it is beaten by a laptop; how can the 9600M be faster than the Nvidia 120?
  • Reply 404 of 506
    melgross Posts: 33,580 member
    Quote:
    Originally Posted by irondoll View Post


    Well, that barefeats page is recycling the same Cinebench data (which is a 3D rendering test, not a video test) from the macrumors thread. I'm curious why he didn't show the single-threaded graphs (they would reveal the very poor 2.26 point). I think we have to assume that Cinebench data is wrong in some way, considering the spurious result of the low-end octo-core Nehalem (note: it is beaten by a MacBook Pro 2.4 GHz Core 2 Duo!):



    http://tesselator.gpmod.com/Images/_...10_Numbers.jpg



    I suppose we can break down the Geekbench numbers for single-threaded tests and do a similar GHz normalisation to see if the 2.26 is still abnormally under-performing there too...



    They state very clearly where they have gotten the information from, but that doesn't make it any less valid than anything that has been posted here by others.



    What you have to remember is that single-threaded applications are becoming rarer as time goes by. Two years from now, when these machines will still be very viable, it will be difficult to find a single-threaded app.



    I don't know of a single pro video editing app, including Windows apps, that is not efficiently multithreaded. The same thing is true of pro 3D apps.



    I think the concern about single-threaded performance affects the GUI more than anything else. But 10.6 should make that less important as well.



    The focus on single-threaded apps is misplaced. Even AMD has gone to favoring multithreading in its CPUs.
  • Reply 405 of 506
    Oh, I agree about betting the house on multithreading for the future (assuming the non-linear Cinebench result with the 2.26 is an anomaly, I'd go for an octo-core 2.26 over a quad-core 2.66), though I think it will take longer than we idealistically hope; it is often a "hard problem" to divide a task into useful parts (many filters in PS work on the whole image - you cannot just tile the image into N tiles and hand each to a thread). Snow Leopard can help in places, though it cannot perform magic on "hard-problem" tasks.



    I am annoyed that Apple didn't take advantage of "multithreaded" (technically parallel-computation) graphics for OpenCL in the Mac Pro. I would have thought that enabling the SLI/CrossFire support in Intel's reference designs, and a dual-GPU card by default, would have allowed stellar Snow Leopard performance when it is announced soonish. We bet our house on multithreading, but still fail to parallelise GPU computation - old technology available in cheap PC architectures...
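To make the tiling point concrete: a per-pixel filter splits cleanly across a pool of workers, but a filter that reads neighbouring pixels cannot be naively tiled. A toy sketch (the `brighten` filter and pixel data are hypothetical, not real Photoshop code):

```python
from concurrent.futures import ThreadPoolExecutor

def brighten(pixels):
    # Per-pixel filter: each output depends only on one input pixel,
    # so any tiling of the image yields the same result.
    return [min(p + 10, 255) for p in pixels]

def process_tiled(pixels, n_tiles=8):
    # Split the image into n_tiles chunks and filter them in parallel.
    size = len(pixels) // n_tiles
    tiles = [pixels[i * size:(i + 1) * size] for i in range(n_tiles)]
    with ThreadPoolExecutor(max_workers=n_tiles) as pool:
        return [p for tile in pool.map(brighten, tiles) for p in tile]

image = list(range(256))  # stand-in for one row of 8-bit pixel data
assert process_tiled(image) == brighten(image)  # tiling changes nothing here

# A blur, by contrast, reads each pixel's neighbours, so naive tiling
# gets every tile border wrong. That is the "hard problem" part.
```

This only works because `brighten` has no cross-pixel dependencies; global filters need overlapping tile borders or a different decomposition entirely.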
  • Reply 406 of 506
    melgross Posts: 33,580 member
    Quote:
    Originally Posted by irondoll View Post


    Oh, I agree about betting the house on multithreading for the future (assuming the non-linear Cinebench result with the 2.26 is an anomaly, I'd go for an octo-core 2.26 over a quad-core 2.66), though I think it will take longer than we idealistically hope; it is often a "hard problem" to divide a task into useful parts (many filters in PS work on the whole image - you cannot just tile the image into N tiles and hand each to a thread). Snow Leopard can help in places, though it cannot perform magic on "hard-problem" tasks.



    I am annoyed that Apple didn't take advantage of "multithreaded" (technically parallel-computation) graphics for OpenCL in the Mac Pro. I would have thought that enabling the SLI/CrossFire support in Intel's reference designs, and a dual-GPU card by default, would have allowed stellar Snow Leopard performance when it is announced soonish. We bet our house on multithreading, but still fail to parallelise GPU computation - old technology available in cheap PC architectures...



    Adobe is working on the very problem you mentioned. There are ways of doing it.



    The new Mac Pro, unlike the ones before it, has two slots with 16 live PCIe lanes each. The second one isn't a double-width slot, though. Who knows what is possible with that?



    I would have liked to see the 4870x2. I would have gone for it.



    ATI's newest pro boards are coming with DisplayPort, so it's possible that we will see one of them before too long.



    Nvidia has a ($2,000) board that's tuned for Photoshop. I'd like to see that here as well.
  • Reply 407 of 506
    Quote:
    Originally Posted by melgross View Post


    I would have liked to see the 4870x2. I would have gone for it.



    ATI's newest pro boards are coming with DisplayPort, so it's possible that we will see one of them before too long.



    Nvidia has a ($2,000) board that's tuned for Photoshop. I'd like to see that here as well.



    Indeed, I'm keeping my fingers crossed that there is nothing on the motherboard which blocks multi-GPU cards in a future update. I'd love to see a 4870 X2 board (and Quadro FX and Adobe-friendly Quadro CX) announced when Snow Leopard comes, and a CS5 preview showing 64-bit PS with OpenCL support. Oh, and Final Cut Studio finally dragged out of the Carbon age; it is embarrassing that Apple's own pro apps are languishing in 32-bit Carbon antiquity...



    Let's dream on!
  • Reply 408 of 506
    messiah Posts: 1,689 member
    Quote:
    Originally Posted by melgross View Post


    If they left the second socket, the machine would have cost even more.



    I'm not so sure how much more it would have cost.



    At the moment they've had to pay for the development and testing of two separate daughter boards (or two variations of a single board), not to mention the costs of setting up two different production lines (tooling etc.). There are then the ongoing costs of maintaining two separate components and the logistics of how many of each need to be manufactured. Somebody somewhere has to monitor and manage two separate parts, and co-ordinate which go where and when. Factor in that the two separate daughter cards can't be cross-purposed if required, and you start to see why one of the Golden Rules taught to product designers is 'Commonality'.



    Commonality teaches us that it's often more cost effective in the long term to manufacture a single component that is marginally more expensive than to manufacture and manage multiple lower-cost devices.



    It's entirely likely that at some point or another Apple are going to have thousands of dual-socket daughter cards sitting on the shelf, and have to order more single-socket boards when they could have just used a common part (or vice-versa). Extrapolate this out to the worldwide spare parts infrastructure and there is a very compelling argument for the adoption of Commonality from the get-go.



    For the sake of the 'cost' of a second processor socket from Foxconn, Apple has sure made a rod for its own back. Then there's the cost of the single-processor heatsink, which is a different design to the dual-processor heatsinks - again, Commonality would suggest that it was simpler and more cost effective to develop a single part that would function either way.



    No, I believe that if there are any cost benefits, they will be extremely thin or indeed non-existent once everything is factored in. I believe that Apple has introduced the two-tier approach to ensure that if you buy cheap - you stay cheap.



    In virtually every other product design the world over, where a consumer isn't expected to upgrade internal components after the fact, you'll find that the manufacturers will design a common chassis that can accommodate multiple parts. This is true whether you are designing the engine bay for a car or the frame of a mountain bike. The only reason to go out of your way and incur the extra costs of developing unnecessary variations is if you want to ensure that customers aren't able to upgrade certain components after the fact - thereby forcing them into paying a premium for the product that hasn't been artificially limited.



    Just my thoughts!
  • Reply 409 of 506
    haggar Posts: 1,568 member
    It looks like Abster2Core better get out his drill bits because the memory and processor board is mounted on a metal tray. And according to Abster2Core, putting a board on a metal tray is a pain and you need drill bits in order to replace the board. So I would expect him to dismiss and criticize the new tray design of the Mac Pro, just like he dismisses and criticizes the idea of motherboard trays for PCs. But I guess those comments only apply to computers that are not Macs.
  • Reply 410 of 506
    melgross Posts: 33,580 member
    Quote:
    Originally Posted by Messiah View Post


    I'm not so sure how much more it would have cost.



    At the moment they've had to pay for the development and testing of two separate daughter boards (or two variations of a single board), not to mention the costs of setting up two different production lines (tooling etc.). There are then the ongoing costs of maintaining two separate components and the logistics of how many of each need to be manufactured. Somebody somewhere has to monitor and manage two separate parts, and co-ordinate which go where and when. Factor in that the two separate daughter cards can't be cross-purposed if required, and you start to see why one of the Golden Rules taught to product designers is 'Commonality'.



    Commonality teaches us that it's often more cost effective in the long term to manufacture a single component that is marginally more expensive than to manufacture and manage multiple lower-cost devices.



    It's entirely likely that at some point or another Apple are going to have thousands of dual-socket daughter cards sitting on the shelf, and have to order more single-socket boards when they could have just used a common part (or vice-versa). Extrapolate this out to the worldwide spare parts infrastructure and there is a very compelling argument for the adoption of Commonality from the get-go.



    For the sake of the 'cost' of a second processor socket from Foxconn, Apple has sure made a rod for its own back. Then there's the cost of the single-processor heatsink, which is a different design to the dual-processor heatsinks - again, Commonality would suggest that it was simpler and more cost effective to develop a single part that would function either way.



    No, I believe that if there are any cost benefits, they will be extremely thin or indeed non-existent once everything is factored in. I believe that Apple has introduced the two-tier approach to ensure that if you buy cheap - you stay cheap.



    In virtually every other product design the world over, where a consumer isn't expected to upgrade internal components after the fact, you'll find that the manufacturers will design a common chassis that can accommodate multiple parts. This is true whether you are designing the engine bay for a car or the frame of a mountain bike. The only reason to go out of your way and incur the extra costs of developing unnecessary variations is if you want to ensure that customers aren't able to upgrade certain components after the fact - thereby forcing them into paying a premium for the product that hasn't been artificially limited.



    Just my thoughts!



    I understand the issues. But it would still have cost more; how much is a different question. But as the price is already high by some people's reckoning, and Apple's boards are always expensive, even a bit more might seem too much.



    I'm not impressed by the argument that those machines need 32 GB RAM. Entities buying that machine have more modest needs to begin with, and it's not likely they would be purchasing $5,000 memory for a $2,500 computer. I wouldn't be surprised if it did take 16 GB though, which is plenty, and far cheaper.
  • Reply 411 of 506
    melgross Posts: 33,580 member
    Quote:
    Originally Posted by Haggar View Post


    It looks like Abster2Core better get out his drill bits because the memory and processor board is mounted on a metal tray. And according to Abster2Core, putting a board on a metal tray is a pain and you need drill bits in order to replace the board. So I would expect him to dismiss and criticize the new tray design of the Mac Pro, just like he dismisses and criticizes the idea of motherboard trays for PCs. But I guess those comments only apply to computers that are not Macs.



    Drill bits? Screwdrivers, maybe. I'll find out Friday.
  • Reply 412 of 506
    messiah Posts: 1,689 member
    Quote:
    Originally Posted by melgross View Post


    I'm not impressed by the argument that those machines need 32 GB RAM. Entities buying that machine have more modest needs to begin with, and it's not likely they would be purchasing $5,000 memory for a $2,500 computer. I wouldn't be surprised if it did take 16 GB though, which is plenty, and far cheaper.



    To my mind, expansion is all about future proofing, rather than how much you can stuff in it on day one. My take on it is that if you have to stuff it to the gills on day one you've bought the wrong machine. This is especially true of a professional machine.



    If I want to retain the option of 8GB+ it will cost me £2,499 today, whereas with the previous generation Mac Pro it would only have cost £1,749. That's a £750 premium to retain that option.
  • Reply 413 of 506
    Just to note that the Cinebench numbers for the 2.26 GHz Nehalem were updated, and it is now in line with the other Nehalem processors when correcting for frequency:



    http://tesselator.gpmod.com/Images/_...10_Numbers.jpg



    (you may need to reload this image if you already saw it elsewhere). We now have:



    4074 / 2.93 GHz = 1390 per GHz

    3572 / 2.66 GHz = 1343 per GHz

    3142 / 2.26 GHz = 1390 per GHz



    The multithreaded score is up to 20,138, clearly outpacing the 2.8 GHz Harpertowns...
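The normalisation above is just score divided by clock; a quick sketch for checking other results the same way, using the figures quoted in the post above:

```python
# Cinebench scores quoted above, keyed by clock speed in GHz
scores = {2.93: 4074, 2.66: 3572, 2.26: 3142}

# Divide each score by its clock to get a per-GHz figure
per_ghz = {ghz: round(score / ghz) for ghz, score in scores.items()}

for ghz in sorted(per_ghz, reverse=True):
    print(f"{scores[ghz]} / {ghz} GHz = {per_ghz[ghz]} per GHz")
# All three land in the 1343-1390 per-GHz band, so the 2.26 now
# scales with clock like its siblings instead of trailing anomalously.
```

The per-GHz spread is only about 3.5%, which is what you would expect from chips that differ only in clock speed.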
  • Reply 414 of 506
    melgross Posts: 33,580 member
    Quote:
    Originally Posted by Messiah View Post


    To my mind, expansion is all about future proofing, rather than how much you can stuff in it on day one. My take on it is that if you have to stuff it to the gills on day one you've bought the wrong machine. This is especially true of a professional machine.



    If I want to retain the option of 8GB+ it will cost me £2,499 today, whereas with the previous generation Mac Pro it would only have cost £1,749. That's a £750 premium to retain that option.



    I've not noticed "real" workstations coming down in price the past few years, if anything they are all increasing.



    This reminds me of when people ask me which camera to buy. It's almost always a Canon or Nikon. Now, I use Canons, so I have nothing against them, and Nikon makes fine cameras as well. But it turns out that's not why people ask about them. It's often because both have dozens of lenses available.



    Considering that most people asking are amateurs, who will never buy more than two or three cheaper lenses from any camera company or third-party lens makers, it's odd that they should care how many are available from the maker of the camera they want. But there's that "just in case" about the $20,000 tele that only Canon or Nikon makes to special order.



    I tell them to forget it. Buy a camera that actually meets their needs, not some hypothetical desire that they will never need, or fulfill.



    The same thing is true here.



    It would be better to buy the dual-CPU machine and then fill it with cheaper RAM to the 16 GB limit, if the single-CPU model really is limited to 8 GB (which we still don't know). Even if it's not limited to 2 GB DIMMs, the cost would be prohibitive right now.



    It will be much cheaper that way, and will perform much better, and you could do it more quickly.



    So no matter how you look at it, wanting to fill a single-CPU machine with 32 GB RAM is pointless, and even 16 GB is very expensive.
  • Reply 415 of 506
    melgross Posts: 33,580 member
    Quote:
    Originally Posted by irondoll View Post


    Just to note that the Cinebench numbers for the 2.26 GHz Nehalem were updated, and it is now in line with the other Nehalem processors when correcting for frequency:



    http://tesselator.gpmod.com/Images/_...10_Numbers.jpg



    (you may need to reload this image if you already saw it elsewhere). We now have:



    4074 / 2.93 GHz = 1390 per GHz

    3572 / 2.66 GHz = 1343 per GHz

    3142 / 2.26 GHz = 1390 per GHz



    The multithreaded score is up to 20,138, clearly outpacing the 2.8 GHz Harpertowns...



    Good you caught that.



    I said that the numbers had to be wrong.
  • Reply 416 of 506
    Quote:

    I would have liked to see the 4870x2. I would have gone for it.



    Would you?



    I'm sure you can think of a perfectly valid reason why Apple shouldn't include such a beast in their overpriced 'workstation'?



    I can. But then, I love being an Apple apologist.



    Lemon Bon Bon.
  • Reply 417 of 506
    Still annoyed at the price hike.



    Still annoyed at the lack of high end GPU.



    Still annoyed Apple won't use consumer tower cpus.



    Still annoyed Apple don't offer consumer quad cpus.



    Still annoyed...well. I think you get the idea.



    Lemon Bon Bon.



    PS. 'I know Lemon Bon Bon, you ask...but which one are you buying...?'



    Uhm. Dunno. Still gestating. I have a team of medics still pulling me out of the coma. Y'know...the price hikes...(*Stares...catatonic...)
  • Reply 418 of 506
    melgross Posts: 33,580 member
    Quote:
    Originally Posted by Lemon Bon Bon. View Post


    Would you?



    I'm sure you can think of a perfectly valid reason why Apple shouldn't include such a beast in their overpriced 'workstation'?



    I can. But then, I love being an Apple apologist.



    Lemon Bon Bon.



    Oh come on! You are really getting sillier by the day.



    You KNOW I'm not an apologist.



    So, yes, Apple should have included it. I've been stating for several years now, since third parties aren't offering boards, that Apple should make some effort to give us a wider selection.



    Even if it means that they have to PAY board makers to do so!
  • Reply 419 of 506
    e1618978 Posts: 6,075 member
    Quote:
    Originally Posted by Lemon Bon Bon. View Post


    Still annoyed at the lack of high end GPU.



    What GPU did you want? I thought that the ATI one was the fastest one out right now (based on Tom's Hardware reviews).
  • Reply 420 of 506
    nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by e1618978 View Post


    What GPU did you want? I thought that the ATI one was the fastest one out right now (based on Tom's Hardware reviews).



    GTX 295