Benchmarks of 2009 iMacs, Mac minis show negligible speed-ups


Comments

  • Reply 221 of 246
tenobell Posts: 7,014 member
    Quote:
    Originally Posted by nvidia2008 View Post


The thing is, the iMac has always been the most important and highest selling Mac... until about last year or a bit before that, whenever the turning point was that laptops eclipsed desktops and the MacBook usurped the legendary iMac.



    Apple's notebook sales really began to dramatically outpace its desktop sales in 2006.



    Quote:

Clearly the Apple iMac requires a revolution in and of itself, with regard to Apple redefining the desktop computer. Evolutionary progress on the Mac mini and iMac is, I guess, a safe bet in these times. US unemployment is at 8% currently, IIRC.



    Unless Apple sticks a rechargeable battery and a handle on the iMac, there's nothing revolutionary to be done to change its future direction.
  • Reply 222 of 246
melgross Posts: 33,600 member
    Quote:
    Originally Posted by vinea View Post


Really? Pray tell, what are you doing that requires the 7900GTX, which is completely outclassed by the 9600 GT barely two years later?



I use Archicad, among other apps such as Motion with FCS. I imagine that Photoshop will also get a boost after 10.6 comes out.



    Quote:

Well, the Gulftowns (desktop chip, not Xeon) will still be LGA 1366, but Intel still isn't sure if it will work with existing X58 boards. That uncertainty carries to the Xeons. Intel's 6-core Dunningtons are Caneland processors, not Seaburgs.



The Gulftown is a server/desktop chip, and it uses the same LGA-1366 socket. It will work, at least according to Anandtech, who usually gets these things right.



    Quote:

Beckton is the Nehalem 8-core and I dunno that it will be compatible with the Gainstowns (doubtful). It seems likely that Intel might do a 6-core Nehalem Xeon even if it hasn't announced it yet. But there's no certainty that they will be clocked very high. The Dunningtons top out at 2.66GHz despite there being 3.4GHz Harpertowns and 2.93GHz Tigertons.



The Beckton is not a drop-in part, as it uses the LGA-1567 socket. Apple would need a new mobo for that. If Apple is still intending to bring the Mac Pro into higher territory, they may go that way next year, but I don't think so.



    Quote:

Given the roadmap has Gainstown at 3.2GHz at the top end, I wouldn't expect it to be much more than a Dunnington-like Nehalem... that may be destined for 4-way and up servers and not workstations like the Mac Pro.



If you're still talking about Beckton here, I agree, though the highest speeds are malleable right now, as Intel's been making excellent progress in ramping up. It's happened much more quickly than with any new architecture in recent history, with fewer tapeouts. It's thought that we may see 3.4GHz.



    Quote:

So the odds of you upgrading to 6-core 3.3GHz chips in your current Mac Pro seem to be 50-50 at best. Either it will not be much faster than 2.66GHz, or it will be for the other Xeon line, or both, like the Dunningtons... which mostly run in blade servers and other high-density applications.



    The odds are actually rather excellent.



    Quote:

And in any case it will likely cost you $1600+ per chip on the retail market, which is what the 3.4 X5492s cost today. You're seriously going to drop $3200 for that upgrade when you can get a new Mac Pro instead?



We'll see about the prices, which haven't been set yet. But I saved $1,200 by sticking with the 2.66 for now, so it's not that much of a stretch for me. If spending another $3,200 rather than another $5,000+ is the choice, you bet!



    Quote:

Even if the top end Mac Pro is $6K, who on earth thinks that's a good deal? God help you if you manage to munge up the install and break something in the process. You now have a $5K+ doorstop or a $1600 piece of costume jewelry.



A good deal is in the wallet of the buyer. I like these machines very much. As I'm retired, I haven't been buying a new machine every three years and updating in between, as I always used to do. But now I've decided to go for it, as I've been waiting for these chips. I'm also doing work I haven't done for a few years.



    I've converted a number of Mac Pros to 4 core, and never had a problem, though it's a pain. I know of others who have done it as well. These new machines look MUCH easier to work on.



    Quote:

That's not the point. That Apple makes it hard to buy cheaper and upgrade more often is an issue, but the general strategy is sound.



That's exactly the point, because I'm not talking about buying a cheaper machine; I'm talking about the difference between buying a low-end version of the same model vs. the higher-end version.



So, for me, the difference would be between buying the single-processor 2.26 Mac Pro or the dual-processor 2.66 Mac Pro, which is what I did order.



    It's NEVER a question of going for something else.



    Quote:

That's the CURRENT upgrade. If not this, then what did your many friends upgrade from and what did they upgrade to?



The way you wrote that, it seemed as though you were saying something a bit different. People who bought an earlier model of the Mac Pro, with the older, slower 2-core chips, could upgrade to the faster 4-core chips, whichever number they may have. That's quite an upgrade.
  • Reply 223 of 246
nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by TenoBell View Post


    Apple's notebook sales really began to dramatically outpace its desktop sales in 2006.



Which means maybe they saw the first aluminium iMac as the last true Boom! of the prominent iMac line.



    Quote:
    Originally Posted by TenoBell View Post


    Unless Apple sticks a rechargeable battery and a handle on the iMac, there's nothing revolutionary to be done to change its future direction.



Well, that's the kind of innovation Apple should be producing. That's the kind of innovation we expect. An all-in-one that has a large screen (20"+) but is also somehow foldable and portable. How about a "desktop" that is actually a unit sitting wirelessly between your TV and a screen you can shift about the house? I mean, these are crazy ideas, but Apple should be finding something that works, and they have the potential to fully revolutionize the desktop computer.



What do we want from a "desktop"? What *is* a desktop? What is a computer? What is a screen? Why would I not want a laptop? This is the kind of R&D that I'm sure is going on, but in 2009 I wonder if we will see the results of such R&D... or maybe R&D at Apple has shifted to other areas.



    Quote:
    Originally Posted by melgross View Post


    You can be sure that Apple knows of every development its partners are undergoing well before we hear of it.



    Why do you think they went from the PPC when the G5 was running much cooler than the Prescott, and IBM was moving it up in speed faster than the Prescott was being ramped up?



    Remember how shocked we were when Jobs showed that chart of how Intel's power/performance was going to move in the next several years vs IBM's PPC?



    It was almost unbelievable at the time, but well before Intel announced the first Yonah, and the Core chips, Apple knew.



    They always know.



I was being a bit facetious perhaps. I was subtly (or not so subtly) implying Apple knew *exactly* WTF was going on at Nvidia, ATI and Intel. Which means Apple *knew* Intel couldn't produce any quad-core (Core 2 or Nehalem) for laptops in the 1st half of 2009. Apple *knew* Nvidia's GTX 200+ and ATI's RV770 would remain big, long, hot desktop cards, and that this tech couldn't realistically be brought down to laptops in the 1st half of 2009.



Which means Apple knew EXACTLY that the iMac in its form factor could never be launched in the 1st half of 2009 with quad-core or really strong graphics across the line*, which seems to be the big expectation, even expectations of Core i7...!



*The Radeon 4850 is the real amazing thing in squeezing it into the 24" iMac, the saving grace of the new iMacs, if you will. This is a desktop Radeon 4850.



  • Reply 224 of 246
jeffdm Posts: 12,953 member
    Quote:
    Originally Posted by nvidia2008 View Post


*The Radeon 4850 is the real amazing thing in squeezing it into the 24" iMac, the saving grace of the new iMacs, if you will. This is a desktop Radeon 4850.



    Might it be a special version that's not the same as the desktop chip? I thought people were saying the old 8800 in the iMac wasn't as fast as the regular desktop chip, or even necessarily the same microarchitecture. I think there were compromises made with an older ATI chip put into iMacs a couple generations ago as well.
  • Reply 225 of 246
nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by JeffDM View Post


    Might it be a special version that's not the same as the desktop chip? I thought people were saying the old 8800 in the iMac wasn't as fast as the regular desktop chip, or even necessarily the same microarchitecture.



I think the board layout may be different, just like the 8800GS... But the fact that they are calling it a 4850 means it should still be a 4850, though probably underclocked; hopefully it's not crippled any other way.



    We won't really know until benchmarks are run.



    You have raised some interesting points nonetheless.
  • Reply 226 of 246
Marvin Posts: 15,443 moderator
    Quote:
    Originally Posted by vinea View Post


    CINEBENCH tests



    Mini: OpenGL 3246, Single Core Render: 2271, Multi Core Render: 4374




On the same Mini, but with a 7200 rpm drive and 4GB RAM, I get in Cinebench 10:



    Mini: OpenGL 3869, Single Core Render: 2199, Multi Core Render: 3904



I didn't render from a clean boot, and Safari seems to be using 10-15% CPU, so that might explain the lower CPU scores. The OpenGL benchmark is what I was hoping for. Using matched memory is supposed to help graphics performance the most, as the VRAM is shared on it. A 20-25% speedup is typical from using matched pairs of RAM.



I guess Apple don't want a surplus of worthless 512MB modules after upgrades, so they went with single 1GB modules in the low end, but it looks like that affects graphics performance considerably. They should offer 2GB on the low end and a 4GB upgrade.



    The score you ran might also not be that low. You can hit the OpenGL standard button a number of times and average it out. The difference may not be as much as 20%.
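For what it's worth, here is a minimal Python sketch of that run-it-a-few-times-and-average idea, using the two posted OpenGL scores (3246 and 3869); the repeated-run lists are hypothetical stand-ins for whatever Cinebench reports on a given machine:

```python
# Minimal sketch: average repeated Cinebench OpenGL runs and compare two
# Mac mini configurations. The repeated-run lists below are hypothetical;
# only the first score in each list (3246 and 3869) comes from the posts above.

def average(scores):
    """Mean of repeated benchmark runs, to smooth out run-to-run noise."""
    return sum(scores) / len(scores)

runs_unmatched_ram = [3246, 3280, 3210]  # single/unmatched modules (hypothetical repeats)
runs_matched_4gb = [3869, 3840, 3905]    # matched 4GB pair (hypothetical repeats)

base = average(runs_unmatched_ram)
fast = average(runs_matched_4gb)

# The single posted scores alone give (3869 - 3246) / 3246, about 19%,
# which is just under the "20-25% typical" figure claimed above.
print(f"OpenGL speedup from matched RAM: {100 * (fast - base) / base:.1f}%")
```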



    What is interesting is that the 9400M scores lower than the 2400XT in the old iMac. The 2400XT was 15% faster. This would suggest that Apple have downgraded the GPU in their refresh again. This may be for OpenCL compatibility but I reckon the 2400XT should be just as compatible.
  • Reply 227 of 246
vinea Posts: 5,585 member
    Quote:
    Originally Posted by Marvin View Post


    What is interesting is that the 9400M scores lower than the 2400XT in the old iMac. The 2400XT was 15% faster. This would suggest that Apple have downgraded the GPU in their refresh again. This may be for OpenCL compatibility but I reckon the 2400XT should be just as compatible.



    Yah, that's a shame but not unexpected when they went to an integrated solution for the low end iMacs. This way they also end up with more 9400M buys and a lower cost for everything.



    Of course, that doesn't really translate into savings on our part.
  • Reply 228 of 246
melgross Posts: 33,600 member
    Quote:
    Originally Posted by nvidia2008 View Post


Well, that's the kind of innovation Apple should be producing. That's the kind of innovation we expect. An all-in-one that has a large screen (20"+) but is also somehow foldable and portable. How about a "desktop" that is actually a unit sitting wirelessly between your TV and a screen you can shift about the house? I mean, these are crazy ideas, but Apple should be finding something that works, and they have the potential to fully revolutionize the desktop computer.



What do we want from a "desktop"? What *is* a desktop? What is a computer? What is a screen? Why would I not want a laptop? This is the kind of R&D that I'm sure is going on, but in 2009 I wonder if we will see the results of such R&D... or maybe R&D at Apple has shifted to other areas.







I was being a bit facetious perhaps. I was subtly (or not so subtly) implying Apple knew *exactly* WTF was going on at Nvidia, ATI and Intel. Which means Apple *knew* Intel couldn't produce any quad-core (Core 2 or Nehalem) for laptops in the 1st half of 2009. Apple *knew* Nvidia's GTX 200+ and ATI's RV770 would remain big, long, hot desktop cards, and that this tech couldn't realistically be brought down to laptops in the 1st half of 2009.



Which means Apple knew EXACTLY that the iMac in its form factor could never be launched in the 1st half of 2009 with quad-core or really strong graphics across the line*, which seems to be the big expectation, even expectations of Core i7...!



*The Radeon 4850 is the real amazing thing in squeezing it into the 24" iMac, the saving grace of the new iMacs, if you will. This is a desktop Radeon 4850.







Ok, that's almost the exact opposite feeling I got from your former post. I somehow missed the sarcasm.



I also think it's pretty good to have that option. While some will never be happy, this is a pretty damn good upgrade.



Something else that many don't realize is that ATI's stuff is much better in 3D programs such as Motion etc. than Nvidia's stuff. Remember that the old cheap low-end ATI board for the last Mac Pro does much better in pro apps than the higher-priced Nvidia product. Even after Nvidia saw the bad publicity they were getting from that, and helped Apple improve the crappy drivers they always have, the performance was still notably worse.



The same thing is true for hardware decoding and encoding of video. ATI's results are always better than Nvidia's, and this goes back a very long way.
  • Reply 229 of 246
melgross Posts: 33,600 member
    Quote:
    Originally Posted by JeffDM View Post


    Might it be a special version that's not the same as the desktop chip? I thought people were saying the old 8800 in the iMac wasn't as fast as the regular desktop chip, or even necessarily the same microarchitecture. I think there were compromises made with an older ATI chip put into iMacs a couple generations ago as well.



    The difference with the Nvidia was just the speed. The chips are the same.



Some of that problem with the Nvidia chips had to do with the now infamous solder problems Nvidia has been having, and apparently still is. Too much heat makes the Nvidia product fail. We're now seeing this on some of the new 17" MBPs. PC products have been badly hit by this.



    ATI isn't having this problem.
  • Reply 230 of 246
futurepastnow Posts: 1,772 member
    Quote:
    Originally Posted by JeffDM View Post


    Might it be a special version that's not the same as the desktop chip? I thought people were saying the old 8800 in the iMac wasn't as fast as the regular desktop chip, or even necessarily the same microarchitecture. I think there were compromises made with an older ATI chip put into iMacs a couple generations ago as well.



The Mobility Radeon 4850, which the iMac almost certainly uses, has the same 800 stream processors as the desktop version. It's just clocked lower and run at a lower voltage.



    You're right that the 8800GS in the last iMac had no relation to the desktop 8800GS, though. Nvidia plays that game a lot with their mobile GPUs. That should not be the case with the 4850.
  • Reply 231 of 246
vinea Posts: 5,585 member
    Quote:
    Originally Posted by melgross View Post


Something else that many don't realize is that ATI's stuff is much better in 3D programs such as Motion etc. than Nvidia's stuff. Remember that the old cheap low-end ATI board for the last Mac Pro does much better in pro apps than the higher-priced Nvidia product. Even after Nvidia saw the bad publicity they were getting from that, and helped Apple improve the crappy drivers they always have, the performance was still notably worse.



The same thing is true for hardware decoding and encoding of video. ATI's results are always better than Nvidia's, and this goes back a very long way.



For the 1st-gen Mac Pro, that's not true.



    http://www.kenstone.net/fcp_homepage...s_mac_pro.html



    Core Image is not 3D. The 8800 killed the 2600XT in 3D. Core Image uses the shader language for pixel level effects. This isn't the same as 3D rendering ability. If you were a Motion user then the 2600XT was better. If you were a Maya user then the 8800GT was better. When the 8800GT came out the GLSL linker was broken in Forceware which is why performance was awesome in 3D apps and poor in Motion (and like 2 games...2nd life and Wurm). It was fixed at some point in Forceware and likely that's what ended up in the Leopard "graphics patch".



    Traditionally nVidia supported OpenGL better than ATI and I've always favored the Forceware over Catalyst (especially during that Catalyst 7.12 fiasco). Amusingly, ATI just borked GLSL support in Catalyst 9 after nVidia got their OpenGL 3.0/GLSL 1.30 house in order. There's a good bit of complaining about how bad the Catalyst 9.1/9.2 drivers are for OpenGL.



    But this is once again the "the only pro apps that count are the ones I use" syndrome. nVidia sucks at all Pro apps because it works poorly for Motion and Core Image based pro apps. Mmmmkay.



    On the plus side, it does show that the buy cheaper and upgrade more often strategy is superior in this scenario.
  • Reply 232 of 246
So am I correct in seeing that all the base model Macs (iMac, mini, MacBook) now use the exact same NVIDIA GeForce 9400M chip with memory shared --well, allocated really-- from main RAM? In an amount of 256 MB, except for the ham-strung 1GB base mini, which shifts to a 256 MB graphics RAM allocation as soon as you put in a second GB of main RAM.



    That (to me) really points out that the base mini model configuration is intentionally 'dumbed' down with only 1 GB RAM.



As my ancient (and no longer working) iMac G5 (ALS/2005) had a Radeon 9600 with dedicated 128 MB video RAM, I am wondering how much better (if any) the shared GeForce 9400M is.

I am trying to figure out what my best option for a Mac replacement is now, given the recent minor speed revisions in the Early 2009 releases. (Yes, the new mini has a big graphics improvement over the old mini, but I am comparing how it would seem to me vis-à-vis my iMac G5.)



    Are there any sites that do generalized Mac Video chip comparatives?
  • Reply 233 of 246
nvidia2008 Posts: 9,262 member
    Quick run through on my tests:



Cinebench OpenGL test

MacBook 2GHz 9400M: ~4000
iMac 20" 2.4GHz 2400XT: ~4900

Xbench

MacBook 2GHz 9400M: Quartz 157, OpenGL 138
iMac 20" 2.4GHz 2400XT: Quartz 170, OpenGL 180
  • Reply 234 of 246
nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by Bruce Young View Post


So am I correct in seeing that all the base model Macs (iMac, mini, MacBook) now use the exact same NVIDIA GeForce 9400M chip with memory shared --well, allocated really-- from main RAM? In an amount of 256 MB, except for the ham-strung 1GB base mini, which shifts to a 256 MB graphics RAM allocation as soon as you put in a second GB of main RAM.



    That (to me) really points out that the base mini model configuration is intentionally 'dumbed' down with only 1 GB RAM.



As my ancient (and no longer working) iMac G5 (ALS/2005) had a Radeon 9600 with dedicated 128 MB video RAM, I am wondering how much better (if any) the shared GeForce 9400M is.

I am trying to figure out what my best option for a Mac replacement is now, given the recent minor speed revisions in the Early 2009 releases. (Yes, the new mini has a big graphics improvement over the old mini, but I am comparing how it would seem to me vis-à-vis my iMac G5.)



    Are there any sites that do generalized Mac Video chip comparatives?



Whenever there is 1GB of RAM, the 9400M will use 128MB for VRAM. With 2GB or higher, it will use 256MB of VRAM.
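As a minimal sketch of that allocation rule (the thresholds simply restate the post above; actual Apple firmware behavior isn't verified here):

```python
# Sketch of the 9400M shared-VRAM rule as described in the post above:
# 1GB of system RAM -> 128MB reserved for graphics; 2GB or more -> 256MB.
# These thresholds restate the post, not verified Apple firmware behavior.

def vram_allocation_mb(system_ram_gb: int) -> int:
    """Return the VRAM carve-out, in MB, for a given amount of system RAM."""
    return 128 if system_ram_gb < 2 else 256

for ram_gb in (1, 2, 4):
    print(f"{ram_gb}GB RAM -> {vram_allocation_mb(ram_gb)}MB shared VRAM")
```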



    The 9400M is definitely slower than the 2400XT.

    My tests:



Cinebench OpenGL

MacBook 2GHz 9400M: ~4000
iMac 20" 2.4GHz 2400XT: ~4900

Xbench

MacBook 2GHz 9400M: Quartz 157, OpenGL 138
iMac 20" 2.4GHz 2400XT: Quartz 170, OpenGL 180



    The iMac Alu 20" 2400XT or anything 9400M will probably be double the speed of your ATI 9600.



Go for the previous generation iMac Alu (iMac 20" 2400XT with 128MB VRAM). You can play most modern games at medium settings at 1680x1050 (without antialiasing), with overall reasonable application performance (iLife, iWork, Mac OS X Leopard, perhaps Aperture, maybe even Final Cut Studio 2 if you are just starting out with it).
  • Reply 235 of 246
vinea Posts: 5,585 member
    Quote:
    Originally Posted by nvidia2008 View Post




Xbench

MacBook 2GHz 9400M: Quartz 157, OpenGL 138
iMac 20" 2.4GHz 2400XT: Quartz 170, OpenGL 180



    My XBench scores on the 9400M were dismal. I don't recall what they were but they were lower than the GMA X3100 scores I saw posted.
  • Reply 236 of 246
    Quote:
    Originally Posted by nvidia2008 View Post


Whenever there is 1GB of RAM, the 9400M will use 128MB for VRAM. With 2GB or higher, it will use 256MB of VRAM.



    The 9400M is definitely slower than the 2400XT.



I wouldn't say it is slower based on benchmarks like Xbench. Actual gameplay tests are what matters. Sometimes (and I'm not saying that will happen here) they tell a different story.
  • Reply 237 of 246
nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by FuturePastNow View Post


I wouldn't say it is slower based on benchmarks like Xbench. Actual gameplay tests are what matters. Sometimes (and I'm not saying that will happen here) they tell a different story.



Agreed. Ideally I would have liked to bench games in XP like UT3, Left4Dead, etc., and bench Mac games like C&C3 and NFS: Carbon, etc.



I'm fairly confident (maybe call it a gut feeling) the 2400XT in the iMac will still perform better (15%?) than the 9400M. Being a discrete card with dedicated VRAM would give it the edge, particularly once you start talking about DirectX 9.0c... Again, gut feeling here.



For me the point is moot. Based on Marvin's tests on the Mac Mini, the 9400M is impressive as an integrated chip, but if I go "back" to PC gaming (from not really gaming nowadays), it is just not an option.



    Maybe I'm just trying to find excuses not to blow $1,000+ on any new Macs!
  • Reply 238 of 246
nvidia2008 Posts: 9,262 member
    Quote:
    Originally Posted by FuturePastNow View Post


The Mobility Radeon 4850, which the iMac almost certainly uses, has the same 800 stream processors as the desktop version. It's just clocked lower and run at a lower voltage.



    You're right that the 8800GS in the last iMac had no relation to the desktop 8800GS, though. Nvidia plays that game a lot with their mobile GPUs. That should not be the case with the 4850.



Interestingly, on Apple's spec pages it is *not* called "Mobility", just Radeon 4850. Is ATI playing Nvidia-style naming trickery? Or Apple?



GT120 (rebranded 9500GT) and GT130 are listed as desktop cards. I wonder how crippled/underclocked they are (if at all) in the iMacs.
  • Reply 239 of 246
melgross Posts: 33,600 member
    Quote:
    Originally Posted by vinea View Post


For the 1st-gen Mac Pro, that's not true.



    http://www.kenstone.net/fcp_homepage...s_mac_pro.html



    Core Image is not 3D. The 8800 killed the 2600XT in 3D. Core Image uses the shader language for pixel level effects. This isn't the same as 3D rendering ability. If you were a Motion user then the 2600XT was better. If you were a Maya user then the 8800GT was better. When the 8800GT came out the GLSL linker was broken in Forceware which is why performance was awesome in 3D apps and poor in Motion (and like 2 games...2nd life and Wurm). It was fixed at some point in Forceware and likely that's what ended up in the Leopard "graphics patch".



    Traditionally nVidia supported OpenGL better than ATI and I've always favored the Forceware over Catalyst (especially during that Catalyst 7.12 fiasco). Amusingly, ATI just borked GLSL support in Catalyst 9 after nVidia got their OpenGL 3.0/GLSL 1.30 house in order. There's a good bit of complaining about how bad the Catalyst 9.1/9.2 drivers are for OpenGL.



    But this is once again the "the only pro apps that count are the ones I use" syndrome. nVidia sucks at all Pro apps because it works poorly for Motion and Core Image based pro apps. Mmmmkay.



    On the plus side, it does show that the buy cheaper and upgrade more often strategy is superior in this scenario.



    It works faster in a number of apps besides Motion. From my own experience, it's faster in Archicad as well.



Since many pro apps use Core Image, yes, poor performance there is important. As many 3D pro apps use Core Image, it's important for them.



Even after the patch, it was still no better, and worse in some tests. Not a good buy for pro users.



    We have one set of tests here:



    http://www.barefeats.com/harper10.html



The later ones are here.



    http://www.barefeats.com/imp04.html



    As Barefeats said, it would be better to buy the 3870 than the 8800GT for faster performance, and the 2600 Pro would still be equal.



A more modern ATI card such as the 4870 is still much better than the current best Nvidia for this purpose, and far better than a mid-range Nvidia.
  • Reply 240 of 246
melgross Posts: 33,600 member
    Quote:
    Originally Posted by Bruce Young View Post


So am I correct in seeing that all the base model Macs (iMac, mini, MacBook) now use the exact same NVIDIA GeForce 9400M chip with memory shared --well, allocated really-- from main RAM? In an amount of 256 MB, except for the ham-strung 1GB base mini, which shifts to a 256 MB graphics RAM allocation as soon as you put in a second GB of main RAM.



    That (to me) really points out that the base mini model configuration is intentionally 'dumbed' down with only 1 GB RAM.



As my ancient (and no longer working) iMac G5 (ALS/2005) had a Radeon 9600 with dedicated 128 MB video RAM, I am wondering how much better (if any) the shared GeForce 9400M is.

I am trying to figure out what my best option for a Mac replacement is now, given the recent minor speed revisions in the Early 2009 releases. (Yes, the new mini has a big graphics improvement over the old mini, but I am comparing how it would seem to me vis-à-vis my iMac G5.)



    Are there any sites that do generalized Mac Video chip comparatives?



    It's not intentionally dumbed down. It's a cost issue. If it were dumbed down, then you couldn't add that extra DIMM at all.



    Try Barefeats. They do speed comparisons. You would have to look fairly far back for the G5.



Macworld also does this. They will compare older machines with newer ones, but not across several generations. You would have to look at the older numbers and, assuming that the testing procedures haven't changed, compare them with the newer ones.
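For illustration, here is a rough Python sketch of how you might chain older and newer review numbers through a machine that appears in both result sets, assuming the testing procedure really hasn't changed; all the scores are made-up placeholders, not real Barefeats or Macworld data:

```python
# Hypothetical: estimate how a new Mac compares with an old one when no single
# review benchmarks them head-to-head, by chaining ratios through a machine
# that appears in both the older and newer result sets.
# All scores are invented placeholders, not real Barefeats/Macworld numbers.

old_review = {"iMac G5": 100, "Mac Pro 2006": 250}        # older test run
new_review = {"Mac Pro 2006": 240, "Mac mini 2009": 180}  # newer test run

common = "Mac Pro 2006"
scale = new_review[common] / old_review[common]  # scale factor between runs

# Express the old machine's score on the newer run's scale, then compare.
g5_rescaled = old_review["iMac G5"] * scale
ratio = new_review["Mac mini 2009"] / g5_rescaled
print(f"Estimated mini-vs-G5 speedup: {ratio:.2f}x")
```

The chaining only holds if the shared machine was tested the same way in both runs, which is exactly the caveat about unchanged testing procedures above.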