
Benchmarks of 2009 iMacs, Mac minis show negligible speed-ups - Page 6

post #201 of 247
Quote:
Originally Posted by vinea View Post

Well there are two schools of thought. That's one.

The other is to buy cheap, replace often. The only downside to that is Apple sometimes won't let you do the "upgrade often" part for the mini.

For video cards, that's been my strategy and it works a heck of a lot better than shelling out $500+ for a top end card. 3 revs later, the $99-$120 budget cards will work with more games anyway.

Same for projectors.

Like you say, that's one way of looking at it.

The way I do it, I do get higher performance for at least half of the lifetime of owning the machine. The other way, you never get high performance, but are always hovering around the middle.

Each way is valid. It just depends on how you use your machine.
post #202 of 247
Quote:
Originally Posted by TenoBell View Post

You are absolutely right, a better CMOS sensor and a better lens would make much more of a difference in picture quality than simply more megapixels. The public at large doesn't understand this, because it would be inconvenient for megapixel marketing if the public really understood how imaging works.

Yes. When we get to the tiny sensors used in phone cameras, we're limited by signal-to-noise levels, limited dynamic range, and a host of other problems.

How much resolution is possible before image quality is so low that it doesn't matter? I think 3.2 MP is OK. After all, these are just for snapshots; a 6 x 8 print would be fine. 3.2 MP is also good enough to read a standard bar code, for those who would like payment programs and the like.

A lens that focuses might be handy, but truthfully, the depth of field on these tiny sensors is so great that I haven't seen any advantage on friends' phones that have one.

Better lenses would help, though. Too many of these cameras are terrible at the edges and especially the corners.

But that's optical, so it means a bit more money. A good lens for this small format costs about $40. How many people would like to add another $35 to the cost of their phones for the better lens?
post #203 of 247
Quote:
Originally Posted by copeland View Post

[OPINION]Apple won't play the same clockspeed game when it can transition the iMac to Nehalem mobile.
After the iMac gets its update to the new platform, the mini's clockspeed will rise a bit (hopefully even up to 2.5GHz) and will stay there for at least another year. [/OPINION]

This sounds about right to me.
That is, even though I don't like their unstated reasoning for keeping the Mac mini's CPU speed pushed down low, it will probably not be raised until the iMac has been given a boost. And my guess is that will not happen for at least 8 months, given that both models were just revamped in this "Early 2009" release.

I would rather have seen them let the mini be more powerful by offering a faster CPU in the higher-end mini, as if they were interested in selling the greatest number of best-value computers to people, rather than obsessing over fine-tuning their product line into defined segments. I think in the long run their intentional limiting of the lower-end models loses more sales. Just my opinion.
The Universe is Intelligent and Friendly
post #204 of 247
Quote:
Originally Posted by PB View Post

... the low-power quad chips are running at 65 W at least, which is more than the G5 used in the (then noisy) iMac.

The real problem here is the really slim form factor of the iMac. You cannot do much if the computer is like that.

Quote:
Originally Posted by PB View Post

... Apple has two choices: either keep the current design and use low-power laptop chips, with whatever this may mean, or introduce a new machine with more volume and use normal desktop parts. But Apple likes to innovate and apparently has not found an original way to materialize the second option, most probably because there is none other than the usual one adopted by pretty much everyone else. So there remains the first option...

The Apple obsession with THIN as a design and marketing goal is really working to their disadvantage, in my opinion.

In the iMac, there is really no user value gained by having the iMac case be thinner in each newer generation. As I look at my own white G5 iMac, I never notice how thick or thin it is. The only usage issue is, say, weight (for when I occasionally move it around) or screen real-estate area.
If they would just drop the stupid obsession with thinness, they could have an iMac with the same height and width for the monitor, just a tad thicker to allow for increased air flow. That would allow more high-end (warmer) parts to be used while still achieving good cooling, as there would be more volume for air to move through without needing wind-tunnel fans.

In the laptops, thin may have some value, until you get to the point where you lose structural stiffness, or start throwing away ports and parts (à la the MacBook Air). But the notebook topic is another thread.

I am not sure when "Thin" became the Apple Mantra.
The Universe is Intelligent and Friendly
post #205 of 247
Quote:
I am not sure when "Thin" became the Apple Mantra.

(adding: )
In a flight of fancy, I imagine a possible Apple Engineering/Design meeting going like this:

Engineer/Designers: Hey, we have figured out a way to have the next iMac revision use the latest CPU and GPU chips, and still be quiet and cool (temperature) - but it will have to be 1/2-inch thicker to allow for more air flow around the parts.

Steve Somebody: NO! Dammit. It's got to be THIN! Thinner than the last one. That is what is important -- Being Thin!!
I don't care if it has to use last year's chips or this year's low-end chips. It's just got to be THIN!
The Universe is Intelligent and Friendly
post #206 of 247
Quote:
Originally Posted by melgross View Post

Like you say, that's one way of looking at it.

The way I do it, I do get higher performance for at least half of the lifetime of owning the machine. The other way, you never get high performance, but are always hovering around the middle.

Each way is valid. It just depends on how you use your machine.

And I get higher performance than you for the second half at a lower initial cost and my machine is typically under warranty for the whole period.
post #207 of 247
Quote:
Originally Posted by Bruce Young View Post

I am not sure when "Thin" became the Apple Mantra.

I don't know if there is any one particular point. Their notebooks have been getting progressively thinner with every major redesign; maybe the eMate was an exception, I think that was pretty thin.

Since the iMac G5, they've gotten thinner, at least at the edges. The iMac G5 had a thicker panel than the iMac G4, but that was a major shift; I think getting rid of the semi-bowling ball was a good idea, though. Every iPod since the first 10GB model has been thinner too. The iPhone 3G was a tiny bit thicker in the middle than the original, but being thinner at the edges makes it seem thinner.
post #208 of 247
Macworld: Benchmarks: New Mac minis http://www.macworld.com/article/1392...mini_2009.html

Though the new systems appear identical to the previous Mac mini on the surface, there are some important changes internally—changes that have a positive impact to the tune of a 21 percent increase in overall system performance, according to our testing.

So much for depending solely on clock speed to make a decision.
post #209 of 247
Quote:
Originally Posted by vinea View Post

And I get higher performance than you for the second half at a lower initial cost and my machine is typically under warranty for the whole period.

You may, but you may not. It depends on the difference in performance from the beginning, and how long it's kept.

It isn't always an issue of cost. I usually keep video cards (or at least I did when they were easier to get) until the performance drops to the level of just above the middle range of the latest series. We may be able to do that again.

With computers, it used to be easy to upgrade the CPU to move two generations ahead. With the requirements of the G5's cooling, it became impossible to replace them, and no third party ever tried.

With Intel chips, we can never go back to where we were with the G4 and earlier, but we can move current chips off, and use chips from the "tick" of the newer process, gaining a fairly large amount of performance. I know a number of people who did that with their Mac Pros. I'm planning on doing it with my new one in a couple of years, just before the next "tock".

I've only needed Apple's warranty once, long ago, for my 950. Other than that, none of my many machines, at home or at work, has ever failed.
post #210 of 247
Quote:
Originally Posted by Abster2core View Post

So you are telling me that you pick your computer simply and only by its clock speed.

No, he's saying that clock speed, graphics, etc. are all important for some market segments, e.g. "enthusiasts". These enthusiasts use 3DMark and the like, and usually go for the latest, or the mid-range with the best bang-for-buck. So these enthusiasts care about clock speed, number of cores, and graphics cards for their chosen or future screen resolutions.

The "casual" user wouldn't care or understand about clock speed and so on but enthusiasts are more clued-in.
post #211 of 247
Quote:
Originally Posted by melgross View Post

You may, but you may not. It depends on the difference in performance from the beginning, and how long it's kept.

It isn't always an issue of cost. I usually keep video cards (or at least I did when they were easier to get) until the performance drops to the level of just above the middle range of the latest series. We may be able to do that again.

Example:

7900GTX - 2006 High end card. $500.

vs

7600 GT - 2006 Mid grade card. $200.
9600 GT - 2008 Mid grade card. $189 (beats the 7900GTX)

In two years I have a DX10 card that's faster than the 7900 GTX.

Now if you need the frame rates of a $500 card, you need the frame rates. Buying a $200 card just doesn't cut it. But damn few really do... and ponying up the money so you can make the card last 4 years is not cost effective, nor does it give you very good performance after year 2.
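To put rough numbers on the two strategies, here is a quick sketch, using Python as a calculator. The prices and the performance claim are the ones quoted in this post, not independent measurements:

# "Buy high end once" vs. "buy mid-range, replace at the refresh",
# using the card prices quoted above.
high_end_2006 = 500              # 7900 GTX at launch
mid_2006 = 200                   # 7600 GT at launch
mid_2008 = 189                   # 9600 GT at launch
total_mid = mid_2006 + mid_2008
print(total_mid)                  # 389: two mid-range cards over the same span
print(high_end_2006 - total_mid)  # 111: still cheaper than one 7900 GTX
# ...and per the post above, the 9600 GT also outruns the 7900 GTX after year 2.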

Quote:
With computers, it used to be easy to upgrade the CPU to move two generations ahead. With the requirements of the G5's cooling, it became impossible to replace them, and no third party ever tried.

It was never easy or all that cost effective. And you never jumped two generations but at most one and a speed bump. Chip packages changed as often as they do today.

Quote:
With Intel chips, we can never go back to where we were with the G4 and earlier, but we can move current chips off, and use chips from the "tick" of the newer process, gaining a fairly large amount of performance.

Gaining a moderate amount of performance.

Quote:
I know a number of people who did that with their Mac Pros. I'm planning on doing it with my new one in a couple of years, just before the next "tock".

Are you seriously suggesting that upgrading the same-generation Xeon Precision Workstations (as those Mac Pros) from Dell would actually be more cost effective than the strategy of buying a mid-end $1500 Dell Conroe box and then a mid-end $1500 Dell Core i7 box?

Say you wanted to upgrade your Xeon 2.66GHz octo to a 3.0GHz octo today (non-Nehalem). That's $929 from Newegg for one X5450.

You're seriously going to drop nearly $2K for a 0.34GHz upgrade?
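The arithmetic behind that figure, for anyone checking: the octo has two CPU sockets, so the upgrade means two chips at the Newegg price quoted above. A quick sketch:

# Two X5450s to take a 2.66GHz octo to 3.0GHz, at the quoted price.
chips, price_each = 2, 929
print(chips * price_each)                    # 1858, i.e. "nearly $2K"
print(round((3.00 - 2.66) / 2.66 * 100, 1))  # 12.8: percent clock increase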

Quote:
I've only needed Apple's warranty once, long ago, for my 950. Other than that, none of my many machines, at home or at work, has ever failed.

Yes, so warranties are worthless in the equation.
post #212 of 247
CINEBENCH tests

Mini: OpenGL 3246, Single Core Render: 2271, Multi Core Render: 4374
MBP: OpenGL 3118, Single Core Render: 2127, Multi Core Render: 3988

Mini: 2.0GHz Core 2 Duo, 1GB RAM, 9400M
MBP: 2.16GHz Core 2, 2GB RAM, X1600

The new base mini is good enough for light gaming even with no upgrades. The score should go up a little when I put a pair of memory sticks in there and bump it to 4GB.

I should run my 1st gen Mac Pro as well sometime. My old mini is now sitting disconnected on the floor and I'm too lazy to hook it back up. Maybe later...but a Santa Rosa Mini would be more interesting anyway.

-----

CINEBENCH R10
************************************************** **

Tester : NA

Processor : Intel Core 2 Duo
MHz : 2.0
Number of CPUs : 2
Operating System : OS X 32 BIT 10.5.6

Graphics Card : NVIDIA GeForce 9400 OpenGL Engine
Resolution : <fill this out>
Color Depth : <fill this out>

************************************************** **

Rendering (Single CPU): 2271 CB-CPU
Rendering (Multiple CPU): 4374 CB-CPU

Multiprocessor Speedup: 1.93

Shading (OpenGL Standard) : 3246 CB-GFX


************************************************** **

CINEBENCH R10
************************************************** **

Tester : NA

Processor : Intel Core 2
MHz : 2.16
Number of CPUs : 2
Operating System : OS X 32 BIT 10.5.6

Graphics Card : ATI Radeon X1600 OpenGL Engine
Resolution : <fill this out>
Color Depth : <fill this out>

************************************************** **

Rendering (Single CPU): 2127 CB-CPU
Rendering (Multiple CPU): 3988 CB-CPU

Multiprocessor Speedup: 1.87

Shading (OpenGL Standard) : 3118 CB-GFX


************************************************** **
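For what it's worth, Cinebench's "Multiprocessor Speedup" line is just the ratio of the two render scores. A quick check against the numbers above:

# Multiprocessor speedup = multi-core render score / single-core render score.
mini_single, mini_multi = 2271, 4374
mbp_single, mbp_multi = 2127, 3988
print(round(mini_multi / mini_single, 2))  # 1.93, as reported for the mini
print(round(mbp_multi / mbp_single, 2))    # 1.87, as reported for the MBP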
post #213 of 247
Quote:
Originally Posted by copeland View Post

[OPINION]
I think you're right. Apple had to keep even the higher priced Mac mini at 2 GHz in the standard configuration to differentiate the iMac with clockspeed.
Apple won't play the same clockspeed game when it can transition the iMac to Nehalem mobile.
After the iMac gets its update to the new platform, the mini's clockspeed will rise a bit (hopefully even up to 2.5GHz) and will stay there for at least another year.
[/OPINION]

What do you mean by that?

Are you referring to Jobs' denunciation of using clock speed to prove superiority, as he did in the 2001 article "Macs are not about megahertz, says Jobs",* or in 1997 when he introduced the PowerPC G3?

*http://news.zdnet.co.uk/hardware/0,1...2091647,00.htm

Megahertz Myth; Rise of the Myth. http://en.wikipedia.org/wiki/Megahertz_myth
post #214 of 247
Quote:
Originally Posted by Abster2core View Post

What do you mean by that?

Are you referring to Jobs' denunciation of using clock speed to prove superiority, as he did in the 2001 article "Macs are not about megahertz, says Jobs",* or in 1997 when he introduced the PowerPC G3?

*http://news.zdnet.co.uk/hardware/0,1...2091647,00.htm

Megahertz Myth; Rise of the Myth. http://en.wikipedia.org/wiki/Megahertz_myth

He's saying there's little difference between the mini and the low-end iMac. Had they made the top-end mini 2.26GHz it would be even more obvious. However, after the mobile Nehalems arrive, the iMacs go to them while the minis stay Penryn. Then maybe the top-end mini (and maybe even the bottom one) gets a speed bump to 2.26GHz or more. No need to keep the mini capped at a low clock speed.
post #215 of 247
Quote:
Originally Posted by Abster2core View Post

What do you mean by that?

Are you referring to Jobs' denunciation of using clock speed to prove superiority, as he did in the 2001 article "Macs are not about megahertz, says Jobs",* or in 1997 when he introduced the PowerPC G3?

*http://news.zdnet.co.uk/hardware/0,1...2091647,00.htm

Megahertz Myth; Rise of the Myth. http://en.wikipedia.org/wiki/Megahertz_myth

No more iPhone typing for me. Geeeeeez.

While it's true Apple always tried to expose the myth, it wasn't until AMD came around with the Athlon that Intel needed another 1.0GHz to match AMD's benchmark results. That said, it wasn't until Apple switched to Intel that they truly became competitive on a hertz-by-hertz level. Like AMD, Apple had AltiVec, which helped encoding, rendering, speed, plug-ins, and so on, but at the time there was always a faster PC counterpart that could be built at a fraction of the price. Ironically, it's much like today: Apple is falling behind the market, albeit desktops simply aren't selling.
post #216 of 247
I agree. People don't realize the only iMac worth getting is at least the $1,799 model with the dedicated video card in it; the 9400M is a joke. Plus, the previous models could all use 4GB of RAM, while these can only use 3.75GB because the 9400M eats up to 256MB of it. How is that "more bang for the buck"?
You win, I've switched sides.
post #217 of 247
Quote:
Originally Posted by vinea View Post

Example:

7900GTX - 2006 High end card. $500.

vs

7600 GT - 2006 Mid grade card. $200.
9600 GT - 2008 Mid grade card. $189 (beats the 7900GTX)

In two years I have a DX10 card that's faster than the 7900 GTX.

Now if you need the frame rates of a $500 card, you need the frame rates. Buying a $200 card just doesn't cut it. But damn few really do... and ponying up the money so you can make the card last 4 years is not cost effective, nor does it give you very good performance after year 2.

That $200 card doesn't make it. It would be a waste. I upgrade my cards more quickly than my computers.

Quote:
It was never easy or all that cost effective. And you never jumped two generations but at most one and a speed bump. Chip packages changed as often as they do today.

That's not quite true.

Quote:
Gaining a moderate amount of performance.

Not true again.

Going from the current dual 2.66GHz four-core chip configuration I ordered to a dual 3.3GHz six-core chip configuration will give a lot of oomph. The later chips will also do more than two speed bumps in turbo mode. Three at least.

Quote:
Are you seriously suggesting that upgrading the same-generation Xeon Precision Workstations (as those Mac Pros) from Dell would actually be more cost effective than the strategy of buying a mid-end $1500 Dell Conroe box and then a mid-end $1500 Dell Core i7 box?

I have no interest in buying a Windows based home machine. None at all.

Quote:
Say you wanted to upgrade your Xeon 2.66GHz octo to a 3.0GHz octo today (non-Nehalem). That's $929 from Newegg for one X5450.

You're seriously going to drop nearly $2K for a 0.34GHz upgrade?

That's not the upgrade.

Quote:
Yes, so warranties are worthless in the equation.

They're insurance. I would buy AppleCare for a laptop. For well over 200 machines over the years (most for my business, of course), I've only needed the warranty once.
post #218 of 247
Quote:
Originally Posted by hiimamac View Post

No more iPhone typing for me. Geeeeeez.

While it's true Apple always tried to expose the myth, it wasn't until AMD came around with the Athlon that Intel needed another 1.0GHz to match AMD's benchmark results. That said, it wasn't until Apple switched to Intel that they truly became competitive on a hertz-by-hertz level. Like AMD, Apple had AltiVec, which helped encoding, rendering, speed, plug-ins, and so on, but at the time there was always a faster PC counterpart that could be built at a fraction of the price. Ironically, it's much like today: Apple is falling behind the market, albeit desktops simply aren't selling.

The megahertz (I suppose today it's the gigahertz) myth is only true across processor families. It's not true within processor families. Some people do forget that.

But even so, it's only half a myth. Once you know the performance per "Hertz", you can figure out how they compare.
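A toy illustration of that last point, with made-up per-clock figures purely for the arithmetic: effective throughput is roughly work-per-clock times clock speed, so clock alone misleads across families but ranks chips correctly within one family.

# Hypothetical IPC (work-per-clock) numbers, just to show the comparison.
def effective_perf(work_per_clock, ghz):
    return work_per_clock * ghz

print(effective_perf(1.0, 3.0))  # family A at 3.0GHz -> 3.0 units
print(effective_perf(2.0, 2.0))  # family B at 2.0GHz -> 4.0 units: faster despite the lower clock
print(effective_perf(1.0, 2.5))  # family A at 2.5GHz -> 2.5 units: within a family, clock does rank chips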
post #219 of 247
It is strange. If you look at the 24" LED Cinema Display, it actually looks and feels quite solid, and even somewhat bulky in some ways compared to Samsungs and LGs around the 22" and 24" mark.

Maybe Apple, as brilliant as the team is, painted themselves into a corner with the iMac Aluminium and its thinness. Maybe they did not anticipate people like Nvidia struggling to go down to 40nm ~ look at the latest generation of Nvidia's GPUs - GTX260 and 285, etc. Dual slot, big, hot, each drawing 100+Watts at load!... (I know, they're desktop parts and ATI isn't that much better in that area though the RV770 is very impressive)... Maybe they did not anticipate (and actually I don't know what the expectation was) low power quadcore Core 2 laptop CPUs by end of 2008. Maybe. But unlikely.

The thing is, the iMac has always been the most important and highest-selling Mac... until about last year or a bit before that, wherever the turning point was when laptops eclipsed desktops and the MacBook usurped the legendary iMac.

Clearly the Apple iMac requires a revolution in and of itself, with regard to Apple redefining the desktop computer. Evolutionary progress on Mac Mini and iMac, well, I guess a safe bet in these times. US unemployment at 8% currently, IIRC.

Quote:
Originally Posted by JeffDM View Post

I don't know if there is any one particular point. Their notebooks have been getting progressively thinner with every major redesign; maybe the eMate was an exception, I think that was pretty thin.

Since the iMac G5, they've gotten thinner, at least at the edges. The iMac G5 had a thicker panel than the iMac G4, but that was a major shift; I think getting rid of the semi-bowling ball was a good idea, though. Every iPod since the first 10GB model has been thinner too. The iPhone 3G was a tiny bit thicker in the middle than the original, but being thinner at the edges makes it seem thinner.

Quote:
Originally Posted by Bruce Young View Post

(adding: )
In a flight of fancy, I imagine a possible Apple Engineering/Design meeting going like this:

Engineer/Designers: Hey, we have figured out a way to have the next iMac revision use the latest CPU and GPU chips, and still be quiet and cool (temperature) - but it will have to be 1/2-inch thicker to allow for more air flow around the parts.

Steve Somebody: NO! Dammit. It's got to be THIN! Thinner than the last one. That is what is important -- Being Thin!!
I don't care if it has to use last year's chips or this year's low-end chips. It's just got to be THIN!

Quote:
Originally Posted by Bruce Young View Post

The Apple obsession with THIN as a design and marketing goal is really working to their disadvantage, in my opinion.

In the iMac, there is really no user value gained by having the iMac case be thinner in each newer generation. As I look at my own white G5 iMac, I never notice how thick or thin it is. The only usage issue is, say, weight (for when I occasionally move it around) or screen real-estate area.
If they would just drop the stupid obsession with thinness, they could have an iMac with the same height and width for the monitor, just a tad thicker to allow for increased air flow. That would allow more high-end (warmer) parts to be used while still achieving good cooling, as there would be more volume for air to move through without needing wind-tunnel fans.

In the laptops, thin may have some value, until you get to the point where you lose structural stiffness, or start throwing away ports and parts (à la the MacBook Air). But the notebook topic is another thread.

I am not sure when "Thin" became the Apple Mantra.
post #220 of 247
Quote:
Originally Posted by melgross View Post

That $200 card doesn't make it. It would be a waste. I upgrade my cards more quickly than my computers.

Really? Pray tell, what are you doing that requires the 7900GTX, which is completely outclassed by the 9600 GT barely two years later?

Quote:
Going from the current dual 2.66GHz four-core chip configuration I ordered to a dual 3.3GHz six-core chip configuration will give a lot of oomph. The later chips will also do more than two speed bumps in turbo mode. Three at least.

Well, the Gulftowns (desktop chip, not Xeon) will still be LGA 1366, but Intel still isn't sure if they will work with existing X58 boards. That uncertainty carries over to the Xeons. Intel's 6-core Dunningtons are Caneland processors, not Seaburgs.

Beckton is the Nehalem 8-core, and I dunno that it will be compatible with the Gainstowns (doubtful). It seems likely that Intel might do a 6-core Nehalem Xeon even if it hasn't announced one yet. But there's no certainty that they will be clocked very high. The Dunningtons top out at 2.66GHz despite there being 3.4GHz Harpertowns and 2.93GHz Tigertons.

Given the roadmap has Gainstown at 3.2GHz at the top end, I wouldn't expect it to be much more than a Dunnington-like Nehalem... one that may be destined for 4-way-and-up servers, not workstations like the Mac Pro.

So the odds of you upgrading to 6-core 3.3GHz chips in your current Mac Pro seem to be 50-50 at best. Either it will be not much faster than 2.66GHz, or it will be for the other Xeon line, or both, like the Dunningtons... which mostly run in blade servers and other high-density applications.

And in any case it will likely cost you $1600+ per chip on the retail market. Which is what the 3.4 X5492s cost today. You're seriously going to drop $3200 for that upgrade when you can get a new Mac Pro instead?

Even if the top-end Mac Pro is $6K, who on earth thinks that's a good deal? God help you if you manage to munge up the install and break something in the process. You now have a $5K+ doorstop or a $1600 piece of costume jewelry.

http://www.google.com/products/catal...402#ps-sellers

Quote:
I have no interest in buying a Windows based home machine. None at all.

That's not the point. That Apple makes it hard to follow the "buy cheaper and upgrade more often" approach is an issue, but the general strategy is sound.

Quote:
That's not the upgrade.

That's the CURRENT upgrade. If not this, then what did your many friends upgrade from, and what did they upgrade to?
post #221 of 247
Quote:
Originally Posted by nvidia2008 View Post

It is strange. If you look at the 24" LED Cinema Display, it actually looks and feels quite solid, and even somewhat bulky in some ways compared to Samsungs and LGs around the 22" and 24" mark.

Maybe Apple, as brilliant as the team is, painted themselves into a corner with the iMac Aluminium and its thinness. Maybe they did not anticipate people like Nvidia struggling to go down to 40nm ~ look at the latest generation of Nvidia's GPUs - GTX260 and 285, etc. Dual slot, big, hot, each drawing 100+Watts at load!... (I know, they're desktop parts and ATI isn't that much better in that area though the RV770 is very impressive)... Maybe they did not anticipate (and actually I don't know what the expectation was) low power quadcore Core 2 laptop CPUs by end of 2008. Maybe. But unlikely.

The thing is, the iMac has always been the most important and highest-selling Mac... until about last year or a bit before that, wherever the turning point was when laptops eclipsed desktops and the MacBook usurped the legendary iMac.

Clearly the Apple iMac requires a revolution in and of itself, with regard to Apple redefining the desktop computer. Evolutionary progress on Mac Mini and iMac, well, I guess a safe bet in these times. US unemployment at 8% currently, IIRC.

You can be sure that Apple knows of every development its partners are undergoing well before we hear of it.

Why do you think they went from the PPC when the G5 was running much cooler than the Prescott, and IBM was moving it up in speed faster than the Prescott was being ramped up?

Remember how shocked we were when Jobs showed that chart of how Intel's power/performance was going to move in the next several years vs IBM's PPC?

It was almost unbelievable at the time, but well before Intel announced the first Yonah, and the Core chips, Apple knew.

They always know.
post #222 of 247
Quote:
Originally Posted by nvidia2008 View Post

The thing is, the iMac has always been the most important and highest-selling Mac... until about last year or a bit before that, wherever the turning point was when laptops eclipsed desktops and the MacBook usurped the legendary iMac.

Apple's notebook sales really began to dramatically outpace its desktop sales in 2006.

Quote:
Clearly the Apple iMac requires a revolution in and of itself, with regard to Apple redefining the desktop computer. Evolutionary progress on Mac Mini and iMac, well, I guess a safe bet in these times. US unemployment at 8% currently, IIRC.

Unless Apple sticks a rechargeable battery and a handle on the iMac, there's nothing revolutionary to be done to change its future direction.
post #223 of 247
Quote:
Originally Posted by vinea View Post

Really? Pray tell, what are you doing that requires the 7900GTX, which is completely outclassed by the 9600 GT barely two years later?

I use Archicad, among other apps such as Motion with FCS. I imagine that PS will also get a boost after 10.6 comes out.

Quote:
Well, the Gulftowns (desktop chip, not Xeon) will still be LGA 1366, but Intel still isn't sure if they will work with existing X58 boards. That uncertainty carries over to the Xeons. Intel's 6-core Dunningtons are Caneland processors, not Seaburgs.

The Gulftown is a server/desktop chip, and it uses the same LGA-1366 socket. It will work, at least according to AnandTech, who usually gets these things right.

Quote:
Beckton is the Nehalem 8-core, and I dunno that it will be compatible with the Gainstowns (doubtful). It seems likely that Intel might do a 6-core Nehalem Xeon even if it hasn't announced one yet. But there's no certainty that they will be clocked very high. The Dunningtons top out at 2.66GHz despite there being 3.4GHz Harpertowns and 2.93GHz Tigertons.

The Beckton is not a drop-in part, as it uses the LGA-1567 socket. Apple would need a new mobo for that. If Apple is still intending to bring the Mac Pro into higher territory, they may go that way next year, but I don't think so.

Quote:
Given the roadmap has Gainstown at 3.2GHz at the top end, I wouldn't expect it to be much more than a Dunnington-like Nehalem... one that may be destined for 4-way-and-up servers, not workstations like the Mac Pro.

If you're still talking about Beckton here, I agree, though the highest speeds are malleable right now, as Intel's been making excellent progress in ramping up. It's happened much more quickly than any new architecture in recent history, with fewer tapeouts. It's thought that we may see 3.4GHz.

Quote:
So the odds of you upgrading to 6-core 3.3GHz chips in your current Mac Pro seem to be 50-50 at best. Either it will be not much faster than 2.66GHz, or it will be for the other Xeon line, or both, like the Dunningtons... which mostly run in blade servers and other high-density applications.

The odds are actually rather excellent.

Quote:
And in any case it will likely cost you $1600+ per chip on the retail market. Which is what the 3.4 X5492s cost today. You're seriously going to drop $3200 for that upgrade when you can get a new Mac Pro instead?

We'll see about the prices, which haven't been set yet. But I saved $1,200 by sticking with the 2.66 for now, so it's not that much of a stretch for me. If spending another $3,200 rather than another $5,000+ is the choice, you bet!

Quote:
Even if the top-end Mac Pro is $6K, who on earth thinks that's a good deal? God help you if you manage to munge up the install and break something in the process. You now have a $5K+ doorstop or a $1600 piece of costume jewelry.

A good deal is in the wallet of the buyer. I like these machines very much. As I'm retired, I haven't been buying a new machine every three years and updating in between, as I always used to do. But now I've decided to go for it, as I've been waiting for these chips. I'm also doing work I haven't done for a few years.

I've converted a number of Mac Pros to 4 core, and never had a problem, though it's a pain. I know of others who have done it as well. These new machines look MUCH easier to work on.

Quote:
That's not the point. That Apple makes it hard to follow the "buy cheaper and upgrade more often" approach is an issue, but the general strategy is sound.

That's exactly the point, because I'm not talking about buying a cheaper machine; I'm talking about the difference between buying the low-end version of the same model vs. the higher-end version.

So, for me, the difference would be between buying the single-processor 2.26 Mac Pro, or the dual 2.66 Mac Pro, which is what I did order.

It's NEVER a question of going for something else.

Quote:
That's the CURRENT upgrade. If not this, then what did your many friends upgrade from, and what did they upgrade to?

The way you wrote that, it seemed as though you were saying something a bit different. People who bought an earlier model of the Mac Pro, with the older, slower two-core chips, could upgrade to the faster four-core chips, whatever the model numbers. That's quite an upgrade.
post #224 of 247
Quote:
Originally Posted by TenoBell View Post

Apple's notebook sales really began to dramatically outpace its desktop sales in 2006.

Which means maybe they saw the first iMac Aluminium as the last true Boom! of the prominent iMac line.

Quote:
Originally Posted by TenoBell View Post

Unless Apple sticks a rechargeable battery and a handle on the iMac, there's nothing revolutionary to be done to change its future direction.

Well, that's the kind of innovation Apple should be producing. That's the kind of innovation we expect. An all-in-one that has a large screen (20"+) but is also somehow foldable and portable. How about a "desktop" that is actually a unit sitting wirelessly between your TV and a screen you can shift about the house? I mean, these are crazy ideas, but Apple should be finding something that works, and they have the potential to fully revolutionize the desktop computer.

What do we want from a "desktop"? What *is* a desktop? What is a computer? What is a screen? Why would I not want a laptop? This is the kind of R&D that I'm sure is going on, but I wonder whether we will see the results of it in 2009... or maybe R&D at Apple has shifted to other areas.

Quote:
Originally Posted by melgross View Post

You can be sure that Apple knows of every development its partners are undergoing well before we hear of it.

Why do you think they went from the PPC when the G5 was running much cooler than the Prescott, and IBM was moving it up in speed faster than the Prescott was being ramped up?

Remember how shocked we were when Jobs showed that chart of how Intel's power/performance was going to move in the next several years vs IBM's PPC?

It was almost unbelievable at the time, but well before Intel announced the first Yonah, and the Core chips, Apple knew.

They always know.

I was being a bit facetious, perhaps. I was subtly (or not so subtly) implying Apple knew *exactly* WTF was going on at Nvidia, ATI and Intel. Which means Apple *knew* Intel couldn't produce any quad-core (Core 2 or Nehalem) for laptops in the 1st half of 2009. Apple *knew* Nvidia's GTX 200+ and ATI's RV770 would remain big, long, hot desktop cards, and that this tech realistically couldn't be brought down to laptops in the 1st half of 2009.

Which means Apple knew EXACTLY the iMac in its form factor would never be able to be launched in 1st half of 2009 with QuadCore or really strong graphics across the line* ~~~ which seems to be the big expectation ~~~ even expectations of Core i7...!

*The Radeon 4850 is the truly amazing thing; squeezing it into the 24" iMac is the saving grace of the new iMacs, if you will. And this is a desktop Radeon 4850.

post #225 of 247
Quote:
Originally Posted by nvidia2008 View Post

*The Radeon 4850 is the truly amazing thing; squeezing it into the 24" iMac is the saving grace of the new iMacs, if you will. And this is a desktop Radeon 4850.

Might it be a special version that's not the same as the desktop chip? I thought people were saying the old 8800 in the iMac wasn't as fast as the regular desktop chip, or even necessarily the same microarchitecture. I think there were compromises made with an older ATI chip put into iMacs a couple generations ago as well.
post #226 of 247
Quote:
Originally Posted by JeffDM View Post

Might it be a special version that's not the same as the desktop chip? I thought people were saying the old 8800 in the iMac wasn't as fast as the regular desktop chip, or even necessarily the same microarchitecture.

I think the board layout may be different, just like the 8800GS... But the fact that they are calling it a 4850 means it should still be a 4850, probably underclocked, but hopefully not crippled in any other way.

We won't really know until benchmarks are run.

You have raised some interesting points nonetheless.
post #227 of 247
Quote:
Originally Posted by vinea View Post

CINEBENCH tests

Mini: OpenGL 3246, Single Core Render: 2271, Multi Core Render: 4374

On the same mini, but with a 7200 rpm drive and 4GB of RAM, I get in Cinebench 10:

Mini: OpenGL 3869, Single Core Render: 2199, Multi Core Render: 3904

I didn't render from a clean boot, and Safari seems to be using 10-15% CPU, so that might explain the lower CPU scores. The OpenGL result is what I was hoping for. Using matched memory is supposed to help graphics performance the most, as the VRAM is shared on this machine. A 20-25% speedup is typical with matched pairs of RAM.

I guess Apple doesn't want a surplus of worthless 512MB modules after upgrades, so they went with a single 1GB module in the low end, but it looks like it affects graphics performance considerably. They should offer 2GB on the low end and a 4GB upgrade.

The score you ran might also not be that low. You can hit the OpenGL standard button a number of times and average it out. The difference may not be as much as 20%.
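As a rough sanity check, the two OpenGL numbers posted in this exchange land near that range, assuming both runs were otherwise comparable base minis:

# Marvin's matched-4GB mini vs. vinea's stock 1GB mini, OpenGL scores above.
matched, unmatched = 3869, 3246
print(round((matched / unmatched - 1) * 100, 1))  # 19.2: percent faster with matched RAM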

What is interesting is that the 9400M scores lower than the 2400XT in the old iMac. The 2400XT was 15% faster. This would suggest that Apple have downgraded the GPU in their refresh again. This may be for OpenCL compatibility but I reckon the 2400XT should be just as compatible.
post #228 of 247
Quote:
Originally Posted by Marvin View Post

What is interesting is that the 9400M scores lower than the 2400XT in the old iMac. The 2400XT was 15% faster. This would suggest that Apple have downgraded the GPU in their refresh again. This may be for OpenCL compatibility but I reckon the 2400XT should be just as compatible.

Yah, that's a shame, but not unexpected when they went to an integrated solution for the low-end iMacs. This way they also end up with more 9400M buys and a lower cost for everything.

Of course, that doesn't really translate into savings on our part.
post #229 of 247
Quote:
Originally Posted by nvidia2008 View Post

Well, that's the kind of innovation Apple should be producing. That's the kind of innovation we expect. An all-in-one that has a large screen (20"+) but is also somehow foldable and portable. How about a "desktop" that is actually a unit sitting wirelessly between your TV and a screen you can shift about the house? I mean, these are crazy ideas, but Apple should be finding something that works, and they have the potential to fully revolutionize the desktop computer.

What do we want from a "desktop"? What *is* a desktop? What is a computer? What is a screen? Why would I not want a laptop? This is the kind of R&D that I'm sure is going on, but I wonder whether we will see the results of it in 2009... or maybe R&D at Apple has shifted to other areas.



I was being a bit facetious, perhaps. I was subtly (or not so subtly) implying Apple knew *exactly* WTF was going on at Nvidia, ATI and Intel. Which means Apple *knew* Intel couldn't produce any quad-core (Core 2 or Nehalem) for laptops in the 1st half of 2009. Apple *knew* Nvidia's GTX 200+ and ATI's RV770 would remain big, long, hot desktop cards, and that this tech realistically couldn't be brought down to laptops in the 1st half of 2009.

Which means Apple knew EXACTLY the iMac in its form factor would never be able to be launched in 1st half of 2009 with QuadCore or really strong graphics across the line* ~~~ which seems to be the big expectation ~~~ even expectations of Core i7...!

*The Radeon 4850 is the truly amazing thing; squeezing it into the 24" iMac is the saving grace of the new iMacs, if you will. And this is a desktop Radeon 4850.


OK, that's almost the exact opposite feeling I got from your former post. I somehow missed the sarcasm.

I also think it's pretty good to have that option. While some will never be happy, this is a pretty damn good upgrade.

Something else that many don't realize is that ATI's stuff is much better in 3D programs such as Motion, etc., than Nvidia's stuff. Remember that the old, cheap, low-end ATI board for the last Mac Pro does much better in pro apps than the higher-priced Nvidia product. Even after Nvidia saw the bad publicity they were getting from that, and helped Apple improve the crappy drivers they always have, the performance was still notably worse.

The same thing is true for hardware decoding and encoding of video. ATI's results are always better than Nvidia's; this goes back a very long way.
post #230 of 247
Quote:
Originally Posted by JeffDM View Post

Might it be a special version that's not the same as the desktop chip? I thought people were saying the old 8800 in the iMac wasn't as fast as the regular desktop chip, or even necessarily the same microarchitecture. I think there were compromises made with an older ATI chip put into iMacs a couple generations ago as well.

The difference with the Nvidia was just the speed. The chips are the same.

Some of that problem with the Nvidia chips had to do with the now-infamous solder problems Nvidia has been having, and apparently still is. Too much heat makes the Nvidia product fail. We're now seeing this on some of the new 17" MBPs. PC products have been badly hit by this.

ATI isn't having this problem.
post #231 of 247
Quote:
Originally Posted by JeffDM View Post

Might it be a special version that's not the same as the desktop chip? I thought people were saying the old 8800 in the iMac wasn't as fast as the regular desktop chip, or even necessarily the same microarchitecture. I think there were compromises made with an older ATI chip put into iMacs a couple generations ago as well.

The Mobility Radeon 4850, which the iMac almost certainly uses, has the same 800 stream processors as the desktop version. It's just clocked lower, run off a lower voltage.

You're right that the 8800GS in the last iMac had no relation to the desktop 8800GS, though. Nvidia plays that game a lot with their mobile GPUs. That should not be the case with the 4850.
post #232 of 247
Quote:
Originally Posted by melgross View Post

Something else that many don't realize is that ATI's stuff is much better in 3D programs such as Motion, etc., than Nvidia's stuff. Remember that the old, cheap, low-end ATI board for the last Mac Pro does much better in pro apps than the higher-priced Nvidia product. Even after Nvidia saw the bad publicity they were getting from that, and helped Apple improve the crappy drivers they always have, the performance was still notably worse.

The same thing is true for hardware decoding and encoding of video. ATI's results are always better than Nvidia's; this goes back a very long way.

For the 1st gen of the Mac Pro, that's not true.

http://www.kenstone.net/fcp_homepage...s_mac_pro.html

Core Image is not 3D. The 8800 killed the 2600XT in 3D. Core Image uses the shader language for pixel-level effects. This isn't the same as 3D rendering ability. If you were a Motion user, then the 2600XT was better. If you were a Maya user, then the 8800GT was better. When the 8800GT came out, the GLSL linker was broken in Forceware, which is why performance was awesome in 3D apps and poor in Motion (and like 2 games... Second Life and Wurm). It was fixed at some point in Forceware, and likely that's what ended up in the Leopard "graphics patch".

Traditionally nVidia supported OpenGL better than ATI and I've always favored the Forceware over Catalyst (especially during that Catalyst 7.12 fiasco). Amusingly, ATI just borked GLSL support in Catalyst 9 after nVidia got their OpenGL 3.0/GLSL 1.30 house in order. There's a good bit of complaining about how bad the Catalyst 9.1/9.2 drivers are for OpenGL.

But this is once again the "the only pro apps that count are the ones I use" syndrome. nVidia sucks at all Pro apps because it works poorly for Motion and Core Image based pro apps. Mmmmkay.

On the plus side, it does show that the buy cheaper and upgrade more often strategy is superior in this scenario.
post #233 of 247
So am I correct in seeing that all the base-model Macs (iMac, mini, MacBook) now use the exact same NVIDIA GeForce 9400M chip with memory shared -- well, allocated, really -- from main RAM? In an amount of 256MB, except for the hamstrung 1GB base mini, which shifts to a 256MB graphics RAM allocation as soon as you put in a second GB of main RAM.

That (to me) really points out that the base mini configuration is intentionally "dumbed down" with only 1GB of RAM.

As my ancient (and no longer working) iMac G5 (ALS/2005) had a Radeon 9600 with a dedicated 128MB of video RAM, I am wondering how much better (if at all) the shared GeForce 9400M is.
I am trying to figure out what my best option for a Mac replacement is now, given the recent minor speed revisions in the Early 2009 releases. (Yes, the new mini is a big graphics improvement over the old mini, but I am comparing how it would seem to me vis-à-vis my iMac G5.)

Are there any sites that do generalized Mac Video chip comparatives?
The Universe is Intelligent and Friendly
post #234 of 247
A quick run-through of my tests:

Cinebench OpenGL test
MacBook 2ghz 9400M
~4000
iMac 20" 2.4ghz 2400XT
~4900

Xbench
MacBook 2ghz 9400M
Quartz: 157 OpenGL: 138
iMac 20" 2.4ghz 2400XT
Quartz: 170 OpenGL: 180
post #235 of 247
Quote:
Originally Posted by Bruce Young View Post

So am I correct in seeing that all the base-model Macs (iMac, mini, MacBook) now use the exact same NVIDIA GeForce 9400M chip with memory shared -- well, allocated, really -- from main RAM? In an amount of 256MB, except for the hamstrung 1GB base mini, which shifts to a 256MB graphics RAM allocation as soon as you put in a second GB of main RAM.

That (to me) really points out that the base mini configuration is intentionally "dumbed down" with only 1GB of RAM.

As my ancient (and no longer working) iMac G5 (ALS/2005) had a Radeon 9600 with a dedicated 128MB of video RAM, I am wondering how much better (if at all) the shared GeForce 9400M is.
I am trying to figure out what my best option for a Mac replacement is now, given the recent minor speed revisions in the Early 2009 releases. (Yes, the new mini is a big graphics improvement over the old mini, but I am comparing how it would seem to me vis-à-vis my iMac G5.)

Are there any sites that do generalized Mac Video chip comparatives?

Whenever there is 1GB of RAM, the 9400M will use 128MB for VRAM. With 2GB or higher, it will use 256MB of VRAM.
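The allocation rule described above, sketched as code. This is just a summary of the stated behavior, not Apple's actual firmware logic:

# 9400M shared-VRAM allocation, per the rule above (not official Apple logic).
def vram_mb(system_ram_gb):
    return 128 if system_ram_gb <= 1 else 256

print(vram_mb(1), vram_mb(2), vram_mb(4))  # 128 256 256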

The 9400M is definitely slower than the 2400XT.
My tests:

Cinebench OpenGL
MacBook 2ghz 9400M
~4000
iMac 20" 2.4ghz 2400XT
~4900

Xbench
MacBook 2ghz 9400M
Quartz: 157 OpenGL: 138
iMac 20" 2.4ghz 2400XT
Quartz: 170 OpenGL: 180

The iMac Alu 20" 2400XT, or anything with the 9400M, will probably be double the speed of your ATI 9600.

Go for the previous-generation iMac Alu (iMac 20" 2400XT, 128MB VRAM). You can play most modern games at medium settings at 1680x1050 (without antialiasing), with overall reasonable application performance (iLife, iWork, Mac OS X Leopard, perhaps Aperture; maybe even Final Cut Studio 2 if you are starting out trying it).
post #236 of 247
Quote:
Originally Posted by nvidia2008 View Post


Xbench
MacBook 2ghz 9400M
Quartz: 157 OpenGL: 138
iMac 20" 2.4ghz 2400XT
Quartz: 170 OpenGL: 180

My XBench scores on the 9400M were dismal. I don't recall what they were but they were lower than the GMA X3100 scores I saw posted.
post #237 of 247
Quote:
Originally Posted by nvidia2008 View Post

Whenever there is 1GB of RAM, the 9400M will use 128MB for VRAM. With 2GB or higher, it will use 256MB of VRAM.

The 9400M is definitely slower than the 2400XT.

I wouldn't say it is slower based on benchmarks like Xbench. Actual gameplay tests are what matters. Sometimes, and I'm not saying that will happen here, they tell a different story.
post #238 of 247
Quote:
Originally Posted by FuturePastNow View Post

I wouldn't say it is slower based on benchmarks like Xbench. Actual gameplay tests are what matters. Sometimes, and I'm not saying that will happen here, they tell a different story.

Agreed. Ideally I would have liked to bench games in XP like UT3, Left 4 Dead, etc., and Mac games like C&C3, NFS: Carbon, etc.

I'm fairly confident (maybe call it a gut feeling) the 2400XT in the iMac will still perform better (15%?) than the 9400M. Being a discrete card with dedicated VRAM would give it the edge, particularly once you start talking about DirectX 9.0c... Again, gut feeling here.

For me the point is moot. Based on Marvin's tests on the Mac mini, the 9400M is impressive as an integrated chip, but if I go "back" to PC gaming (from not really gaming nowadays), it is just not an option.

Maybe I'm just trying to find excuses not to blow $1,000+ on any new Macs!
post #239 of 247
Quote:
Originally Posted by FuturePastNow View Post

The Mobility Radeon 4850, which the iMac almost certainly uses, has the same 800 stream processors as the desktop version. It's just clocked lower, run off a lower voltage.

You're right that the 8800GS in the last iMac had no relation to the desktop 8800GS, though. Nvidia plays that game a lot with their mobile GPUs. That should not be the case with the 4850.

Interestingly, on Apple's spec pages it is *not* called "Mobility", just Radeon 4850. Is ATI playing Nvidia-style naming trickery? Or Apple?

The GT120 (rebranded 9500GT) and GT130 are listed as desktop cards. I wonder how crippled/underclocked they are (if at all) in the iMacs.
post #240 of 247
Quote:
Originally Posted by vinea View Post

For the 1st gen of the Mac Pro, that's not true.

http://www.kenstone.net/fcp_homepage...s_mac_pro.html

Core Image is not 3D. The 8800 killed the 2600XT in 3D. Core Image uses the shader language for pixel-level effects. This isn't the same as 3D rendering ability. If you were a Motion user, then the 2600XT was better. If you were a Maya user, then the 8800GT was better. When the 8800GT came out, the GLSL linker was broken in Forceware, which is why performance was awesome in 3D apps and poor in Motion (and like 2 games... Second Life and Wurm). It was fixed at some point in Forceware, and likely that's what ended up in the Leopard "graphics patch".

Traditionally nVidia supported OpenGL better than ATI and I've always favored the Forceware over Catalyst (especially during that Catalyst 7.12 fiasco). Amusingly, ATI just borked GLSL support in Catalyst 9 after nVidia got their OpenGL 3.0/GLSL 1.30 house in order. There's a good bit of complaining about how bad the Catalyst 9.1/9.2 drivers are for OpenGL.

But this is once again the "the only pro apps that count are the ones I use" syndrome. nVidia sucks at all Pro apps because it works poorly for Motion and Core Image based pro apps. Mmmmkay.

On the plus side, it does show that the buy cheaper and upgrade more often strategy is superior in this scenario.

It works faster in a number of apps besides Motion. From my own experience, it's faster in Archicad as well.

Since many pro apps use Core Image, yes, poor performance there is important. And as many 3D pro apps use Core Image, it's important for them.

Even after the patch, it was still no better. Worse in some tests still. Not a good buy for pro users.

We have one set of tests here:

http://www.barefeats.com/harper10.html

The later ones here.

http://www.barefeats.com/imp04.html

As Barefeats said, it would be better to buy the 3870 than the 8800GT for faster performance, and the 2600 Pro would still be equal.

A more modern ATI card such as the 4870 is still much better than the current best Nvidia card for this purpose, and far better than a mid-range Nvidia.