MHz/GHz myth... why doesn't Apple do this?
I was recently reading some stuff about AMD's Athlon XP 3000+ chip... by the time I got to the "this chip runs at around 2 GHz..." part, I had to reread it. I know that the chip's actual GHz doesn't mean a whole lot, since this bad boy can compete with Intel's 3.06 GHz chip, and being a Mac guy, I should know better. Still, I had to read that line two or three times.
So why hasn't Apple done this? Say, like, a PowerMac 2000+, as in: this chip runs at 1 GHz but will outperform a competitor's 2 GHz computer?
What do you think?
Comments
<strong>so why hasn't apple done this? say, like, PowerMac2000+, as in this chip runs at 1ghz but will out perform competitor's 2ghz computer?
what do you think?</strong><hr></blockquote>
Because they'd get crushed right now?
If you pick things that really benefit from AltiVec, you could put up a good fight. If you pick double precision floating point, the G4 gets crushed. If you pick int, you also get blown past.
So the "performance" number would be lies, damned lies, or statistics, depending on how you did it. It has to pass the laugh test: the counter-benchmarks will be non-AltiVec and unable to use a second processor, so extremely unfavorable.
Now, if you _do_ have a solid performer (like the PowerPC 970 from IBM seems to be), it's time to roll out the demos... But if you start putting 'performance numbers' on your chips, they sort of need to stick with them. Think how silly AMD would seem with an "Athlon 4200+" while Intel sold Pentiums @ 7.6 GHz. And there's nothing saying that PPC chip development will suddenly outpace Intel's.
<strong>What's worse is how the new Athlon XP 3000+ is actually a 2167 MHz chip w/512K L2 cache while the old 2800+ is a 2250 MHz chip w/256K L2 cache. Now guess which one's faster for most tasks...yup, the chip with the lower Quantispeed rating. AMD's treading on dangerous ground. I hope they trip, fall and die on it...</strong><hr></blockquote>
So the Thoroughbred core's 2800+ (@ 2.25 GHz) is faster on most tasks than the Barton core's 3000+ (@ 2.17 GHz)?
<strong>
so the Thoroughbred core's 2800+ (@ 2.25Ghz) is faster on most tasks than the Barton core's 3000+ (@ 2.16Ghz)?</strong><hr></blockquote>
Pretty much... AMD's basically trying to tell you that an extra 256K of L2 cache is equal to 300 magical/fake MHz.
Thoroughbred 2700+ = 2167 MHz
Barton 3000+ = 2167 MHz
The only difference is the amount of L2 cache.
1.42 GHz times two equals 2.84.
Then two times 2 MB of cache: that's 8 × 256 KB, which equals 8 × 300 minus 300 (since the low-rated AMD already had 256 KB of cache), which equals 2.1.
Total: 2.84 plus 2.1 equals 4.94. Or, to be more "fair": two times 2.47. No sorcery, just sleight of hand, and that's not even counting the G4's per-GHz advantage over AMD/Intel.
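For fun, the back-of-the-envelope rating above can be sketched in a few lines of Python. The 300-"MHz"-per-256-KB exchange rate and the cache counting follow the post's own (admittedly loose) math; none of this is a real AMD formula.

```python
# Joke "Quantispeed-style" rating, using the post's assumption that each
# extra 256 KB of cache is worth 300 "MHz". Purely illustrative; this is
# not AMD's actual rating methodology.

BASELINE_CACHE_KB = 256   # the low-rated Athlon already had 256 KB
MHZ_PER_256KB = 300       # the post's implied cache-to-"MHz" exchange rate

def fake_rating_mhz(clock_mhz, n_cpus, total_cache_kb):
    """Clock times CPU count, plus 300 'MHz' per extra 256 KB cache block."""
    clock_part = clock_mhz * n_cpus
    extra_blocks = total_cache_kb // 256 - BASELINE_CACHE_KB // 256
    return clock_part + extra_blocks * MHZ_PER_256KB

# Dual 1.42 GHz G4, counting 2 MB (8 x 256 KB) of cache as the post does:
rating = fake_rating_mhz(1420, 2, 2048)
print(rating)   # 2840 + 7 * 300 = 4940, i.e. the "4.94" above
```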
Who needs the 970?
<strong>
Who needs 970 </strong><hr></blockquote>
that's what i'm saying
j/k
<strong>The funniest part about AMD's Quantispeed is that it's not defeating the MHz Myth at all. The only thing those inflated numbers are doing is justifying MHz or "fake MHz" as a speed rating. What's worse is how the new Athlon XP 3000+ is actually a 2167 MHz chip w/512K L2 cache while the old 2800+ is a 2250 MHz chip w/256K L2 cache. Now guess which one's faster for most tasks...yup, the chip with the lower Quantispeed rating. AMD's treading on dangerous ground. I hope they trip, fall and die on it...</strong><hr></blockquote>
You're crazy. I like AMD; if I was building my own comp, you can be assured I would be using that instead of an Intel.
Even for AMD/Intel this scoring is hard, as the difference between FPU and integer calculations gives different ratios, the bus implementation affects some things but not all, and so on.
For Apple, which has totally different CPUs, a totally different bus design and a totally different operating system, the scatter will be much higher.
The proof of the pudding is in the eating, not in the naming.
If Apple introduced the "Apple 4.7 MHz CPU" and that 4.7 MHz CPU beat the crap out of the Intel P4 3 GHz in almost every application from Acrobat, SETI and Excel to UT 2003, everybody would know it, and Apple would brag about it regardless of the clock speed. Revving a 2-liter 4-cylinder engine up to 6000 rpm does not make it more powerful than an 8-liter V12 at 3000 rpm. The problem with computers is that while "rpm" is easy to define, "speed" is far more elusive.
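The engine analogy boils down to performance = clock speed × work done per clock. A trivial sketch, with made-up numbers that are illustrative assumptions rather than measurements of any real CPU:

```python
# performance ~ clock speed x work per clock ("rpm" x "displacement").
# The figures below are invented for illustration only; they do not
# describe any actual G4, 970, or Pentium.

def throughput(clock_ghz, work_per_clock):
    return clock_ghz * work_per_clock

big_v12 = throughput(1.4, 2.0)     # low clock, lots of work per cycle
screamer = throughput(3.0, 0.8)    # high clock, little work per cycle

# The lower-clocked design can still come out ahead:
print(big_v12 > screamer)   # True
```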
In the future, when we can compare a 1.4 GHz G4 with a 1.4 GHz 970, I hope the 970 will be way better overall, but even with the same OS and other hardware, the differences intrinsic to the CPUs and bus designs will give the 970 very different speed advantages in different tasks.
Remember the 604e? I think the claim was that one G3 MHz was about equal to 1.5 MHz of a 604e. But for FPU calculations the 604e scored better than the G3, MHz for MHz. To make things even more complicated, the old 604's (not the e with 512K L2) performance was strongly affected by L2 size, so a 233 MHz 604 with a 1 MB L2 was probably faster than both a 233 MHz 604e and a G3 at 233 MHz. And this is nothing compared to getting into comparing Macs and PCs.
For Apple to market a "4G" computer running 2x1.8 GHz would be pathetic.
The really classy way would be to do a British understatement: call it something like "1.8 dual" and then say "we are rather satisfied with its performance; you might be as well." Perhaps show some tests where they beat the crap out of everything, and then simply suggest that users try out their own favorite application...
I do not know how alien it would be for the American market to act humble from a superior position, but I am ready for a change. If I see another keynote where SJ demos Photoshop filters, I will turn it off; I have had it with his RDF, where reality gets FUBAR.
[quote]Originally posted by Wagnerite:
<strong>I recently was reading some stuff about AMD's Athlon XP 3000+ chip... by the time i got to the " this chip runs at around 2Ghz.." i had to reread it. I know that the actual Ghz of the chip doesn't have a whole lot since this bad boy can compete with Intel's 3.06Ghz chip. and being a mac guy, i should know better. still, i had to read that line two or three times
so why hasn't apple done this? say, like, PowerMac2000+, as in this chip runs at 1ghz but will out perform competitor's 2ghz computer?
what do you think?</strong><hr></blockquote>
AMD's 'Quantaspeed' rating, or whatever it's called, does NOT refer to an equivalent Intel chip. It refers to previous versions of an AMD chip.
So, a 3000+ AMD /= 3.06 GHz Intel; a 3000+ AMD = 3 GHz Athlon (I believe it's the Athlon.)
<strong>AMD's treading on dangerous ground. I hope they trip, fall and die on it...</strong><hr></blockquote>
Yes, they are, but I hope they succeed.
The only thing they did wrong in recent times is prod and poke an 800-lb gorilla into a marketing and engineering frenzy. They couldn't finish what they started. They beat Intel to the GHz punch, but couldn't sustain it. Now they have to do Marketing with Voodoo Numbers.
But if they wilt and die, Intel can go back to milking the industry without pushing their designs. This might let IBM catch them (let's leave Mot out of this conversation), which looks good for Apple users. Our hardware could start to catch up.
However, I like the competition. That's why American cars got better in the 1980s: they had to. AMD pushing Intel, which leads to Apple pushing IBM/Mot, is A Good Thing in my eyes.
Besides, my motherboard uses AMD.
At least they won't have to explain the huge MHz gap.
Dobby.
<strong>AMD's 'Quantaspeed' rating, or whatever it's called, does NOT refer to an equivalent Intel chip. It refers to previous versions of an AMD chip.
So, a 3000+ AMD /= 3.06 GHZ Intel, 3000+ AMD = 3 GHZ Athlon (I believe it's the Athlon.)</strong><hr></blockquote>
This is an outright lie, since even the old Thunderbird chips were faster per clock than the Athlon XPs in many tasks. AMD, like Intel, lengthened its pipeline with the move to the Athlon XP and took a performance hit for it.
Nice to see you're not falling for AMD's propaganda.
<strong>
your crazy, i like AMD, if i was building my own comp u can be assured i would be using that instead of an intel</strong><hr></blockquote>
I'm not crazy. I built my PC with an Intel chip for a reason. From 2.4 GHz up, AMD can't match Intel in price/performance. What you'll see is the Athlon keeping pace in a few tests, but falling behind miserably in others, such as those that are SSE2 optimized.
The OEMs seem to agree since no major OEM other than HP is shipping Athlon systems.
<strong>
The OEMs seem to agree since no major OEM other than HP is shipping Athlon systems.</strong><hr></blockquote>
AMD's had other problems besides the late performance gap, such as relatively poor reliability. They barely had any OEMs buying their CPUs when they were ahead, speed-wise. I note that Dell is only looking at them seriously now because they're actually offering something that Intel isn't (x86-64).
One problem with making your product "just like" the other guy's product is that you'll be much harder pressed to come up with a reason why anyone should buy your product.
Back on thread topic, I am utterly against Apple using anything other than the native characteristics of whatever processor they decide to use. In fact, rather than hoping that Apple introduces more marketing-speak into the discussion, I'm hoping that the 970 and its supporting architecture will speak well enough for itself that Apple can put the (absurdly overblown) marketing rhetoric they already employ to pasture. As it is, I can't read Apple's product pages without wondering about the copywriters' sanity.
<strong>
I note that Dell is only looking at them seriously now because they're actually offering something that Intel isn't (x86-64).</strong><hr></blockquote>
As long as Intel stays on schedule, I doubt they have much to worry about. 5 GHz next year... 10 GHz by 2006. 32-bit desktop computing will be here for a while. I don't think 64-bit computing will take word processing, gaming, web browsing, e-mail, etc. to the next level.
<strong>
This is an outright lie...</strong><hr></blockquote>
I can't tell if you're calling me a liar or AMD....
<strong>
I can't tell if you're calling me a liar or AMD....</strong><hr></blockquote>
Since your post did not question its validity, it doesn't matter. You lied. AMD lied. Don't play stupid now...It doesn't matter if you knew whether it was a lie or not.
[ 02-19-2003: Message edited by: Eugene ]