Will Apple turn to an AMD-style rating scheme?

Posted in Future Apple Hardware, edited January 2014
In response to Intel widening their pipelines to win the processor marketing game, AMD a little while ago introduced their own marketing spin, the "P-Rating", in which they brand an AMD chip based on a comparison with its competing Pentium.



There's no doubt Apple is keeping an eye on this.



With the new G4 (i.e. once the G4 can actually compete with the latest x86 chips), can anyone see Apple turning to such a scheme? (And has anyone found AMD's 'P-Rating' particularly convincing?)



Or will Apple continue hiding under the fuzziness of the "Megahertz Myth"?



Obviously, a lot depends on just how good the next PowerMac is, but I'm curious whether an Apple-style P-rating is something people would like or not...

Comments

  • Reply 1 of 29
    hmurchison Posts: 12,291 member
    They won't need to. The G5 is gonna be a monster!
  • Reply 2 of 29
    eugene Posts: 8,254 member
    AMD can't even keep up with its own ratings, so why bother? PPC performance is more varied anyway. It just gets more confusing when you have different fake numbers for 7450s, 7400s, 750FXs, etc.



    Notice how the Duron line doesn't have performance ratings?



    The main issue is that Intel is pushing ahead anyway. AMD with its fake numbers can't even catch up to Intel's real numbers.
  • Reply 3 of 29
    big mac Posts: 480 member
    Performance should always speak for itself - if Apple has it, then that's what should be used for marketing purposes first and foremost. However, with that said, perhaps Apple should start pushing an industry standard benchmark in place of MHz. While benchmarks are by no means perfect, they certainly are better than letting naive customers surmise that "bigger is better," automatically favoring Intel chips. Anything's better than that, even SPEC. Now if we could only get a desktop version of the Power4 in our machines to be our G5, we would really be in business!



    [ 06-10-2002: Message edited by: Big Mac ]
  • Reply 4 of 29
    cl.nevil
    Why not ditch MHz and just quote gigaflops?
  • Reply 5 of 29
    programmer Posts: 3,409 member
    [quote]Originally posted by cl.nevil:

    Why not ditch MHz and just quote gigaflops?[/quote]



    Gigaflops don't tell the whole story either. Performance is very heavily dependent on the task at hand, and the software implementation. SPECmarks attempt to measure some representative tasks, but only with very limited success and they can't reflect specialized implementations (AltiVec, SSE, etc).
  • Reply 6 of 29
    big mac Posts: 480 member
    It would be nice if the industry could agree on a real-world application test suite run natively on each platform. Using benchmark conditions very similar to those real-world users experience seems like the fairest approach to me. And with such a scheme, anyone with the money to buy the hardware and apps in question could replicate the test results.
  • Reply 7 of 29
    kupan787 Posts: 586 member
    [quote]Originally posted by Big Mac:

    It would be nice if the industry could agree on a real-world application test suite run natively on each platform. Using benchmark conditions very similar to those real-world users experience seems like the fairest approach to me. And with such a scheme, anyone with the money to buy the hardware and apps in question could replicate the test results.[/quote]



    But who's to say what's "real world"? Real-world tasks for one person may be email, office apps, and web surfing. For another it may be all about compile times and how fast QuickTime can convert to MP4. And yet another only uses his machine for rendering in After Effects.



    My point is there is no one "real world" benchmark someone can run, since everyone lives in a different world.



    BTW, what exactly is "real world" supposed to mean? You hear it tossed around a lot, but there's no concrete meaning behind it.



    Ben
  • Reply 8 of 29
    mac voyer Posts: 1,283 member
    Apple can't use gigaflops because they would have to show the gigaflop rating of the PC. They can't use an AMD-style approach because they could still only claim something like 1400+. They couldn't use benchmarks because those never work out in Apple's favor. And real-world tests also show Apple to be behind. Even if they temporarily got ahead of the PC world, it wouldn't last, and they would be right back where they started. The reason Apple can't promote an objective performance test is that they know they would lose. It is no more complicated than that.



    Their current strategy is to try to find enough disgruntled PC users to make a case for the Mac platform as a whole. It is never a good sign when a company or a politician has to resort to negative campaigning. Apple is in full mud-slinging mode. If MS decided to exploit disgruntled Mac users (of which there are more than a few), it could get truly ugly for Apple.
  • Reply 9 of 29
    wrong robot Posts: 3,907 member
    [quote]Originally posted by Hobbes:

    In response to Intel widening their pipelines to win the processor marketing game, AMD a little while ago introduced their own marketing spin, the "P-Rating", in which they brand an AMD chip based on a comparison with its competing Pentium.



    There's no doubt Apple is keeping an eye on this.



    With the new G4 (i.e. once the G4 can actually compete with the latest x86 chips), can anyone see Apple turning to such a scheme? (And has anyone found AMD's 'P-Rating' particularly convincing?)



    Or will Apple continue hiding under the fuzziness of the "Megahertz Myth"?



    Obviously, a lot depends on just how good the next PowerMac is, but I'm curious whether an Apple-style P-rating is something people would like or not...[/quote]



    Well, Apple is definitely showing an interest in appealing to the bulk consumer base (or at least they're trying to).

    Didn't Apple already do this way back at the original intro of the G3?

    Before the G3 there were various processors, this and that. Then, once the Pentium II was at the end of its run and the P3 was coming (if my memory serves me correctly), Apple released the G3. They could have called it the 750 (or whatever the G3's chip number is), but they chose "G3": yes, because it's the 3rd generation of Apple processors, but also, I'm sure, to compete with Pentiums on the dumb-consumer level.



    What I find funny is Intel's mobile Pentium 4-M, as if the M stood for something besides "mobile".
  • Reply 10 of 29
    digital_llama
    I think the whole MHz thing is like the horsepower rating on cars.



    They say car X has 190hp. Car Y has 150hp.



    Car Y has most of that horsepower earlier in the acceleration curve. Car Y can beat car X while accelerating.



    Top speed/power is NOT the most important number. It's how it's used.



    Same with computers. Take a 1GHz Pentium with a 5400rpm hard drive, a 4x CD-ROM, and a slow motherboard, versus an 800MHz Pentium with a 10,000rpm SCSI drive, a 50x CD-ROM, and a top-of-the-line motherboard.



    The 800MHz computer will feel faster because more of its speed can actually be used.



    or something...
  • Reply 11 of 29
    hmurchison Posts: 12,291 member
    [quote]Originally posted by digital_llama:

    I think the whole MHz thing is like the horsepower rating on cars.



    They say car X has 190hp. Car Y has 150hp.



    Car Y has most of that horsepower earlier in the acceleration curve. Car Y can beat car X while accelerating.



    Top speed/power is NOT the most important number. It's how it's used.



    Same with computers. Take a 1GHz Pentium with a 5400rpm hard drive, a 4x CD-ROM, and a slow motherboard, versus an 800MHz Pentium with a 10,000rpm SCSI drive, a 50x CD-ROM, and a top-of-the-line motherboard.



    The 800MHz computer will feel faster because more of its speed can actually be used.



    or something...[/quote]



    I like the car analogy, though it sucks that Honda VTEC engines have to rev to 7,000-8,000 rpm to generate full horsepower. Give me peak horsepower at 5500-6000 rpm and I'm a happy leadfoot.



    Intel gambled on a high-frequency/low-IPC design and has done a good job with it. AMD is going to have to kick butt with the Opteron.
  • Reply 12 of 29
    powerdoc Posts: 8,123 member
    The AMD rating is a comparison between the old generation of Athlon and the new one. It means that an Athlon XP 1800+ is as fast as a hypothetical old Athlon at 1800MHz.

    It does not mean that it has the same speed as a Pentium 4 1800: in fact the Athlon XP is better.
  • Reply 13 of 29
    kidred Posts: 2,402 member
    [quote]Originally posted by powerdoc:

    The AMD rating is a comparison between the old generation of Athlon and the new one. It means that an Athlon XP 1800+ is as fast as a hypothetical old Athlon at 1800MHz.

    It does not mean that it has the same speed as a Pentium 4 1800: in fact the Athlon XP is better.[/quote]



    Wow, I heard the exact opposite. AMD was fighting the same MHz war that Apple is, so AMD decided to name its chips for what they compare to. So the XP 1800+ is only 1.6GHz but is supposedly equal to a P4 1.8GHz.
  • Reply 14 of 29
    hmurchison Posts: 12,291 member
    Kid Red is correct. Athlon XP processors are rated in Pentium 4 equivalency terms. Why else would AMD move to this scheme?
  • Reply 15 of 29
    marcuk Posts: 4,442 member
    I believe powerdoc is right. A lot of the PC websites stated quite clearly that an XP model number is the equivalent of the old Athlon Thunderbird's performance. And they also said that most people misunderstood it as a comparison with P4 clock speed.



    So much disinformation....
  • Reply 16 of 29
    matsu Posts: 6,558 member
    AMD's chips are no longer significantly better than P4 as far as instructions per clock are concerned.



    Take a look at <a href="http://www.anandtech.com/cpu/showdoc.html?i=1635&p=1" target="_blank">this article</a>



    What you find is that even the latest Athlon XP 2200+ gets beaten (narrowly but consistently) by a 2GHz P4. Some of this comes down to more Intel-friendly benchmarks, and some to wider adoption of SSE2 (which actually makes a difference). But look at timed tests from a variety of sources: P4s absolutely destroy Athlons at media encoding tasks. Nothing even comes close to the 2.4 and 2.53GHz P4s, and even the 1.8GHz P4s run neck and neck with the fastest Athlons. Intel has worked that core hard, and the IPC/megahertz myth simply doesn't hold up (at least on the x86 side).



    It might hold up a bit in a G4/P4 comparo, under SIMD-related tests, but the P4 turns over so many cycles, each doing (contrary to anti-Intel myth) a fair amount of work, that nothing comes close to its performance.



    A while ago Fuji was sued over false advertising claims about the true resolution of its SuperCCD sensor. I wonder whether Intel couldn't now sue AMD for false claims about Athlon performance. <img src="graemlins/bugeye.gif" border="0" alt="[Skeptical]" />
  • Reply 17 of 29
    hmurchison Posts: 12,291 member
    [quote]Originally posted by MarcUK:

    I believe powerdoc is right. A lot of the PC websites stated quite clearly that an XP model number is the equivalent of the old Athlon Thunderbird's performance. And they also said that most people misunderstood it as a comparison with P4 clock speed.



    So much disinformation....[/quote]



    That doesn't make sense. The difference between the older Athlons and the XPs is a slightly different core, and that would not lead to discrepancies in the rating system.



    [quote] What do the 2200+, 2100+, 2000+, 1900+, 1800+ and 1700+ numbers mean?

    A13 These are model numbers. AMD identifies the AMD Athlon XP processor using model numbers, as opposed to megahertz, such as the 2200+, 2100+, 2000+, 1900+, 1800+ and 1700+ versions. Model numbers are designed to communicate the relative application performance among the various AMD Athlon XP processors. The AMD Athlon XP processor 2200+ can outperform an Intel Pentium® 4 processor operating at 2.2GHz on a broad array of end-user applications.

    AMD Athlon XP processor 2200+ operates at a frequency of 1.8GHz.

    AMD Athlon XP processor 2100+ operates at a frequency of 1.73GHz.

    AMD Athlon XP processor 2000+ operates at a frequency of 1.67GHz.

    AMD Athlon XP processor 1900+ operates at a frequency of 1.6GHz.

    AMD Athlon XP processor 1800+ operates at a frequency of 1.53GHz.

    AMD Athlon XP processor 1700+ operates at a frequency of 1.47GHz.

    [/quote]



    Taken from <a href="http://athlonxp.amd.com/faq/" target="_blank">AMD FAQ</a>
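    For what it's worth, the FAQ's model-number/frequency pairs can be tabulated to see how consistent the rating actually is. A quick sketch (numbers copied from the excerpt above; the dict name is my own):

```python
# Model number -> actual clock speed in MHz, copied from the AMD FAQ
# excerpt quoted above.
XP_RATINGS = {
    2200: 1800,
    2100: 1730,
    2000: 1670,
    1900: 1600,
    1800: 1530,
    1700: 1470,
}

# How much "extra" does each model number claim over the real clock?
for model, mhz in sorted(XP_RATINGS.items(), reverse=True):
    print(f"Athlon XP {model}+ runs at {mhz} MHz (rating/clock = {model / mhz:.3f})")
```

    Note that the ratio drifts from roughly 1.16 at the bottom of the line to 1.22 at the top, so the rating is not a fixed multiplier on clock speed.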
  • Reply 18 of 29
    g-news Posts: 1,107 member
    Some of you may remember the Cyrix chips back in the mid '90s. Those chips had PR ratings as well, based on the Pentium 1 and MMX back then. Thing is, they didn't keep up with the real thing either; Cyrix failed, and its recent revival by VIA has more or less failed too.



    So it's actually a surprise to me that AMD is getting away with its scheme, probably because they handle it a bit more openly and honestly than Cyrix did back then.



    And I think it would be a very bad idea for Apple to suddenly call its PowerMacs "PowerMac G4 2500" when, in reality, the dual 1.2GHz chips inside could barely keep up with a 1.8GHz P4...



    Apple has already burnt itself badly among PC users with its supercomputer claims; carrying on with the bragging would only hurt it further.



    They should keep up the humble approach they've taken since January. It's better to beat the competition humbly than to brag about it.



    G-News
  • Reply 19 of 29
    eugene Posts: 8,254 member
    They won't get away with it as long as Intel is sprinting ahead with faster and faster chips.



    At this rate, AMD will have a 2500+ (2 GHz) by 2003 and Intel will be at 3 GHz.



    AMD will roll out the Hammer CPUs in the meantime, and those are clocked even lower than that...
  • Reply 20 of 29
    big mac Posts: 480 member
    [quote]Originally posted by kupan787:

    But who's to say what's "real world"? Real-world tasks for one person may be email, office apps, and web surfing. For another it may be all about compile times and how fast QuickTime can convert to MP4. And yet another only uses his machine for rendering in After Effects.



    My point is there is no one "real world" benchmark someone can run, since everyone lives in a different world.



    BTW, what exactly is "real world" supposed to mean? You hear it tossed around a lot, but there's no concrete meaning behind it.



    Ben[/quote]



    I know the term "real world" has varied connotations for different people. However, we all use similar applications and do similar tasks with them. It's easy to think of such a suite as a seriously beefed-up version of Apple's celebrated Photoshop bake-offs. Profile ten of the major cross-platform applications many users work with day in and day out. Run the same application tasks on both sets of machines and record the results. At the end of the tests, average the results from each platform into a single speed mark. It's not rocket science, and it would level the playing field. People would more easily understand that such testing represents true speed differences for the applications they actually work with.



    [ 06-11-2002: Message edited by: Big Mac ]
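    Big Mac's averaging idea can be sketched in a few lines. This is purely illustrative: the application names and timings below are invented, and it uses a geometric mean of per-task ratios rather than the simple average he describes, since a geometric mean keeps any one application from dominating the combined score:

```python
from statistics import geometric_mean

# Hypothetical task times in seconds (lower is better) for the same
# cross-platform applications run natively on each machine.  All of the
# application names and numbers here are made up for illustration.
times = {
    "Photoshop filter suite":  {"mac": 41.0, "pc": 38.0},
    "QuickTime MPEG-4 encode": {"mac": 95.0, "pc": 112.0},
    "Office document batch":   {"mac": 22.0, "pc": 20.0},
}

# Normalize each task to a pc/mac ratio (>1 means the Mac was faster),
# then combine the ratios into one "speed mark".
ratios = [t["pc"] / t["mac"] for t in times.values()]
speed_mark = geometric_mean(ratios)
print(f"Combined speed mark (Mac relative to PC): {speed_mark:.2f}")
```

    With these made-up numbers the two platforms come out nearly even, which is exactly the kind of single-figure comparison Big Mac is proposing instead of raw MHz.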