Apple's claims about M1 Mac speed 'shocking,' but 'extremely plausible'

Posted in General Discussion, edited November 2020
Comparing Apple's claims for the Apple Silicon M1 chip to what it has already achieved in the A-series, AnandTech says it "believes" the quoted speed results.

Apple's new M1 processor

During Apple's November 10 event launching new Macs with Apple Silicon M1 processors, the company made many claims about the chip's performance and low-power capabilities. Like AppleInsider, tech deep-dive site AnandTech is waiting to test the new machines, but it concludes that it "certainly believes" Apple's claims.

"The new Apple Silicon is both shocking, but also very much expected," says AnandTech in a detailed technical review of what information is available. "Intel has stagnated itself out of the market, and has lost a major customer today. AMD has shown lots of progress lately, however it'll be incredibly hard to catch up to Apple's power efficiency."

"If Apple's performance trajectory continues at this pace," it continues, "the x86 performance crown might never be regained."

Full chip performance, Intel vs Apple.

M1 should be up a bit higher. Come on @nuvia_inc, pin the tail on the chart https://t.co/TuFycIQi1Y pic.twitter.com/2LNOFneZEc

-- @IanCutress


This conclusion is also based on direct testing of Apple's A-series processors. "What we do know is that in the mobile space, Apple is absolutely leading the pack in terms of performance and power efficiency," it says.

AnandTech does note, however, that since it last tested the Apple A12Z, "we've seen more significant jumps from both AMD and Intel." It is also critical of Apple's choice of statistics and data shown in the presentation.

"Apple's comparison of random performance points is to be criticized," it says. "However the 10W measurement point where Apple claims 2.5x the performance does make some sense, as this is the nominal TDP [Thermal Design Power] of the chips used in the Intel-based MacBook Air."
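The point about comparing at a fixed power level can be made concrete with a toy performance-per-watt calculation. All numbers below are illustrative placeholders, not measured benchmark results; only the 2.5x-at-10W ratio comes from Apple's claim.

```python
# Toy performance-per-watt comparison at a fixed 10 W power budget.
# The scores are made-up placeholders; only the 2.5x ratio is from Apple's claim.

def perf_per_watt(score: float, watts: float) -> float:
    """Benchmark score divided by power draw."""
    return score / watts

# Hypothetical scores at the 10 W operating point Apple cited.
intel_score_at_10w = 100.0   # assumed baseline for the Intel MacBook Air chip
m1_score_at_10w = 250.0      # Apple's claimed 2.5x at the same power level

ratio = perf_per_watt(m1_score_at_10w, 10) / perf_per_watt(intel_score_at_10w, 10)
print(f"Claimed advantage at 10 W: {ratio:.1f}x")  # prints "Claimed advantage at 10 W: 2.5x"
```

The takeaway is that a fixed-power comparison, unlike Apple's unlabeled charts, at least holds one variable constant.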

Apple's M1 is the first of its Apple Silicon processors, and marks the beginning of the company's move away from using Intel in its Macs. Apple has started its transition to exclusively using its own processors by bringing out a new M1-based Mac mini, MacBook Air, and 13-inch MacBook Pro.

Comments

  • Reply 1 of 84
    The future of the Ipad pro is bleak.  Prices are higher.  And they still use the Ios base ....
  • Reply 2 of 84
    Don’t overlook the unified memory architecture that Apple can deploy, (as they own the whole stack) this will save 2x on a lot of common functions! 
  • Reply 3 of 84
    red oak Posts: 1,088 member
    Fun seeing washed up “ consultants” out there trying to push back the sea of Apple Silicon performance that is going to wash over x86
  • Reply 4 of 84
    Pjs said:
    The future of the Ipad pro is bleak.  Prices are higher.  And they still use the Ios base ....
    Nah.  The A14X coming in next year's iPad Pro will be killer for a tablet. You can’t use the MacBook Air or Pro as a tablet. If Apple can figure out how to slim down the Magic Keyboard they can also beat the Air on weight.

    I’m debating getting a M1 MacBook Pro but during corona virus, I’m pretty much living on my iPad Pro. My 2018 13” MacBook Pro has been barely turned on.
  • Reply 5 of 84
    Pjs said:
    The future of the Ipad pro is bleak.  Prices are higher.  And they still use the Ios base ....
    Thank goodness iPad Pros "still" use iOS (or iPadOS, to be more specific). That's precisely why people like me choose the platform. 
  • Reply 6 of 84
    lkrupp Posts: 10,557 member
    The spec monkeys are waiting for proof in benchmark tests but are those tests even available yet? Are there tests that will run on the M1 and Big Sur and how will they compare to established X86 tests? Will they be fair comparisons?
  • Reply 7 of 84
    I'm dubious of their claims. But i'm all for it if its true. I'll probably wait for M2 tho as I dont need anything right now.

    One thought I had....

    if these chips are so amazing.... why are they not making servers? using them in data centres, thats where power per watt really matters isnt it?
  • Reply 8 of 84
    GG1 Posts: 483 member
    The graph below in the AnandTech article made it clear to me why Apple released their own M1 SoC when they did. Sure, it's only one benchmark, but the trend is unmistakable.

    Edit for grammar.
    edited November 2020
  • Reply 9 of 84
    flydog Posts: 1,123 member
    kkqd1337 said:
    I'm dubious of their claims. But i'm all for it if its true. I'll probably wait for M2 tho as I dont need anything right now.

    One thought I had....

    if these chips are so amazing.... why are they not making servers? using them in data centres, thats where power per watt really matters isnt it?
    You're dubious of Apple's claims?  So your position until you see evidence is that Apple is committing fraud by misrepresenting the capability of its hardware. 
  • Reply 10 of 84
    flydog said:
    kkqd1337 said:
    I'm dubious of their claims. But i'm all for it if its true. I'll probably wait for M2 tho as I dont need anything right now.

    One thought I had....

    if these chips are so amazing.... why are they not making servers? using them in data centres, thats where power per watt really matters isnt it?
    You're dubious of Apple's claims?  So your position until you see evidence is that Apple is committing fraud by misrepresenting the capability of its hardware. 
    yea. you hit the nail on the head.

    whats the best selling windows computer its meant to be so much better than? some $200 education machine.
  • Reply 11 of 84
    flydog Posts: 1,123 member

    Pjs said:
    The future of the Ipad pro is bleak.  Prices are higher.  And they still use the Ios base ....
    Wrong.  The original iPad was $499 (nearly $600 in today's dollars), compared to $329 today.  That's almost half the price for a device that is vastly more capable.


  • Reply 12 of 84
    JinTech Posts: 1,022 member
    And yet we still don't know the clock speed of these processors. I know it doesn't matter but I am so curious.
  • Reply 13 of 84
    Comparing mobile chips to the likes of 10900K...that is extremely laughable.
  • Reply 14 of 84
    h4y3s said:
    Don’t overlook the unified memory architecture that Apple can deploy, (as they own the whole stack) this will save 2x on a lot of common functions! 
    Unified Memory Architecture is not a new idea. It was created by Nvidia years ago and several companies use it for their own products. It has its own set of advantages and tradeoffs. On some of the more technical blogs, folks are already debating the virtues of UMA versus segregated CPU/GPU memory as the thinking - and experience for those who have it - is that you can run into problems if the GPU exhausts memory resources that the CPU needs and vice versa. One of the reasons why Nvidia created UMA in the first place was that they are promoting the idea to data centers that you get more performance per dollar by shifting as many computations from the CPU to the GPU as possible for workloads that don't require a CPU's general purpose flexibility.

    So UMA will work for Apple Silicon for workloads that are both CPU and GPU intensive (right now this is handled by giving, say, 16 GB of RAM to the CPU and 4 GB of RAM to the GPU, as is done for the 16-inch MacBook Pro) only if enough RAM is provided. Of course, Apple knows this. And they know that users of, say, video editing or 3D animation - examples of heavy peak simultaneous CPU/GPU workloads - are going to be willing to pay a lot for that RAM.
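The tradeoff described above can be sketched as a toy memory-budget model. The pool sizes and workload numbers here are illustrative assumptions, not how macOS or any real driver actually allocates memory: with a split design a workload fails if either side overflows its fixed pool, while a unified pool only requires the combined total to fit.

```python
# Toy model of unified vs. split CPU/GPU memory budgets.
# Pool sizes and workload figures are illustrative; real allocators are far more nuanced.

def fits_split(cpu_need_gb, gpu_need_gb, cpu_pool_gb=16, gpu_pool_gb=4):
    """Split design: each side must fit within its own fixed pool."""
    return cpu_need_gb <= cpu_pool_gb and gpu_need_gb <= gpu_pool_gb

def fits_unified(cpu_need_gb, gpu_need_gb, pool_gb=16):
    """Unified design: only the combined total must fit in the shared pool."""
    return cpu_need_gb + gpu_need_gb <= pool_gb

# A GPU-heavy workload: 6 GB of GPU data overflows a fixed 4 GB pool,
# but fits comfortably when CPU and GPU share one 16 GB pool.
print(fits_split(8, 6))    # False
print(fits_unified(8, 6))  # True

# The flip side raised above: a hungry GPU can starve the CPU in a unified design.
print(fits_unified(14, 6)) # False
```

The last case is the contention problem the comment mentions: a shared pool removes the hard partition but also removes the guarantee that each side has reserved headroom.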
  • Reply 15 of 84
    Pjs Posts: 9 member
    flydog said:

    Pjs said:
    The future of the Ipad pro is bleak.  Prices are higher.  And they still use the Ios base ....
    Wrong.  The original iPad was $499 (nearly $600 in today's dollars), compared to $329 today.  That's almost half the price for a device that is vastly more capable.

    And Ipad Pro??

  • Reply 16 of 84
    mpantone Posts: 2,040 member
    cloudguy said:
    h4y3s said:
    Don’t overlook the unified memory architecture that Apple can deploy, (as they own the whole stack) this will save 2x on a lot of common functions! 
    Unified Memory Architecture is not a new idea. It was created by Nvidia years ago and several companies use it for their own products. It has its own set of advantages and tradeoffs. On some of the more technical blogs, folks are already debating the virtues of UMA versus segregated CPU/GPU memory as the thinking - and experience for those who have it - is that you can run into problems if the GPU exhausts memory resources that the CPU needs and vice versa. One of the reasons why Nvidia created UMA in the first place was that they are promoting the idea to data centers that you get more performance per dollar by shifting as many computations from the CPU to the GPU as possible for workloads that don't require a CPU's general purpose flexibility..

    Sorry, SGI implemented UMA with their O2 workstation (circa 1996), years before Nvidia’s implementation. 

    You are right that it is not a new idea.
    edited November 2020
  • Reply 17 of 84
    red oak said:
    Fun seeing washed up “ consultants” out there trying to push back the sea of Apple Silicon performance that is going to wash over x86
    What is "fun" is seeing the same people who spent most of the last 13 years predicting the death of Android - when for much of that time it has had 60%-85% market share - now claiming that Wintel is going to shrivel up and blow away, when Macs accounted for only a 6% market share in 2019 and, in 3Q 2020 - their best quarter in history - only 8%, far behind Windows and even behind the 11% for ChromeOS.

    So even if Apple doubles its market share to 16% - Macs did reach 15% share about 5 or 6 years ago, but only because overall volumes were much lower back then, with enterprises and many users flat out refusing to buy Windows 8 machines - Intel will be fine. If billions of people shifting from Wintel to Android and iOS phones and tablets as their primary computing devices didn't make much of a dent in Wintel - and you can go back to the archives of this very site from 2013 or 2014 claiming that it would, with obituaries being written for the likes of Microsoft, Dell, HP, etc., and much ado being made of IBM shifting to Macs and iPads - then losing Apple's business (which Intel has only had since 2005 in the first place!) won't hurt them much either.

    Intel makes far more money in the server/data center market, where Apple has never had so much as a drop in the bucket, than it does on the 15-20 million CPUs it provides to Apple in any given year - the bulk of those being Core i3 and Core i5 parts that it sells to Apple for only $15-$25 more than they cost to make.

    Seriously, you folks who believe that just because your own lives are centered around your AirPods, Apple Watches, Apple TVs, iPhones, iPads, Macs and Apple services, the world revolves around Apple, really need an adjustment. In reality, Apple products are only a fraction of devices sold in any category except maybe smartwatches (if you limit it to actual smartwatches as opposed to cheap fitness trackers running firmware).

    So if you think that Intel makes more money selling CPUs to Apple than they make selling them to, say, HP just because Apple makes a ton more money than HP does then you are absolutely totally wrong. Intel charges HP the exact same amount for those same Core i3, i7, i9 and Xeon CPUs that they charge Apple. As a result, they make more money off HP because HP sells way more computers than Apple does, even if they make much less money than Apple does in the process. 

    Finally, of course Apple's 5nm M1 is going to outperform Intel's 14nm Core i5. But when Intel's Core i5 is also 5nm in about 3 years (if they hire TSMC to make the chips) or 5 years (if they make the chips themselves)? Then we will see whose performance will wash over whose. Apple will have some advantages, namely the inherent efficiency of RISC vs CISC as well as Apple's strategy of maximizing performance from each core as opposed to Intel's - and everyone else's - strategy of maximizing cores. But Intel also has a performance attribute of their own: the densest core design in the industry. Meaning that a 10nm Intel design is the equivalent of a 7nm Apple one. So when Intel does get to a 5nm design, it will be the equivalent of a 3nm Apple one. So, we shall see ... 
  • Reply 18 of 84
    crowley Posts: 10,453 member
    flydog said:

    Pjs said:
    The future of the Ipad pro is bleak.  Prices are higher.  And they still use the Ios base ....
    Wrong.  The original iPad was $499 (nearly $600 in today's dollars), compared to $329 today.  That's almost half the price for a device that is vastly more capable.
    iPad Pro dude.
  • Reply 19 of 84
    mpantone said:
    cloudguy said:
    h4y3s said:
    Don’t overlook the unified memory architecture that Apple can deploy, (as they own the whole stack) this will save 2x on a lot of common functions! 
    Unified Memory Architecture is not a new idea. It was created by Nvidia years ago and several companies use it for their own products. It has its own set of advantages and tradeoffs. On some of the more technical blogs, folks are already debating the virtues of UMA versus segregated CPU/GPU memory as the thinking - and experience for those who have it - is that you can run into problems if the GPU exhausts memory resources that the CPU needs and vice versa. One of the reasons why Nvidia created UMA in the first place was that they are promoting the idea to data centers that you get more performance per dollar by shifting as many computations from the CPU to the GPU as possible for workloads that don't require a CPU's general purpose flexibility..

    Sorry, SGI implemented UMA with their O2 workstation (circa 1996), years before Nvidia’s implementation. 

    You are right that it is not a new idea.
    You are correct. So it may be that Nvidia took an already existing idea and made it mainstream with a widely used application for it: GPU-based data center and cloud computing. Which is akin to what Apple does, correct? So here, Apple is taking a workstation/server/data center idea and applying it to personal computing. Pretty neat. 

    I wonder if Apple will adopt this approach for their smartphones and tablets? Because right now they are still buying mobile RAM modules for phones and tablets from Samsung if I am correct. Building it into the SOCs would increase performance and save money. (Although perhaps at the cost of increased power consumption and heat generation.)
  • Reply 20 of 84
    radarthekat Posts: 3,842 moderator
    kkqd1337 said:
    flydog said:
    kkqd1337 said:
    I'm dubious of their claims. But i'm all for it if its true. I'll probably wait for M2 tho as I dont need anything right now.

    One thought I had....

    if these chips are so amazing.... why are they not making servers? using them in data centres, thats where power per watt really matters isnt it?
    You're dubious of Apple's claims?  So your position until you see evidence is that Apple is committing fraud by misrepresenting the capability of its hardware. 
    yea. you hit the nail on the head.

    whats the best selling windows computer its meant to be so much better than? some $200 education machine.
    I’ve yet to see a comment that gets Apple’s quote correct.  Apple said, “best selling in its class...”

    So no, they likely weren’t talking about some $200 educational machine.