Benchmarks show that Intel's Alder Lake chips aren't M1 Max killers


Comments

  • Reply 21 of 56
MplsP Posts: 3,925 member
    jkichline said:
    Well if one needs GPU performance - I am running into software for example that simply will not run (bricked) without a beefy GPU...

    from the Macworld article:

      59,774   Apple M1 Max 32 core GPU
    143,594  nVidia 3080 Ti

roughly 2.4x the score (i.e. about 140% faster), presumably not 'within margin of error'

    Even more pronounced seem the desktop options (AMD) with the relatively inexpensive nVidia 3060 outperforming passmark scores for many higher priced cards as well as having 12GB VRAM  www.bestbuy.com/site/evga-nvidia-geforce-rtx-3060-xc-gaming-12gb-gddr6-pci-express-4-0-graphics-card/6454329.p?skuId=6454329

    I understood Apple is working on a 'boost' option which may help, and will presumably also ramp up the power and fan requirements...?
Except that's not the Alder Lake chip. That's the NVIDIA GPU. What's the GPU performance of the Intel graphics on the Alder Lake chip? Does that account for Metal performance boosts?
True, but it's a moot point if you need the performance, since the Mx chips don't support an external/discrete GPU.

For the purposes of this article, we're looking at just the processor, not the graphics, so saying that xxx graphics chip performs better is missing the point. (The quick math on those GPU scores is sketched below.)
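Since the percentage keeps getting mangled, here is the quick check on the two scores quoted above (a throwaway sketch using only the numbers in that quote; the scores themselves are as the commenter reported them):

```python
# GPU benchmark scores as quoted in the post above.
m1_max_32_core = 59_774
rtx_3080_ti = 143_594

ratio = rtx_3080_ti / m1_max_32_core
print(f"{ratio:.2f}x the score, i.e. about {(ratio - 1) * 100:.0f}% faster")
# ~2.40x the score, or ~140% faster (not "240% faster").
```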
  • Reply 22 of 56
    I wish I could think of a reason to justify buying such a machine.
    Unfortunately there are still a few Windows apps (not games) that will not run on ARM. I would love to ditch my Intel MBP and iMac and go all in on Apple silicon, but until recalcitrant app makers port to Apple silicon, or at least ARM, I have to keep my last gen MBP for a while.  
  • Reply 23 of 56
    People who are paying attention to speed claims only might fall for Intel’s marketing…

    They “caught up” in less than a year. Impressive, as long as you leave power consumption out of the equation, which is the EXACT reason Apple ditched Intel. 

Apple's ARM chips are way superior, no doubt. The question is whether anyone, except us geeks, will understand and care.
  • Reply 24 of 56
tht Posts: 5,443 member
    Serious question:

What does an Intel Core i9 do that requires it to be so much less power efficient than an AS M1 Max in the same processing circumstances?

    Presumably there's a reason why it draws so much more current to achieve the same ends? Are there features in it that are not replicated in the M1 Max? 

    I'm assuming the architecture is radically different, but what stops Intel from changing to that architecture?
Power is consumed when a transistor switches from 0 to 1 or 1 to 0. Switching is driven by clock cycles: the more switching, the more power is consumed.
    Well, that is presumably a given. And possibly at a slightly lower level than I was alluding to. More specifically, is there some set of processing or overall design feature that Intel does wrong? Or does it do more 'stuff' that the M1 doesn't do? Is it required to support legacy ways of doing stuff that the M1 is free from? 
    It basically all comes down to economics. There's a lot of hoo-ha about instruction set architecture (RISC vs CISC), but it's not a big deal imo. It comes down to the economics of how many transistors you can have in a chip, what power budgets the OEM is willing to design for, and whether it is profitable in the end.

    First and foremost, Intel's fabrication technology - how small they can make the transistors - was effectively broken for close to 4 years. 2 CEOs and executive teams were sacked because of this. The smaller the transistors, the more transistors you can put in a chip and the less power it will take to power them. Intel was the pre-eminent manufacturer of computer chips for the better part of 40 years, with 75 to 80% marketshare for most of those years. It takes a lot of mistakes for them to lose their fab lead.

    Two things enabled TSMC, Apple's chip manufacturer, to catch and lap Intel in terms of how small a transistor they could make. The smartphone market became the biggest chip market, both in terms of units and money. It's bigger than PCs and servers. This allowed TSMC to make money, a lot of it Apple fronted, and invest in making smaller and smaller transistors. The other thing was Intel fucked up, on both ends. They decided not to get into the smartphone market (they tried when it became obvious, but failed). They then made certain decisions about the design of their 10nm fab that ended up not working and hence a 4 year delay, allowing TSMC to lap them. Even Samsung caught up and lapped them a little.

More transistors mean more performance. Apple's chips have a lot more transistors than Intel's, probably by a factor of 2. If it isn't economical for you to use a lot of transistors, you can instead increase performance by running higher clock rates. Higher clock rates take more power, and the relationship isn't linear: dynamic power scales roughly with voltage squared times frequency, and pushing the clock higher usually means raising the voltage too, so power climbs far faster than performance does. So, Apple's transistors are about 2x smaller than Intel's, there are more of them, and consequently Apple can design its chips to run at relatively lower clock rates while consuming less power.
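To make that clock/power relationship concrete, here is a toy sketch of the standard dynamic-power model (P ≈ activity × capacitance × V² × f). Every number in it is made up purely for illustration and does not describe any real chip; the only assumption carried over from the post is that voltage has to rise along with frequency.

```python
# Toy model of dynamic CPU power: P ~ activity * capacitance * V^2 * f.
# Every number here is illustrative, not a measurement of any real chip.

def dynamic_power_watts(freq_ghz, base_freq_ghz=2.0, base_voltage=0.8,
                        capacitance_nf=30.0, activity=0.2):
    """Rough switching power, assuming voltage must scale with frequency
    once you push past the base clock (a crude DVFS approximation)."""
    voltage = base_voltage * max(1.0, freq_ghz / base_freq_ghz)
    # P = alpha * C * V^2 * f, with capacitance in farads and frequency in hertz.
    return activity * (capacitance_nf * 1e-9) * voltage ** 2 * (freq_ghz * 1e9)

for f in (2.0, 3.0, 4.0, 5.0):
    print(f"{f:.1f} GHz -> ~{dynamic_power_watts(f):.0f} W")
# 2.0 GHz -> ~8 W, 5.0 GHz -> ~120 W: 2.5x the clock costs ~15x the power
# in this toy model, because voltage and frequency rise together.
```

In other words, chasing clock speed is a very expensive way to buy performance compared with spending transistors at a lower clock.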

Intel can theoretically design chips with the same number of transistors as Apple, but the chips would be 2x as large. They would not be profitable doing this. Well, really, they would not enjoy their traditional 60% margins if they did it this way, i.e., not profitable "enough". So smallish chips with higher power consumption is their way. Apple hates high-power-consumption chips and does it the opposite way (big chips, lower power consumption), and you end up with an M1 Pro having about the same performance as an Alder Lake i9-12900H, but with the M1 Pro needing 30 W and the i9 needing 80 to 110 W. And Apple has a 2x to 3x more performant on-chip GPU than Intel has. They could not do this if TSMC hadn't become the leader in chip manufacturing.
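And a back-of-the-envelope check on the wattage figures in the paragraph above (a sketch that treats "about the same performance" as exactly equal, which is a simplification):

```python
# Performance-per-watt ratio using only the wattages quoted above.
m1_pro_watts = 30

for i9_watts in (80, 110):
    print(f"i9 at {i9_watts} W burns ~{i9_watts / m1_pro_watts:.1f}x "
          f"the power of the M1 Pro for roughly the same work")
# ~2.7x to ~3.7x worse performance per watt under these assumptions.
```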

    Intel has plans to regain the fab lead, be able to fab the smallest transistors, but we will see about that. They might, or not.
  • Reply 25 of 56
    So I'll go ahead and start the flame war with the cliche, as far as price goes you are *not* comparing apples-to-apples (every pun intended). The MSI has a 17" screen so let's go ahead and actually compare the price of the MSI Raider with 17" screen as seen here (https://www.bhphotovideo.com/c/product/1639843-REG/msi_ge76_raider_11ug_054_17_3_ge76_raider_gaming.html?SID=s1643237475129c3nda52417) which costs $2485 to the 16" MacBook Pro (https://www.apple.com/shop/buy-mac/macbook-pro/16-inch) which costs $3499. Both have 32GB of RAM and 1TB NVMe.

    So maybe a correction or, god forbid, a little honesty in your reporting. Here come the bullet points:
    * Been using macOS since 1988.
    * Also, the 14" M1 Pro Max MacBook Pro with 32GB Ram and 1TB NVMe drive is $2800. So where are you getting your price data from?
    * I don't mind the Apple Fan-boy stuff. Just try to keep it honest.
    * We all know that the M1 is way more power efficient. Period. No arguments from anybody, not even Intel. That isn't my complaint about this article.
  • Reply 26 of 56
dewme Posts: 5,362 member
    Meh.

    If I were looking to purchase a Windows PC I’d be trying to figure out which Intel or AMD CPU fit my price and performance requirements for the PC apps I need to run. It would make zero sense in this case to even think about any CPU/SoC made by Apple that is exclusive to Apple computers. 

    Once the residual inventory of Intel Macs dries up the same argument applies to Mac purchases. If I can’t buy a Mac with Intel chips or a PC with Apple Silicon chips why would I waste my time comparing these against one another?

    Macs and PCs have been around for long enough that nearly anyone looking to purchase a new computer already knows which species, Mac vs PC, of computer they’re going to buy before they start thinking about chip level details, much less comparing benchmarks. 

    If every software application in existence ran on both Macs and PCs, if both macOS and Windows were identical in functionality and ease of use, and if all PCs were built with the same quality, fit, and finish of Macs, perhaps then I’d care about chips and benchmarks. Until then, it’s like comparing horses versus cows. Buying a cow when you really needed a horse isn’t going to result in a good outcome for you or for the cow. 

    You may say “Even as Mac users we need to care about what Intel is doing.” To which I’d reply; “No we don’t.” Apple may care, but I don’t care. If Apple someday deems that it can never catch up with Intel on the chip front, they will change over to Intel chips on their Macs. At that point I will care about Intel chips. But not a day sooner. 
  • Reply 27 of 56
tht Posts: 5,443 member
    So I'll go ahead and start the flame war with the cliche, as far as price goes you are *not* comparing apples-to-apples (every pun intended). The MSI has a 17" screen so let's go ahead and actually compare the price of the MSI Raider with 17" screen as seen here (https://www.bhphotovideo.com/c/product/1639843-REG/msi_ge76_raider_11ug_054_17_3_ge76_raider_gaming.html?SID=s1643237475129c3nda52417) which costs $2485 to the 16" MacBook Pro (https://www.apple.com/shop/buy-mac/macbook-pro/16-inch) which costs $3499. Both have 32GB of RAM and 1TB NVMe.

    So maybe a correction or, god forbid, a little honesty in your reporting. Here come the bullet points:
    * Been using macOS since 1988.
    * Also, the 14" M1 Pro Max MacBook Pro with 32GB Ram and 1TB NVMe drive is $2800. So where are you getting your price data from?
    * I don't mind the Apple Fan-boy stuff. Just try to keep it honest.
    * We all know that the M1 is way more power efficient. Period. No arguments from anybody, not even Intel. That isn't my complaint about this article.
The machine you listed uses a Core i7-11800H with 8 cores, i.e. the Tiger Lake platform. The AI article is talking about Alder Lake processors, which are heterogeneous MP architectures (big cores and small cores), 12th gen, and which will be available for purchase maybe in a month? Who knows. The review embargo just lifted, so that's why you are seeing articles. The AI article is basically an article about some reviews, and they noted that the Alder Lake machine is $4,000. It will be cheaper in a few months, as the PC world has lots of deals and is pretty cutthroat on margins.

As has been true for a while now, MBP laptops are not gaming laptops. They are basically high-end macOS laptops used for STEM and content creation. I don't think people will be cross-shopping them much. I wouldn't touch the MSI at all. A 1080p display is an automatic no; 4K is a must. If it runs hot, basically a no.
  • Reply 28 of 56
Intel are such a bunch of sore losers; they pulled a similar stunt when they found out how successful the iPhone had become and that it wasn't using Intel chips.
  • Reply 29 of 56
Wait, they need 14 cores to get 3 to 5% over a 10-core CPU. That chip needs a rework.
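For what it's worth, the arithmetic behind that complaint looks like this (a rough sketch that weights every core equally and ignores the performance/efficiency core split on both chips, so treat it as illustrative only; the 4% is just the midpoint of the quoted 3-5%):

```python
# If 14 cores only deliver ~1.04x the multicore score of a 10-core chip,
# the average contribution per core is substantially lower.
intel_cores, apple_cores = 14, 10
multicore_advantage = 1.04  # midpoint of the 3-5% lead mentioned above

per_core_ratio = (multicore_advantage / intel_cores) / (1.0 / apple_cores)
print(f"Average per-core throughput: ~{per_core_ratio:.2f}x of the 10-core chip's")
# ~0.74x, i.e. each core contributes roughly a quarter less on average,
# with the big caveat that both chips mix big and small cores.
```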
  • Reply 30 of 56
    Serious question:

What does an Intel Core i9 do that requires it to be so much less power efficient than an AS M1 Max in the same processing circumstances?

    Presumably there's a reason why it draws so much more current to achieve the same ends? Are there features in it that are not replicated in the M1 Max? 

    I'm assuming the architecture is radically different, but what stops Intel from changing to that architecture?
Nothing stops Intel from licensing ARM's instruction set or architecture. The main problem is that the whole body of Windows software is written for their CISC processors; they cannot just go and change architecture, because those assets would have to be rewritten, recompiled, and re-tested. That's something Apple was able to do because it planned the migration years in advance and prepared a solid version of Rosetta. PC manufacturers do not have the influence to pull off such an act. Think of everything from small drivers to massive applications like Adobe's. Apple worked with Adobe on this for years, as can be seen with the ARM versions of Illustrator and Photoshop that were demoed as far back as 2018.
Intel has also become complacent: they called Apple "a pain in the ass" when Apple demanded more performance per watt; they just wanted to continue down the same "successful" path and keep controlling the majority of the market.
    Apple is now able to produce their own SoC specifically designed for a piece of hardware, they can build custom silicon for every product, from AirPods to AppleCar, something that no other company will be able to do.
  • Reply 31 of 56
    jkichline said:
    Well if one needs GPU performance - I am running into software for example that simply will not run (bricked) without a beefy GPU...

    from the Macworld article:

      59,774   Apple M1 Max 32 core GPU
    143,594  nVidia 3080 Ti

roughly 2.4x the score (i.e. about 140% faster), presumably not 'within margin of error'

    Even more pronounced seem the desktop options (AMD) with the relatively inexpensive nVidia 3060 outperforming passmark scores for many higher priced cards as well as having 12GB VRAM  www.bestbuy.com/site/evga-nvidia-geforce-rtx-3060-xc-gaming-12gb-gddr6-pci-express-4-0-graphics-card/6454329.p?skuId=6454329

    I understood Apple is working on a 'boost' option which may help, and will presumably also ramp up the power and fan requirements...?
Except that's not the Alder Lake chip. That's the NVIDIA GPU. What's the GPU performance of the Intel graphics on the Alder Lake chip? Does that account for Metal performance boosts?
I understood the 3080 Ti as the mobile option and the 3080 as the desktop option. That said, Zimmie's explanation seems helpful, and memory shuttling and a 2TB M.2 for the SSG might make even more sense given the modest M.2 cost...

For what it is worth, I value quiet operation at the expense of performance unless it's needed. And yet, if developers start bricking apps that lack their preferred GPU configuration, is that more troubling than debating top-spec performance, and is it the next great Apple Trojan Horse to 'encourage' hardware sales...?
  • Reply 32 of 56
    So I'll go ahead and start the flame war with the cliche, as far as price goes you are *not* comparing apples-to-apples (every pun intended). The MSI has a 17" screen so let's go ahead and actually compare the price of the MSI Raider with 17" screen as seen here (https://www.bhphotovideo.com/c/product/1639843-REG/msi_ge76_raider_11ug_054_17_3_ge76_raider_gaming.html?SID=s1643237475129c3nda52417) which costs $2485 to the 16" MacBook Pro (https://www.apple.com/shop/buy-mac/macbook-pro/16-inch) which costs $3499. Both have 32GB of RAM and 1TB NVMe.

    So maybe a correction or, god forbid, a little honesty in your reporting. Here come the bullet points:
    * Been using macOS since 1988.
    * Also, the 14" M1 Pro Max MacBook Pro with 32GB Ram and 1TB NVMe drive is $2800. So where are you getting your price data from?
    * I don't mind the Apple Fan-boy stuff. Just try to keep it honest.
    * We all know that the M1 is way more power efficient. Period. No arguments from anybody, not even Intel. That isn't my complaint about this article.
You are comparing apples to oranges. The MSI's 17" screen has far inferior resolution to the 16" MacBook Pro's, which has a 5K screen. Further, with far fewer pixels, the MSI's games need to compute far fewer pixels than the MBP does.
  • Reply 33 of 56
    tht said:
    Serious question:

What does an Intel Core i9 do that requires it to be so much less power efficient than an AS M1 Max in the same processing circumstances?

    Presumably there's a reason why it draws so much more current to achieve the same ends? Are there features in it that are not replicated in the M1 Max? 

    I'm assuming the architecture is radically different, but what stops Intel from changing to that architecture?
Power is consumed when a transistor switches from 0 to 1 or 1 to 0. Switching is driven by clock cycles: the more switching, the more power is consumed.
    Well, that is presumably a given. And possibly at a slightly lower level than I was alluding to. More specifically, is there some set of processing or overall design feature that Intel does wrong? Or does it do more 'stuff' that the M1 doesn't do? Is it required to support legacy ways of doing stuff that the M1 is free from? 
    It basically all comes down to economics. There's a lot of hoo-ha about instruction set architecture (RISC vs CISC), but it's not a big deal imo. It comes down to the economics of how many transistors you can have in a chip, what power budgets the OEM is willing to design for, and whether it is profitable in the end.

    First and foremost, Intel's fabrication technology - how small they can make the transistors - was effectively broken for close to 4 years. 2 CEOs and executive teams were sacked because of this. The smaller the transistors, the more transistors you can put in a chip and the less power it will take to power them. Intel was the pre-eminent manufacturer of computer chips for the better part of 40 years, with 75 to 80% marketshare for most of those years. It takes a lot of mistakes for them to lose their fab lead.

    Two things enabled TSMC, Apple's chip manufacturer, to catch and lap Intel in terms of how small a transistor they could make. The smartphone market became the biggest chip market, both in terms of units and money. It's bigger than PCs and servers. This allowed TSMC to make money, a lot of it Apple fronted, and invest in making smaller and smaller transistors. The other thing was Intel fucked up, on both ends. They decided not to get into the smartphone market (they tried when it became obvious, but failed). They then made certain decisions about the design of their 10nm fab that ended up not working and hence a 4 year delay, allowing TSMC to lap them. Even Samsung caught up and lapped them a little.

More transistors mean more performance. Apple's chips have a lot more transistors than Intel's, probably by a factor of 2. If it isn't economical for you to use a lot of transistors, you can instead increase performance by running higher clock rates. Higher clock rates take more power, and the relationship isn't linear: dynamic power scales roughly with voltage squared times frequency, and pushing the clock higher usually means raising the voltage too, so power climbs far faster than performance does. So, Apple's transistors are about 2x smaller than Intel's, there are more of them, and consequently Apple can design its chips to run at relatively lower clock rates while consuming less power.

Intel can theoretically design chips with the same number of transistors as Apple, but the chips would be 2x as large. They would not be profitable doing this. Well, really, they would not enjoy their traditional 60% margins if they did it this way, i.e., not profitable "enough". So smallish chips with higher power consumption is their way. Apple hates high-power-consumption chips and does it the opposite way (big chips, lower power consumption), and you end up with an M1 Pro having about the same performance as an Alder Lake i9-12900H, but with the M1 Pro needing 30 W and the i9 needing 80 to 110 W. And Apple has a 2x to 3x more performant on-chip GPU than Intel has. They could not do this if TSMC hadn't become the leader in chip manufacturing.

    Intel has plans to regain the fab lead, be able to fab the smallest transistors, but we will see about that. They might, or not.
Intel has lost the PC performance war. It can still enjoy over a 50% profit margin, primarily thanks to servers. The same is true for Microsoft. The growth market in high tech now is cloud computing, which requires a lot of servers with tremendous amounts of memory.
  • Reply 34 of 56
    rob53 said:
    Max Tech (YouTube) needs to read this. 
    Garbage YouTube channel.
  • Reply 35 of 56
darkvader Posts: 1,146 member

    Apple is now able to produce their own SoC specifically designed for a piece of hardware, they can build custom silicon for every product, from AirPods to AppleCar, something that no other company will be able to do.

    Nope.  Apple has ZERO chip production capability.  Apple builds virtually nothing.  The only actual Apple 'factories' are a final assembly facility in Cork, Ireland and a final assembly facility in Texas.  ALL the components those facilities use are imported, mostly from China.  Everything else Apple 'builds' is actually manufactured by other companies, mostly Foxconn in China.

  • Reply 36 of 56
Soooo… Intel put out fake news and got free hype. 

Reality is the new processor gets its performance the dirty way, by guzzling electricity, and it still basically just matches the M1 Max. 

It's Apple's first effort, and it's a mobile chip versus an Intel desktop-class chip in a huge "mobile" computer - and it takes Intel a quarter of a year to come out with a processor that matches the M series in performance and loses horrifically everywhere else. 

And they had to use a screen that's only HD to artificially make the Intel benchmarks look higher, by pushing 1/4 of the pixels the Mac pushes. What a joke. 
  • Reply 37 of 56
    So I'll go ahead and start the flame war with the cliche, as far as price goes you are *not* comparing apples-to-apples (every pun intended). The MSI has a 17" screen so let's go ahead and actually compare the price of the MSI Raider with 17" screen as seen here (https://www.bhphotovideo.com/c/product/1639843-REG/msi_ge76_raider_11ug_054_17_3_ge76_raider_gaming.html?SID=s1643237475129c3nda52417) which costs $2485 to the 16" MacBook Pro (https://www.apple.com/shop/buy-mac/macbook-pro/16-inch) which costs $3499. Both have 32GB of RAM and 1TB NVMe.

    So maybe a correction or, god forbid, a little honesty in your reporting. Here come the bullet points:
    * Been using macOS since 1988.
    * Also, the 14" M1 Pro Max MacBook Pro with 32GB Ram and 1TB NVMe drive is $2800. So where are you getting your price data from?
    * I don't mind the Apple Fan-boy stuff. Just try to keep it honest.
    * We all know that the M1 is way more power efficient. Period. No arguments from anybody, not even Intel. That isn't my complaint about this article.
You DO know that the 17" screen you tout so highly is only HD, right? 

    It should be illegal to sell such a crappy screen in 17” format in 2022. 

Most likely MSI did that to make the performance benchmarks look better than the chip really is: it only has to push about 1/4 of the pixels Apple's display does. 

Once you get past all the asterisks, Alder Lake isn't looking so hot…
  • Reply 38 of 56
fastasleep Posts: 6,417 member
    darkvader said:

    Apple is now able to produce their own SoC specifically designed for a piece of hardware, they can build custom silicon for every product, from AirPods to AppleCar, something that no other company will be able to do.

    Nope.  Apple has ZERO chip production capability.  Apple builds virtually nothing.  The only actual Apple 'factories' are a final assembly facility in Cork, Ireland and a final assembly facility in Texas.  ALL the components those facilities use are imported, mostly from China.  Everything else Apple 'builds' is actually manufactured by other companies, mostly Foxconn in China.

JFC. They obviously didn't mean literally building it with their own hands. But they design it and TSMC manufactures it. Nobody said anything about Apple chip factories, troll.
  • Reply 39 of 56
fastasleep Posts: 6,417 member
    So I'll go ahead and start the flame war with the cliche, as far as price goes you are *not* comparing apples-to-apples (every pun intended). The MSI has a 17" screen so let's go ahead and actually compare the price of the MSI Raider with 17" screen as seen here (https://www.bhphotovideo.com/c/product/1639843-REG/msi_ge76_raider_11ug_054_17_3_ge76_raider_gaming.html?SID=s1643237475129c3nda52417) which costs $2485 to the 16" MacBook Pro (https://www.apple.com/shop/buy-mac/macbook-pro/16-inch) which costs $3499. Both have 32GB of RAM and 1TB NVMe.

    So maybe a correction or, god forbid, a little honesty in your reporting. Here come the bullet points:
    * Been using macOS since 1988.
    * Also, the 14" M1 Pro Max MacBook Pro with 32GB Ram and 1TB NVMe drive is $2800. So where are you getting your price data from?
    * I don't mind the Apple Fan-boy stuff. Just try to keep it honest.
    * We all know that the M1 is way more power efficient. Period. No arguments from anybody, not even Intel. That isn't my complaint about this article.
You are comparing apples to oranges. The MSI's 17" screen has far inferior resolution to the 16" MacBook Pro's, which has a 5K screen. Further, with far fewer pixels, the MSI's games need to compute far fewer pixels than the MBP does.
    What? As much as I love mine, the 16" MBP doesn't even have a 4K screen — it's 3456x2234 — much less 5K.
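For anyone following the resolution argument, the raw pixel counts being debated look like this (a quick sketch: the MSI panel is taken as 1920x1080 per the earlier comments, the MacBook Pro figure is its native 3456x2234, and 4K/5K are the standard 3840x2160 and 5120x2880):

```python
# Pixel counts for the displays being compared in this thread.
displays = {
    "MSI 1080p panel": (1920, 1080),
    '16" MacBook Pro (3456x2234)': (3456, 2234),
    "4K UHD": (3840, 2160),
    "5K": (5120, 2880),
}

mbp_pixels = 3456 * 2234
for name, (w, h) in displays.items():
    px = w * h
    print(f"{name}: {px:,} pixels ({px / mbp_pixels:.0%} of the MBP's)")
# 1080p is ~27% of the MBP's pixel count (close to the '1/4' claimed above),
# and the MBP sits between 4K (~8.3M) and 5K (~14.7M) -- it is not a 5K panel.
```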
  • Reply 40 of 56
danox Posts: 2,849 member
    Well if one needs GPU performance - I am running into software for example that simply will not run (bricked) without a beefy GPU...

    from the Macworld article:

      59,774   Apple M1 Max 32 core GPU
    143,594  nVidia 3080 Ti

roughly 2.4x the score (i.e. about 140% faster), presumably not 'within margin of error'

    Even more pronounced seem the desktop options (AMD) with the relatively inexpensive nVidia 3060 outperforming passmark scores for many higher priced cards as well as having 12GB VRAM  www.bestbuy.com/site/evga-nvidia-geforce-rtx-3060-xc-gaming-12gb-gddr6-pci-express-4-0-graphics-card/6454329.p?skuId=6454329

    I understood Apple is working on a 'boost' option which may help, and will presumably also ramp up the power and fan requirements...?

    So 2 hours of battery life…..