M3 Ultra could have up to 80 graphics cores

Posted in Current Mac Hardware, edited January 24

Apple's M3 Ultra chip will probably offer users up to 80 graphics cores, a report proposes, with what could be the last remaining M3-generation chip expected to be massively powerful.




The M1 and M2 generations of Apple Silicon established a pattern for the chip families, with the Ultra edition offering roughly twice the core counts and other elements of the Max version. This is because Apple effectively joined two Max chips with an interconnect and called the results the M1 Ultra and M2 Ultra.

In Sunday's "Power On" newsletter for Bloomberg, Mark Gurman latches on to the doubling mechanic in discussing what could end up being the M3 Ultra.

Gurman starts with the proposal that the M3 Ultra could have "a whopping 80 graphics cores," before explaining that each generation of Apple Silicon has a baseline version, a "Pro" with more CPU and GPU cores, a Max with double the Pro's graphics cores, and an Ultra that doubles both types of cores compared to the Max.

For M2, the Pro had up to 12 CPU cores and 19 graphics cores, the M2 Max had 12 CPU cores and 38 graphics cores, and the M2 Ultra went up to 24 CPU cores and 76 graphics cores.

However, "Apple deviated a bit from that approach with the new M3 line," Gurman writes. The M3 Max included more CPU cores as well as a doubling of GPU cores.

"This has implications for the M3 Ultra, which Apple hasn't announced yet," it is proposed. If continues doubling CPU and graphics cores for the Ultra, the Mac chip could end up with 32 CPU cores and 80 graphics cores.

Similarly, if memory is updated again, Apple could include a configuration option for 256 gigabytes.

Gurman believes that the details of the Ultra edition will be known within months as testing commences, and that a launch will happen sometime in 2024.

Rumor Score: Likely

Read on AppleInsider


Comments

  • Reply 1 of 31
    Wonder if the price will be close to the prior generation like last time?

    I may be tempted this time to take my M1 Ultra Mac Studio (128GB & 8TB SSD) and use it as my file server, replacing the aging and soon-to-be-unsupported Intel i7-powered Mac mini (64GB OWC RAM and 2TB soldered SSD). The M1 Mac Studio trade-in value is less than a possible M3 Pro mini with maximum memory and a 2TB SSD, and it will probably be supported for quite a few more years.

    I remember in the '70s that a 32KB memory card from Digital Equipment Corp (DEC) for my PDP-11 cost $2,000. Every generation that followed in the same form factor doubled the amount of memory for the same $2,000. The last one was 4MB. I used 1.5MB for programs and 2.5MB as a virtual swap disc for compiling DIBOL code, which was blazingly faster than any of the spinning platters of the day.
  • Reply 2 of 31
    Xed Posts: 2,580 member
    ApplePoor said:
    Wonder if the price will be close to the prior generation like last time?

    I may be tempted this time to take my M1 Ultra Mac Studio (128GB & 8TB SSD) and use it as my file server, replacing the aging and soon-to-be-unsupported Intel i7-powered Mac mini (64GB OWC RAM and 2TB soldered SSD). The M1 Mac Studio trade-in value is less than a possible M3 Pro mini with maximum memory and a 2TB SSD, and it will probably be supported for quite a few more years.

    I remember in the '70s that a 32KB memory card from Digital Equipment Corp (DEC) for my PDP-11 cost $2,000. Every generation that followed in the same form factor doubled the amount of memory for the same $2,000. The last one was 4MB. I used 1.5MB for programs and 2.5MB as a virtual swap disc for compiling DIBOL code, which was blazingly faster than any of the spinning platters of the day.
    My guess is that it will be close to the M2 Ultra price. While impressive, the number of customers is a small subset of buyers relative to the effort needed to design and test the interconnect, along with testing the full Ultra chip. I'd even go so far as to say that part of their development is simply for marketing the power of Apple Silicon and its scalability.

    Of course, Apple may have found ways to reduce costs along the way, so hopefully it will get a hefty price reduction, but I doubt it.
  • Reply 3 of 31
    And remember, $2,000 in the '70s was much more than $2,000 today.

    But while more cores are nice, I'd prefer more time spent on "smart" utilization of those cores. For example, over the decades there have been many development languages for the Mac; current examples would be Xojo, X, and Swift. But trying to take advantage of those multiple cores is a major challenge with them, and often the overhead, especially when development time and maintenance are factored in, exceeds the benefit.

    A simple answer is, "Well that application is not one fit for parallel processing." But is that always the case? I'd like the compiler to say, "Well, well, well, looks like I have 8 (or more) CPU cores. How can I segment this code so it executes in the shortest possible time?"
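
    Today you have to do that segmentation yourself. Here's a minimal sketch of the manual version using Grand Central Dispatch in Swift; the workload and the chunk math are just an illustration:

    ```swift
    import Foundation

    // Hand-chunked parallel sum: the segmentation work the compiler
    // won't do for us. Workload and chunk sizes are just an illustration.
    let values = Array(1...1_000_000)
    let cores = ProcessInfo.processInfo.activeProcessorCount
    let chunkSize = (values.count + cores - 1) / cores

    var partialSums = [Int](repeating: 0, count: cores)

    // concurrentPerform runs the closure once per index, spread across
    // the available cores, and blocks until every iteration finishes.
    partialSums.withUnsafeMutableBufferPointer { sums in
        DispatchQueue.concurrentPerform(iterations: cores) { i in
            let start = i * chunkSize
            let end = min(start + chunkSize, values.count)
            guard start < end else { return }
            // Each iteration writes only its own slot, so no locking is needed.
            sums[i] = values[start..<end].reduce(0, +)
        }
    }

    print("Total:", partialSums.reduce(0, +))
    ```

    And even that is the easy case: the part no compiler can see is whether the chunks are truly independent.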

    In other words, it seems like the field where it's beneficial to have multiple cores is pretty narrow. Sure, it makes great marketing copy. But it seems that unless one is actively involved in video editing, multiple cores, CPU or GPU, will hardly make any difference.
  • Reply 4 of 31
    Why stop at 80? Why not 90, or 100, or a bazillion? The sky's the limit.
  • Reply 5 of 31
    32 CPU cores and 80 GPU cores and a blazing fast neural engine. If I had the money I would buy it. A computer like this makes me lose all rational thinking, ability to use consonants in speech, mutter 1.21 gigawatts and pine for the fjords.

    My workload use cases do not demand as much horsepower, but I can imagine that it must be an amazing time to work with heavy-lifting workloads. A Mac Studio with the M?Ultra is very reasonably priced; it seems cheap to me, actually. I remember back in the day when people used to pony up a small fortune to get a SPARCstation 5 on their desk.

    Life as a computer enthusiast has never been this good and Apple leads the way. 

    Does anyone feel there is any merit to the rumours of a further interconnect between 2 M?Ultras into something even more extreme? 
     
    Would there even be an idea to package up many M3 Ultras into compute nodes like Nvidia is doing with their chips? The power draw from the M3 Ultra is nothing compared to their chips. Maybe this is something for Apple’s iCloud.
  • Reply 6 of 31
    tht Posts: 5,454 member
    32 CPU cores and 80 GPU cores and a blazing fast neural engine. If I had the money I would buy it. A computer like this makes me lose all rational thinking, ability to use consonants in speech, mutter 1.21 gigawatts and pine for the fjords.

    My workload use cases do not demand as much horsepower, but I can imagine that it must be an amazing time to work with heavy-lifting workloads. A Mac Studio with the M?Ultra is very reasonably priced; it seems cheap to me, actually. I remember back in the day when people used to pony up a small fortune to get a SPARCstation 5 on their desk.

    Life as a computer enthusiast has never been this good and Apple leads the way. 

    Does anyone feel there is any merit to the rumours of a further interconnect between 2 M?Ultras into something even more extreme? 
     
    Would there even be an idea to package up many M3 Ultras into compute nodes like Nvidia is doing with their chips? The power draw from the M3 Ultra is nothing compared to their chips. Maybe this is something for Apple’s iCloud.
    The last rumor on this was that they pushed it out to a 2027 or 2028 type timeframe. The multi-core scaling is already poor going from Max to Ultra configs. It'll be even worse for an Extreme version. They have a lot of work to do to get those GPU cores to scale.

    Doubtful that Apple enters any server hardware market. They basically stick to products and services for consumers. They barely even try to serve the education market as it is. Network servers? That requires even more commitment than the gaming market.
  • Reply 7 of 31
    Um… isn't 80 cores for the M3 Ultra a given? Since the M3 Max has 40?

    Unless the formula has changed since the advent of Apple Silicon, I think 80 cores is expected.
  • Reply 8 of 31
    Marvin Posts: 15,333 moderator
    Does anyone feel there is any merit to the rumours of a further interconnect between 2 M?Ultras into something even more extreme? 
     
    Would there even be an idea to package up many M3 Ultras into compute nodes like Nvidia is doing with their chips? The power draw from the M3 Ultra is nothing compared to their chips. Maybe this is something for Apple’s iCloud.
    The chip images between M1 and M3 look like they have a similar layout. The bottom part of the M1 Pro (the GPU) is doubled in the Max chip, while the CPU is the same 10-core. The GPU in the M1 Pro is 16-core; the Max is 32-core:



    In the M3 image, the Pro chip is upside-down but the same GPU part is roughly doubled: the M3 Pro has an 18-core GPU, the M3 Max a 40-core one. However, the Max chip is wider now and has a bigger CPU (16-core vs 12-core):



    The Ultra chip joins two Max chips along the bottom edge:



    It's likely they will do the same with M3. But there could be an option to put another GPU component in the middle to make a 3x GPU.

    M3 Max is 17 TFLOPs (40-core); 2x would be 34 TFLOPs (80-core), 3x would be 51 TFLOPs (120-core). It would still be short of a 4090 with 3x, but basically the same as a maxed-out 2019 Mac Pro and around the same as a laptop 4090.
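
    For what it's worth, the back-of-envelope scaling looks like this in Swift, assuming throughput grows linearly with core count (real workloads won't quite manage that):

    ```swift
    // Per-core FP32 throughput derived from the 40-core M3 Max figure.
    let m3MaxTFLOPs = 17.0
    let perCoreTFLOPs = m3MaxTFLOPs / 40.0   // ~0.425 TFLOPs per GPU core

    // Linear extrapolation to the rumored 2x and a hypothetical 3x config.
    for cores in [40, 80, 120] {
        print("\(cores)-core GPU: \(Double(cores) * perCoreTFLOPs) TFLOPs")
    }
    // 40-core: 17.0, 80-core: 34.0, 120-core: 51.0
    ```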

    I expect the Mac Studio will top out at a dual chip. The Mac Pro would have more room for an Extreme version with extra GPU cores and could add more memory (384GB). They might not feel that investing in a custom chip is worthwhile for such a small shipment volume though (<10k units).
  • Reply 9 of 31
    Marvin said:
    Does anyone feel there is any merit to the rumours of a further interconnect between 2 M?Ultras into something even more extreme? 
     
    Would there even be an idea to package up many M3 Ultras into compute nodes like Nvidia is doing with their chips? The power draw from the M3 Ultra is nothing compared to their chips. Maybe this is something for Apple’s iCloud.
    The chip images between M1 and M3 look like they have a similar layout. The bottom part of the M1 Pro (the GPU) is doubled in the Max chip, while the CPU is the same 10-core. The GPU in the M1 Pro is 16-core; the Max is 32-core:



    In the M3 image, the Pro chip is upside-down but the same GPU part is roughly doubled: the M3 Pro has an 18-core GPU, the M3 Max a 40-core one. However, the Max chip is wider now and has a bigger CPU (16-core vs 12-core):



    The Ultra chip joins two Max chips along the bottom edge:



    It's likely they will do the same with M3. But there could be an option to put another GPU component in the middle to make a 3x GPU.

    M3 Max is 17 TFLOPs (40-core); 2x would be 34 TFLOPs (80-core), 3x would be 51 TFLOPs (120-core). It would still be short of a 4090 with 3x, but basically the same as a maxed-out 2019 Mac Pro and around the same as a laptop 4090.

    I expect the Mac Studio will top out at a dual chip. The Mac Pro would have more room for an Extreme version with extra GPU cores and could add more memory (384GB). They might not feel that investing in a custom chip is worthwhile for such a small shipment volume though (<10k units).
    Interesting. I posted elsewhere on AI that Apple could surround the Ultra with more GPU cores on the SoC, but actually I think your take is more plausible.

    Until Apple is able to stretch its watt-for-watt lead over competitors' top GPUs far enough to match or surpass the much hotter, more power-hungry discrete GPUs outright, I can see Apple doing this very thing within two years.

    It's amazing that Apple is already around 3090 level with just the Max. The 4090 is crazy though. I have to believe Apple wants to catch that horse too before Nvidia pushes ahead again. But Apple will want to keep efficiency bragging rights, so I don't think they'll quite reach it without adding cores in this manner.

    The trick will be economizing wafers of just GPU cores with interconnects. It will be some tricky math, but completely possible. 

    Actually… Apple could add GPU cores to the Max chips this way too, sort of a "Max+" chip if you will. This would allow for a greater push into not only motion design/video supremacy, but also really enable Apple to go after the gaming market they are showing hints of fresh interest in.
  • Reply 10 of 31
    Marvin Posts: 15,333 moderator
    Marvin said:
    Does anyone feel there is any merit to the rumours of a further interconnect between 2 M?Ultras into something even more extreme? 
     
    Would there even be an idea to package up many M3 Ultras into compute nodes like Nvidia is doing with their chips? The power draw from the M3 Ultra is nothing compared to their chips. Maybe this is something for Apple’s iCloud.
    The chip images between M1 and M3 look like they have a similar layout. The bottom part of the M1 Pro (the GPU) is doubled in the Max chip, while the CPU is the same 10-core. The GPU in the M1 Pro is 16-core; the Max is 32-core:

    In the M3 image, the Pro chip is upside-down but the same GPU part is roughly doubled: the M3 Pro has an 18-core GPU, the M3 Max a 40-core one. However, the Max chip is wider now and has a bigger CPU (16-core vs 12-core):

    The Ultra chip joins two Max chips along the bottom edge:

    It's likely they will do the same with M3. But there could be an option to put another GPU component in the middle to make a 3x GPU.

    M3 Max is 17 TFLOPs (40-core); 2x would be 34 TFLOPs (80-core), 3x would be 51 TFLOPs (120-core). It would still be short of a 4090 with 3x, but basically the same as a maxed-out 2019 Mac Pro and around the same as a laptop 4090.

    I expect the Mac Studio will top out at a dual chip. The Mac Pro would have more room for an Extreme version with extra GPU cores and could add more memory (384GB). They might not feel that investing in a custom chip is worthwhile for such a small shipment volume though (<10k units).
    Interesting. I posted elsewhere on AI that Apple could surround the Ultra with more GPU cores on the SoC, but actually I think your take is more plausible.

    Until Apple is able to stretch its watt-for-watt lead over competitors' top GPUs far enough to match or surpass the much hotter, more power-hungry discrete GPUs outright, I can see Apple doing this very thing within two years.

    It's amazing that Apple is already around 3090 level with just the Max. The 4090 is crazy though. I have to believe Apple wants to catch that horse too before Nvidia pushes ahead again. But Apple will want to keep efficiency bragging rights, so I don't think they'll quite reach it without adding cores in this manner.

    The trick will be economizing wafers of just GPU cores with interconnects. It will be some tricky math, but completely possible. 

    Actually… Apple could add GPU cores to the Max chips this way too, sort of a "Max+" chip if you will. This would allow for a greater push into not only motion design/video supremacy, but also really enable Apple to go after the gaming market they are showing hints of fresh interest in.
    Max+ seems plausible. The bottom edge of M3 looks different from M1; it's like they left out some of the UltraFusion connector and chopped it at the edge of the GPU cores.

    Potentially this would allow them to do multiple variants of the Max/Ultra chip. The base Ultra would just put the UltraFusion on the bottom, but the Extreme version would add more GPU cores and then the UltraFusion connector. That way they have a single UltraFusion connection in both chips.

    Ultra = 2x 40-core Max
    Extreme = 2x 60-core Max+

    It would be a bit much to put a 60-core Max+ in a MBP but it could be an option in the Studio between Max and Ultra around $3k.

    In some tests, the 4090 is 3x the M3 Max, in others it's 4x, so a 3x chip would be competitive in a lot of real-world cases. The Nvidia 5090 isn't expected until 2025, and Apple will probably have an M4 Ultra by then with maybe 25% more cores.

    https://www.kitguru.net/components/graphic-cards/joao-silva/rtx-5090-rumoured-be-about-70-faster-than-the-rtx-4090/

    It would still be fairly competitive with a 5090, taking power usage into consideration. An M4 Extreme would be around 400W total; an Nvidia GPU + Intel/AMD CPU is 700W.
  • Reply 11 of 31
    entropys Posts: 4,169 member
    32 CPU cores and 80 GPU cores and a blazing fast neural engine. If I had the money I would buy it. A computer like this makes me lose all rational thinking, ability to use consonants in speech, mutter 1.21 gigawatts and pine for the fjords.

    Imagine if it was able to draw on 1.21 jigawatts (as Doc Brown would say) to power it.

    Funny story: I let my daughter watch Back to the Future when she was eight.  Later she was telling us that she was very scared when the lesbian terrorists shot the Doc . Took a while to work out what she meant, and then ponder why she knew the word “lesbian”.
  • Reply 12 of 31
    tht said:
    32 CPU cores and 80 GPU cores and a blazing fast neural engine. If I had the money I would buy it. A computer like this makes me lose all rational thinking, ability to use consonants in speech, mutter 1.21 gigawatts and pine for the fjords.

    My workload use cases do not demand as much horsepower, but I can imagine that it must be an amazing time to work with heavy-lifting workloads. A Mac Studio with the M?Ultra is very reasonably priced; it seems cheap to me, actually. I remember back in the day when people used to pony up a small fortune to get a SPARCstation 5 on their desk.

    Life as a computer enthusiast has never been this good and Apple leads the way. 

    Does anyone feel there is any merit to the rumours of a further interconnect between 2 M?Ultras into something even more extreme? 
     
    Would there even be an idea to package up many M3 Ultras into compute nodes like Nvidia is doing with their chips? The power draw from the M3 Ultra is nothing compared to their chips. Maybe this is something for Apple’s iCloud.
    The last rumor on this was that they pushed it out to a 2027 or 2028 type timeframe. The multi-core scaling is already poor going from Max to Ultra configs. It'll be even worse for an Extreme version. They have a lot of work to do to get those GPU cores to scale.

    Doubtful that Apple enters any server hardware market. They basically stick to products and services for consumers. They barely even try to serve the education market as it is. Network servers? That requires even more commitment than the gaming market.


    Ah, I didn't say Apple should join the server market, and I had a lot of implicit things embedded in the statement.

    What I meant is the following:

    Hyperscalers are investing a lot into GPUs from Nvidia, and many of them also build their own server silicon (AWS Graviton, Google TPU, etc.).

    Apple has serious silicon chops and a massive efficiency lead in compute, and maybe also in GPU. Note the power draw of Nvidia's chips as an example.

    Data center energy use is a huge deal for compute density and for running and cooling costs.

    Apple is a hyperscaler in their own right, so it may follow that they build their own server stack using their efficient M series, which may also amortise costs and may be cheaper than buying Nvidia et al.

    Your comments on scaling are a clear signal that this may not work well; further, Nvidia DGX etc. has a lot of advanced interconnects.



  • Reply 13 of 31
    keithw Posts: 141 member
    It seems certain that we’ll get an M3 Ultra.  If they weren’t going to do one, they probably would have already released an M3 Max Studio to go along with the laptops.
  • Reply 14 of 31
    keithw said:
    It seems certain that we’ll get an M3 Ultra.  If they weren’t going to do one, they probably would have already released an M3 Max Studio to go along with the laptops.
    Literally zero doubt. It's a staple. The real speculation is about what Apple will come up with above and beyond that.

    Apple wants to be the place where everyone comes to get the best. Right now, they have that covered except in one area: GPU performance. Even an M3 Ultra, which will blow past most of the big-dog discrete graphics cards from AMD and Nvidia, won't compete at the very highest end. It seems Apple wants to press for that crown, but they're unwilling to compromise on efficiency, which is a great thing. Where they have to go from here is twofold: better architecture and higher core counts.

    Apple may not want to package the Max and Ultra SoC so large with so many more GPU cores. But if they add an interconnected GPU booster (it would be cool if they repurposed the Afterburner brand) with 20-40 more GPU cores, that would be something. It would be especially useful in the case of the Ultra, where the CPU core count could be considered overkill, but it would still allow massive GPU upgrades.

    As it is, the M3 series is fantastic in performance, especially considering the comparatively miserly power draw. The Ultra is going to lay the smack down. If Apple goes beyond that, things will get absolutely insane. It will be enough for Nvidia to be as worried as Intel has already been.

    The challenges lie in economy of scale. Apple has been able to do Apple Silicon because they've already amortized the majority of development via the iPhone every year. The derivatives are a fraction of that cost. Even a GPU core pack using the fabric connection wouldn't be that difficult. But if they did it, they'd have to invest who knows how many wafers and how much capacity to churn them out for who knows how many customers.

    But M3 has made Apple Silicon exciting all over again, and it's a solid enough leap to spur an upgrade cycle as well as entice new customers. The real barrier is inflation. Apple has increased prices in response, no doubt hoping to make up for volume sales losses via profit margins during times of "famine." But for appreciable sales numbers to thrive again, Apple will have to start getting more reasonable with pricing at some point. Perhaps by the M4 generation, when 3nm production will be more of a known quantity and no longer novel. Perhaps that will result in a product lineup shift as well (large iMac return, etc). That would be a safer market in which to start playing with GPU core boosters and the like.

    But back to today and the subject immediately at hand: Apple is 100% releasing the staple M3 Ultra and it will have an option of 80 cores. It's literally two M3 Max SoCs connected together, as has been the case since M1. The only question mark is whether Apple finds a way to add MORE than 80 cores in this generation.
  • Reply 15 of 31
    And remember, $2,000 in the '70s was much more than $2,000 today.

    But while more cores are nice, I'd prefer more time spent on "smart" utilization of those cores. For example, over the decades there have been many development languages for the Mac; current examples would be Xojo, X, and Swift. But trying to take advantage of those multiple cores is a major challenge with them, and often the overhead, especially when development time and maintenance are factored in, exceeds the benefit.

    A simple answer is, "Well that application is not one fit for parallel processing." But is that always the case? I'd like the compiler to say, "Well, well, well, looks like I have 8 (or more) CPU cores. How can I segment this code so it executes in the shortest possible time?"

    In other words, it seems like the field where it's beneficial to have multiple cores is pretty narrow. Sure, it makes great marketing copy. But it seems that unless one is actively involved in video editing, multiple cores, CPU or GPU, will hardly make any difference.
    You describe the challenges of every SMP architecture that exists. Code needs to be written specifically to take advantage of parallelism; it's more than just a compiler task. For 90 percent of the Mac market, what matters most is the single-core performance of the CPUs. The marketing folks at Apple love to emphasize core counts. As you grow the number of cores you add capacity, but you also add overhead managing the dispatching of tasks to those cores.

    Apple likes to emphasize their power efficiency, but that comes at a performance cost. The highest throughput is achieved if all cores are high-performance cores. Unless you are doing video editing, which has programs designed for parallelism, or gaming, which can leverage multiple GPUs, the throughput one will experience will be defined by the single-thread, single-core performance of the CPU. There are some scientific programs that can benefit from multi-core SMP, such as gene sequencing, but that's not the market Apple targets.
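
    That ceiling has a name: Amdahl's law. If only a fraction p of a program can run in parallel, n cores can never speed it up beyond 1/(1 - p). A quick Swift sketch, where the 90% figure is just an illustration:

    ```swift
    import Foundation

    // Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
    // parallel fraction of the program and n is the core count.
    func amdahlSpeedup(parallelFraction p: Double, cores n: Double) -> Double {
        1.0 / ((1.0 - p) + p / n)
    }

    // Even a 90%-parallel program tops out below 10x,
    // no matter how many cores you throw at it.
    for n in [4.0, 8.0, 32.0, 1024.0] {
        let s = amdahlSpeedup(parallelFraction: 0.9, cores: n)
        print("\(Int(n)) cores -> \(String(format: "%.1f", s))x")
    }
    // 4 -> 3.1x, 8 -> 4.7x, 32 -> 7.8x, 1024 -> 9.9x
    ```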
  • Reply 16 of 31
    It seems all four M1 chips had nearly the same single-core performance. We saw that all four of the M2 chips also had similar single-core speeds, and there was about a 15% increase in the average number. So, for me, there was no incentive to upgrade to the M2 series. The first three of four M3 chips repeat the average single-core speed concept of the M1 and M2, but there was a larger percentage increase in the average single-core speed over the M2.

    I am intrigued that an M3 Max MacBook Pro can now be acquired that is nearly as powerful as my M1 Mac Studio (128GB and 8TB SSD), with the same amount of memory and SSD, in just two generations. I thought I had really future-proofed my M1 Mac Studio with those top-of-the-line options. And the fully configured MacBook Pro price is fairly close to my Mac Studio price.

    I thought we had a good performance boost going from the 68030 in my IIci to the 68040 in my IIfx. The Intel years were snoozers in terms of performance increases over time.

    But I am concerned about how many more rabbits are in the hat for the M series to continue this pace of spec improvements. The crowd that needs and can afford the top models is far smaller than the group with more modest needs and prices. With the lower operating temperatures of the M series computers, one could expect much longer service lives from M devices.

    Like the iPhone, the incremental changes are getting smaller and the crowd seems to want to spread their acquisition cost over more years.

    So could Apple turn off operating system support for the M1 series at some future time, like they will for Intel chips? And how many generations of M series will be supported?

    The computing power necessary for the average non-pro user is already exceeded. One usually does not see a Ferrari V8 in a Karmann Ghia.
  • Reply 17 of 31
    y2an Posts: 189 member
    Let me think. 2 x N = …? Duh. What genius figured this one out!
  • Reply 18 of 31
    netrox Posts: 1,422 member
    ApplePoor said:
    Wonder if the price will be close to the prior generation like last time?

    I may be tempted this time to take my M1 Ultra Mac Studio (128GB & 8TB SSD) and use it as my file server, replacing the aging and soon-to-be-unsupported Intel i7-powered Mac mini (64GB OWC RAM and 2TB soldered SSD). The M1 Mac Studio trade-in value is less than a possible M3 Pro mini with maximum memory and a 2TB SSD, and it will probably be supported for quite a few more years.

    I remember in the '70s that a 32KB memory card from Digital Equipment Corp (DEC) for my PDP-11 cost $2,000. Every generation that followed in the same form factor doubled the amount of memory for the same $2,000. The last one was 4MB. I used 1.5MB for programs and 2.5MB as a virtual swap disc for compiling DIBOL code, which was blazingly faster than any of the spinning platters of the day.
    Just why? You don't need that much RAM or CPU for a file server! I have a base Mac mini as a file server and it's very cool (sipping only about 10 watts on average). A file server does not take much RAM at all; it's just transferring files.

    The Ultra Mac is built for media creation, especially video and photography, not for file serving.
  • Reply 19 of 31
    ApplePoor said:
    It seems all four M1 chips had nearly the same single-core performance. We saw that all four of the M2 chips also had similar single-core speeds, and there was about a 15% increase in the average number. So, for me, there was no incentive to upgrade to the M2 series. The first three of four M3 chips repeat the average single-core speed concept of the M1 and M2, but there was a larger percentage increase in the average single-core speed over the M2.

    I am intrigued that an M3 Max MacBook Pro can now be acquired that is nearly as powerful as my M1 Mac Studio (128GB and 8TB SSD), with the same amount of memory and SSD, in just two generations. I thought I had really future-proofed my M1 Mac Studio with those top-of-the-line options. And the fully configured MacBook Pro price is fairly close to my Mac Studio price.

    I thought we had a good performance boost going from the 68030 in my IIci to the 68040 in my IIfx. The Intel years were snoozers in terms of performance increases over time.

    But I am concerned about how many more rabbits are in the hat for the M series to continue this pace of spec improvements. The crowd that needs and can afford the top models is far smaller than the group with more modest needs and prices. With the lower operating temperatures of the M series computers, one could expect much longer service lives from M devices.

    Like the iPhone, the incremental changes are getting smaller and the crowd seems to want to spread their acquisition cost over more years.

    So could Apple turn off operating system support for the M1 series at some future time, like they will for Intel chips? And how many generations of M series will be supported?

    The computing power necessary for the average non-pro user is already exceeded. One usually does not see a Ferrari V8 in a Karmann Ghia.
    The M1 architecture is the same as the M3's. There's no reason Apple can't support the M1 as long as it supports the M3.
  • Reply 20 of 31
    kellie said:
    ApplePoor said:
    It seems all four M1 chips had nearly the same single-core performance. We saw that all four of the M2 chips also had similar single-core speeds, and there was about a 15% increase in the average number. So, for me, there was no incentive to upgrade to the M2 series. The first three of four M3 chips repeat the average single-core speed concept of the M1 and M2, but there was a larger percentage increase in the average single-core speed over the M2.

    I am intrigued that an M3 Max MacBook Pro can now be acquired that is nearly as powerful as my M1 Mac Studio (128GB and 8TB SSD), with the same amount of memory and SSD, in just two generations. I thought I had really future-proofed my M1 Mac Studio with those top-of-the-line options. And the fully configured MacBook Pro price is fairly close to my Mac Studio price.

    I thought we had a good performance boost going from the 68030 in my IIci to the 68040 in my IIfx. The Intel years were snoozers in terms of performance increases over time.

    But I am concerned about how many more rabbits are in the hat for the M series to continue this pace of spec improvements. The crowd that needs and can afford the top models is far smaller than the group with more modest needs and prices. With the lower operating temperatures of the M series computers, one could expect much longer service lives from M devices.

    Like the iPhone, the incremental changes are getting smaller and the crowd seems to want to spread their acquisition cost over more years.

    So could Apple turn off operating system support for the M1 series at some future time, like they will for Intel chips? And how many generations of M series will be supported?

    The computing power necessary for the average non-pro user is already exceeded. One usually does not see a Ferrari V8 in a Karmann Ghia.
    The M1 architecture is the same as the M3's. There's no reason Apple can't support the M1 as long as it supports the M3.
    There’s no technical reason but there’s always the greed reason. 