Why Apple uses integrated memory in Apple Silicon -- and why it's both good and bad

Posted:
in macOS

Apple's Unified Memory Architecture first brought changes to the Mac with Apple Silicon M1 chips. There are clear architectural benefits for the hardware -- and it is both good and bad for consumers. Here's why.




Apple's Unified Memory Architecture (UMA) was announced in June 2020 along with its new Apple Silicon CPUs. UMA has a number of benefits compared to more traditional memory approaches and represents a revolution in both performance and size.

In traditional desktop and laptop computer design, the main system memory, known as RAM, sits on a system bus that is separate from the CPU and the GPU.

A bus controller is typically required, which uses interrupts when the CPU needs data from the main system memory. Interrupts are hardware signals that different parts of a computer use to pause other parts of the system while a task is performed.

Interrupts cause delays in system processing.

So for example, every time the CPU needs to access data in memory or every time the screen needs to be refreshed, an interrupt is generated, the system pauses, and the task completes. When the task is done, the system resumes general processing.
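This pause-and-resume pattern can be modeled in a few lines of Python. The cycle counts below are made-up illustrative numbers, not real hardware figures:

```python
# A toy model of interrupt-driven memory access: every memory request
# pauses the CPU's main work for a fixed service cost. The cycle
# counts here are assumptions for illustration only.

SERVICE_CYCLES = 10  # assumed cost of pausing for one interrupt

def total_cycles(work_cycles, memory_requests):
    """Useful work plus the time lost to interrupt-driven pauses."""
    return work_cycles + memory_requests * SERVICE_CYCLES

# 1000 cycles of real work plus 50 memory accesses: a third of the
# total time goes to servicing interrupts rather than computing.
print(total_cycles(1000, 50))  # 1500
```

Even with these toy numbers, the point is visible: the more often the system has to stop and service an interrupt, the smaller the fraction of time spent doing useful work.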

Direct Memory Access (DMA) -- a scheme in which some computer subsystems can access memory independently of the CPU -- was introduced later, but due to motherboard sizes and distances, RAM access can still be slow.

In DMA the CPU initiates a memory transfer, then performs other work. When the DMA operation is complete, the memory controller generates an interrupt signalling the CPU that data is ready.
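The DMA handshake can be sketched in Python, with a thread standing in for the DMA controller and an event standing in for the completion interrupt. All names here are illustrative, not a real driver API:

```python
# Sketch of the DMA flow: the CPU initiates a transfer, keeps working,
# and a completion signal (standing in for the interrupt) fires when
# the controller finishes. Purely an illustration.
import threading

def dma_transfer(src, dst, done_event):
    # The "controller" moves the data off the CPU's critical path
    dst.extend(src)
    done_event.set()  # raise the completion interrupt

source = [1, 2, 3, 4]
destination = []
done = threading.Event()

controller = threading.Thread(target=dma_transfer, args=(source, destination, done))
controller.start()           # CPU initiates the memory transfer...

busy_work = sum(range(100))  # ...then performs other work in the meantime

done.wait()                  # completion interrupt: the data is ready
controller.join()
print(destination)           # [1, 2, 3, 4]
```

The CPU never copies the data itself; it only starts the operation and is notified when it completes.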

RAM access is just one source of interrupts in traditional computer architecture. In general, the more buses and interrupts a system has, the more performance bottlenecks there are.

System on a chip



Graphics Processing Units (GPUs) and game consoles have long solved this problem by integrating components into single chips, which eliminates buses and interrupts. GPUs, for example, usually have their own RAM attached directly to the card, which speeds up processing and allows for faster graphics.

This System on a Chip (SoC) design is the new trend in system and CPU design because it both increases speed and reduces component counts - which reduces the overall cost of products.

It also allows systems to be smaller. Smartphones have long used SoC designs to reduce size and save power, such as with Apple's own iPhone ARM SoC.

Sony's PlayStation 2 was the first consumer game console to ship with an integrated SoC called the Emotion Engine which integrated over a dozen traditional components and subsystems onto a single die.

Apple's M1 and M2 ARM-based chip designs are similar. They are essentially SoC designs that integrate CPU cores, GPU cores, and other components onto a single die, with the main RAM mounted in the same package, right next to the SoC.

With this design, instead of the CPU having to access RAM contents across a memory bus, the RAM is connected directly to the CPU. When the CPU needs to store or retrieve data in RAM it simply goes directly to the RAM chips.

With this change, bus interrupts for main memory access are largely eliminated.

Apple's M1 integrated architecture.



This design eliminates RAM bus bottlenecks, which vastly improves performance. M1 Max, for example, provides 400GB/sec of memory throughput - approaching that of modern game consoles such as Sony's PlayStation 5.
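Some back-of-envelope arithmetic shows what that bandwidth means in practice. The 400GB/sec figure is Apple's quoted M1 Max number; the 51.2GB/sec comparison point (typical dual-channel DDR4-3200) is an assumption for illustration:

```python
# Back-of-envelope transfer times at different memory bandwidths.
# 400 GB/s is Apple's quoted M1 Max figure; 51.2 GB/s (typical
# dual-channel DDR4-3200) is an assumed PC comparison point.

def transfer_ms(num_bytes, gb_per_s):
    """Milliseconds to move num_bytes at a given bandwidth."""
    return num_bytes / (gb_per_s * 1e9) * 1e3

frame = 7680 * 4320 * 4  # one uncompressed 8K RGBA frame, ~133 MB

print(round(transfer_ms(frame, 400.0), 3))  # 0.332 ms on an M1 Max
print(round(transfer_ms(frame, 51.2), 3))   # 2.592 ms over dual-channel DDR4
```

Moving a full 8K frame in a third of a millisecond leaves plenty of headroom inside a 16ms frame budget; at typical PC memory bandwidth, the same copy eats a noticeable slice of it.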

SoC integration is one of the main reasons the M1 and M2 series of CPUs are so fast - and why modern console game-level graphics are finally coming to the Mac.

It's why macOS finally feels snappy and responsive after feeling slightly rubbery for decades.

SoCs also vastly reduce power consumption and heat, which makes them ideal for laptops, phones, tablets, and other portable devices. Less heat also means the components last longer and suffer less material degradation over time.

Heat affects system performance over time as it slowly degrades the properties of the materials in components, which leads to slightly lower performance. This is one of the reasons why very old computers seem to "slow down" with time, and a big cause of failures.

"Heat is the enemy of electronics" as they say in the EE world.

Apple's M1 CPU with integrated RAM.

The downside of integrated memory in Apple Silicon



While Apple's SoC designs have proven to be vast improvements over its traditional designs, there are some downsides.

The first, and most obvious, is upgrades -- with the system RAM mounted in the same package as the CPU, there's no way to upgrade the RAM later, except to replace the entire SoC package - which, with modern Surface-Mount Device (SMD) soldering technology, you probably wouldn't want to attempt.

Earlier Mac models had banks of RAM DIMMs (Dual in-line Memory Modules) or memory "sticks" which could be swapped in and out for larger sizes to upgrade memory.

With Apple Silicon, that option goes away, since the RAM chips are bonded into the same package as the CPU. When you buy an Apple Silicon Mac, you're stuck with whatever RAM size you initially ordered.

Another downside is that if either the RAM or the CPU fails, the whole package fails. There's no replacing just one part; the entire assembly has to go.

Modern Mac motherboards are tiny and populated mostly with SMD components. In most cases, it's cheaper and faster to replace the entire board, or just buy a new Mac.

Another, and equally obvious downside to SoCs is that the use of integrated GPUs means there's no way to upgrade your Mac's graphics card later for a faster or bigger version. And with Apple dropping support for external Thunderbolt GPU expansion boxes in Apple Silicon, even external GPU expansion is no longer an option.

What all of this means, of course, is that modern Macs are becoming more and more like "appliances" than like computers as we have traditionally thought of them.

Overall, this is a good thing.

It means you're going to want to buy a new Mac every few years, but the performance improvements make that upgrade path worth it. Compared to Apple's old Intel-based traditional architecture, Apple Silicon is a complete revolution in terms of performance.

As systems get smaller and smaller, so will devices. Laptops will become thinner and lighter, and battery life will continue to improve - even as performance improves over time.

In a few years' time, there's no doubt Apple will have advanced Apple Silicon far enough that a new Mac will make the cost worth it. Time is money, and the amount of work you can get done on modern Macs using Apple Silicon far outweighs the cost of upgrading.

You can read more about the technical details of Apple Silicon on Apple's developer website.

Read on AppleInsider


Comments

  • Reply 1 of 68
    lam92103 Posts: 126 member
    So every single PC or computer manufacturer can use modular RAM. Including servers, workstations, data centers, super computers. 

    But somehow the Apple chips cannot and are trying to convince us that it is not just plain & simple greed??
  • Reply 2 of 68
    hmlongco Posts: 537 member
    Misses the other major benefit of SOC integrated memory. In a traditional system if I want to move an image from memory to a graphics card that image data has to be copied from the CPU's RAM to the RAM on the GPU, byte by byte. Similarly, if I want the GPU to perform some action on that image and return it then the result needs to be copied once more from the GPU back to the CPU.

    In Apple's SOC design, you do little more than hand the address of the data in RAM to the GPU, which then can perform the operation in place. You get tremendous gains in throughput when you don't have to copy data back and forth.
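The copy-versus-share distinction above can be sketched in a few lines of Python, with a memoryview standing in for "handing the GPU an address". This is purely an illustration - real GPU APIs such as Metal work differently:

```python
# Copy-vs-share: a discrete GPU duplicates the bytes twice, while a
# shared buffer lets another "device" operate in place. memoryview is
# a stand-in here; this is not how a real GPU API looks.

image = bytearray([10, 20, 30, 40])      # pretend this is pixel data in RAM

# Discrete-GPU style: copy over, operate, copy back
gpu_copy = bytes(image)                          # upload: copy #1
image_out = bytearray(b * 2 for b in gpu_copy)   # result copied back: copy #2

# Unified-memory style: share the address and operate in place
view = memoryview(image)
for i in range(len(view)):
    view[i] *= 2                         # no copies: same bytes in RAM

print(list(image))      # [20, 40, 60, 80] - mutated in place
print(list(image_out))  # [20, 40, 60, 80] - same result, two copies later
```

Both paths compute the same result, but the shared-buffer path never duplicates the data.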
  • Reply 3 of 68
    lam92103 said:
    So every single PC or computer manufacturer can use modular RAM. Including servers, workstations, data centers, super computers. 

    But somehow the Apple chips cannot and are trying to convince us that it is not just plain & simple greed??
    Yes.  
  • Reply 4 of 68
    hmurchison Posts: 12,425 member
    I see a lot of this stuff as a stopgap solution.  UMA is nice but the industry realizes that no single memory solution is going to cover all bases. 

    It appears that PCI-Express 6 will be amongst the first to really start to usher in CXL.  Anandtech has a good write-up: 

    https://www.anandtech.com/show/17520/compute-express-link-cxl-30-announced-doubled-speeds-and-flexible-fabrics 

    I've seen too much kvetching about "Who is the Mac Pro for?", which is understandable, but moving to the next generation of memory types isn't being taken lightly, and the basic question is "why should a next generation computing platform have to choose?" 

    The system should be able to leverage volatile, persistent, DRAM-level and RAM-level speeds without the application or the end user fiddling. 

    In 5 years you shouldn't have to ponder how much memory you need to purchase, as the science should make the question relatively obsolete - you should be able to extend the memory, albeit at perhaps slightly slower speeds. 


  • Reply 5 of 68
    lam92103 said:
    So every single PC or computer manufacturer can use modular RAM. Including servers, workstations, data centers, super computers. 

    But somehow the Apple chips cannot and are trying to convince us that it is not just plain & simple greed??
    Yes.  
    That's what the whole article is about. That "somehow." Reread the article, and it explains why that type of memory is not user upgradable. That's the whole point. It's much more efficient. The GPU on an Apple computer can easily have access to over 128GB of RAM. How much would a standalone GPU card with 128GB on the card itself cost for your PC? Do they even make that?
  • Reply 6 of 68
    DuhSesame Posts: 1,278 member
    lam92103 said:
    So every single PC or computer manufacturer can use modular RAM. Including servers, workstations, data centers, super computers. 

    But somehow the Apple chips cannot and are trying to convince us that it is not just plain & simple greed??
    Yes.  Mac users are fine with it.

    It doesn't hurt to buy both.  Don't force each to be the other.
  • Reply 7 of 68
    auxio Posts: 2,728 member
    lam92103 said:
    So every single PC or computer manufacturer can use modular RAM. Including servers, workstations, data centers, super computers. 

    But somehow the Apple chips cannot and are trying to convince us that it is not just plain & simple greed??
    And this really gets at the core of the mindset of PC users who have an irrational hatred of Apple: we do it this way, why does Apple think they're special?

    That mindset typically carries into other areas of life too: why doesn't everyone speak the same language, worship the same god, look the same, etc, etc. They feel this need for everything to be the same, and for some reason want to force everything to be that way. I'd really like to know the reason why, because for myself, diversity is what makes life interesting. And in both nature and technology, it's proven to have great benefits for survival and progress.
  • Reply 8 of 68
    mfryd Posts: 216 member
    I see a lot of this stuff as a stopgap solution.  UMA is nice but the industry realizes that no single memory solution is going to cover all bases 

    ...


    Yes.  No single solution will serve all users.   Apple is not trying to sell the Mac as a solution for every possible situation.   Apple is targeting the huge mainstream part of the computer market.   The vast majority of computer owners never upgrade components.  They stick with the original RAM/storage, and eventually just replace the whole computer.


    Apple is not going after the low end of the market.  Nor is Apple going after the ultra high end where users really do need terabytes of RAM, or multiple high performance GPUs.    

    Think of the car market.   Most car owners stick with the original rims, suspension, radio, radiator, engine, etc.   Even though many cars can be upgraded with performance boosting chips, fancier rims, even more powerful engines.   Most car owners are not interested in these sorts of upgrades, and will keep the car as they bought it, until they replace it.

    The Mac is not the best solution for everyone.  However, Apple is trying to make it the best solution for most.

  • Reply 9 of 68
    goofy1958 Posts: 165 member
    I've been a Windows PC user my whole life until I got a MacBook from my company to use for a year.  I fell in love with it, so my next PC will be the new MacBook Air 15" with maxed out specs on memory and storage.  It will be more than enough computer for my needs for the foreseeable future. I already have an iPhone, Airpods Pro, Apple TV 4k, and a pair of Homepods, so will be nice to round out my Apple collection.
  • Reply 10 of 68
    melgross Posts: 33,510 member
    Ok, so the writer gets it wrong, as so many others have when it comes to the M series RAM packaging. One would think that this simple thing would be well understood by now. So let me make it very clear - the RAM is NOT on the chip. It is NOT "in the CPU itself". As we should all know by now, it's in two packages soldered to the substrate, which is the small board that the SoC is itself soldered to. The lines from Apple's fabric, which everything on the chip is connected with, extend to that substrate, to the RAM chips. Therefore, the RAM chips are separate from the SoC, and certainly not in the CPU itself.

    As we also know, Apple offers several different levels of RAM for each M series they sell. That means that there is no limit to their ability to decide how much RAM they can offer, up to the number of memory lines that can be brought out. This is no different from any traditional computer. Every CPU and memory controller has a limit as to how much RAM can be used.

    So, it seems to me that Apple could, if it wanted to, have sockets for those RAM packages, which add no latency, and would allow exchangeable RAM packages. Apple would just have to extend the maximum number of memory lines out to the socket. How many would get used would depend on the amount of RAM in the package. That's nothing new. That's how it's done. Yes, under that scheme you would have to remove a smaller RAM package when getting a larger one, but that's also normal. The iMac had limited RAM slots and we used to do that all the time. Apple could also add an extra two sockets, in addition to the RAM that comes with the machine. So possibly there would be two packages soldered to the substrate, and two more sockets for RAM expansion.

    Remember that Apple sometimes does something a specific way, not because that's the way it has to be done, but because they decided that this was the way they were going to do it. We don't know where Apple is going with this in the future. It's possible that the M2, which is really just a bump from the M1, is something to fill in the time while we're waiting for the M3, which, with the 3nm process it's being built on, is expected to be more than just another bump in performance. Perhaps an extended RAM capability is part of that.
  • Reply 11 of 68
    mfryd Posts: 216 member
    melgross said:
    Ok, so the writer gets it wrong, as so many others have when it comes to the M series RAM packaging. ...
    Actually, moving the memory further away from the CPU does add latency.  Every foot of wire adds about a nanosecond of delay.

    Then there is the issue of how many wires you run.  When the memory is physically close to the CPU you can run more wires from the memory to the CPU, this allows you to get data to/from the CPU faster.   It's not practical to run a large number of wires to a socket that might be a foot or more of cable run away.  That means you transfer less data in each clock cycle.
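The "about a nanosecond per foot" rule of thumb checks out against a back-of-envelope calculation. The velocity factor below is an assumed typical value for PCB traces, not a measurement of any real board:

```python
# Checking the nanosecond-per-foot rule of thumb against the physics.
# The velocity factor is an assumed typical PCB value (~0.6-0.7).

LIGHT_M_PER_NS = 0.2998  # speed of light, metres per nanosecond
FOOT_IN_M = 0.3048

def one_way_delay_ns(length_ft, velocity_factor=0.66):
    """Signal propagation delay over a trace of the given length."""
    return length_ft * FOOT_IN_M / (LIGHT_M_PER_NS * velocity_factor)

# In vacuum, a foot is almost exactly a light-nanosecond; on a board
# the signal travels slower, so the real delay is somewhat worse.
print(round(one_way_delay_ns(1.0), 2))  # 1.54 ns per foot
```

So for a round trip over a foot of socketed-memory wiring, you lose a few nanoseconds before the memory itself even responds.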

    Generally socketed memory is on an external bus.  This lets various peripherals directly access memory.  The bus arbitration also adds overhead.


    Traditional CPUs try to overcome these memory bottlenecks by using multiple levels of cache.  This can provide a memory bandwidth performance boost for chunks of recently accessed memory.  However, tasks that use more memory than will fit in the cache, may not benefit from these techniques.
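The working-set cliff described above can be shown with a toy LRU cache. The sizes here are arbitrary illustrative numbers, not real cache geometries:

```python
# A toy LRU cache: working sets that fit in the cache hit almost every
# time; larger sets thrash. Sizes are illustrative assumptions.
from collections import OrderedDict

def hit_rate(cache_lines, working_set, passes=4):
    """Hit rate for repeated sequential sweeps over a working set."""
    cache = OrderedDict()
    hits = accesses = 0
    for _ in range(passes):
        for addr in range(working_set):
            accesses += 1
            if addr in cache:
                hits += 1
                cache.move_to_end(addr)        # mark most recently used
            else:
                cache[addr] = True
                if len(cache) > cache_lines:
                    cache.popitem(last=False)  # evict least recently used
    return hits / accesses

print(hit_rate(cache_lines=64, working_set=32))   # 0.75: misses only on the first pass
print(hit_rate(cache_lines=64, working_set=128))  # 0.0: every line is evicted before reuse
```

Once the working set exceeds the cache, a sequential sweep with LRU eviction gets no hits at all - which is why cache-unfriendly workloads fall back on raw memory bandwidth.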

    Apple's "System on a Chip" design really does allow much higher memory bandwidth.   Socketing the memory really would reduce performance.
  • Reply 12 of 68
    sflocal Posts: 6,096 member
    lam92103 said:
    So every single PC or computer manufacturer can use modular RAM. Including servers, workstations, data centers, super computers. 

    But somehow the Apple chips cannot and are trying to convince us that it is not just plain & simple greed??
    Just stop.
  • Reply 13 of 68
    danox Posts: 2,875 member
    lam92103 said:
    So every single PC or computer manufacturer can use modular RAM. Including servers, workstations, data centers, super computers. 

    But somehow the Apple chips cannot and are trying to convince us that it is not just plain & simple greed??

    Yes, and it will be fun to see the other companies try to come up with an answer to the Apple Vision Pro (small now, and soon to get smaller) with their traditional ways of designing a computer system. Apple is a vertically integrated computer company; the rest, the mainstream companies that Apple is in competition with, are not.
  • Reply 14 of 68
    rob53 Posts: 3,253 member
    lam92103 said:
    So every single PC or computer manufacturer can use modular RAM. Including servers, workstations, data centers, super computers. 

    But somehow the Apple chips cannot and are trying to convince us that it is not just plain & simple greed??
    Do phones have modular RAM? Do watches have modular RAM? Do all electronic devices that use RAM have modular RAM? (All of these are considered computers or have computers inside them.)

    Nope
  • Reply 15 of 68
    StrangeDays Posts: 12,886 member
    lam92103 said:
    So every single PC or computer manufacturer can use modular RAM. Including servers, workstations, data centers, super computers. 

    But somehow the Apple chips cannot and are trying to convince us that it is not just plain & simple greed??
    Where do I plug RAM into my iPad? iPhone?

    Yeah, not a greed thing. It’s an architecture thing. One line of extremely high performing chips for their product line is just smart. 
    edited June 2023
  • Reply 16 of 68
    rob53 Posts: 3,253 member
    This article was written by Chip Loder. First,  I want to ask whether this is a real person's name. Second, how much computer stock does this person own? 

    Also, when was the last time any of you commenters had an issue with modular RAM or the CPU? I'll only accept issues from the last 10-15 years. Both of these components perform very well, as long as you get them from reputable manufacturers. I have my daughter's original, white iBook and it still runs. The battery might have been replaced, but nothing else. 

    When you look at the speed and capability of Apple's SoC and their unified memory, it's much faster and more capable than Apple's Intel Macs. People need to be open minded with this new architecture. Also take into account the ridiculously low wattage requirements. Intel and NVIDIA devices can be used to cook steaks on.
  • Reply 17 of 68
    DuhSesame Posts: 1,278 member
    mfryd said:
    melgross said:
    Ok, so the writer gets it wrong, as so many others have when it comes to the M series RAM packaging. ...
    Actually, moving the memory further away from the CPU does add latency.  Every foot of wire adds about a nanosecond of delay. ...
    The real catch is that Apple's CPUs are really good at memory-intensive workloads, thus there's a need for a high-bandwidth design.

    Read Anandtech's article if anyone is interested.  The gcc performance of the M1 Max is comparable (not surpassing, but close) to the 12900K, and that's with six fewer cores.

    Oh, and of course, Apple is designing their chips with a strict thermal limit (no higher than 5W per core), whereas the 12900K is pretty much at its peak.

    I'd give up RAM sticks just for that performance.  If I want a module, I'll build a PC.
    edited June 2023
  • Reply 18 of 68
    DuhSesame Posts: 1,278 member
    The thing is, people are so used to memory modules that they start to think it's the only way.

    That's far from the truth, especially when a soldered, proprietary system is required to give a huge performance boost.

    The biggest strength of Wintel is that they're standardized, but with every standard, they're limiting themselves.  Intel does want something like Apple's UMA, but imagine building a PC like that.  No customer would be happy.
    edited June 2023
  • Reply 19 of 68
    auxio Posts: 2,728 member
    DuhSesame said:

    The biggest strength of Wintel is that they're standardized, but with every standard, they're limiting themselves.  Intel does want something like Apple's UMA, but imagine building a PC like that.  No customer would be happy.
    The biggest strength from Wintel is that they maintain backwards compatibility for a very long time (software written for Windows 30 years ago often still runs). This is why enterprise loves them, but it's also what keeps the platform from being used anywhere but traditional PCs/laptops/servers. They're basically the Big Blue of this generation.
  • Reply 20 of 68
    lam92103 said:
    So every single PC or computer manufacturer can use modular RAM. Including servers, workstations, data centers, super computers. 

    But somehow the Apple chips cannot and are trying to convince us that it is not just plain & simple greed??
    That was my initial thought as well, as I always bought the highest performing Mac with the least amount of RAM and upgraded it myself.  As I've gotten MacBooks for my wife and kids, now I just add that into the cost, and since these things last them 5-6 years, the cost per year is much smaller.

    I say that on the latest Mac I've bought myself: a 2014 5K Retina iMac with 24GB of RAM that is on its last legs.