mjtomlin

About

Username
mjtomlin
Visits
192
Roles
member
Points
4,861
Badges
2
Posts
2,699
  • Why Apple uses integrated memory in Apple Silicon -- and why it's both good and bad

    melgross said:
    Ok, so the writer gets it wrong, as so many others have when it comes to the M series RAM packaging. One would think that this simple thing would be well understood by now. So let me make it very clear - the RAM is NOT on the chip. It is NOT “in the CPU itself”. As we should all know by now, it’s in two packages soldered to the substrate, which is the small board that the SoC itself is soldered to. The lines from Apple’s fabric, which everything on the chip is connected with, extend to that substrate, to the RAM chips. Therefore, the RAM chips are separate from the SoC, and certainly not in the CPU itself.

    As we also know, Apple offers several different levels of RAM for each M series they sell. That means there is no limit to their ability to decide how much RAM they can offer, up to the number of memory lines that can be brought out. This is no different from any traditional computer. Every CPU and memory controller has a limit as to how much RAM can be used. So, it seems to me that Apple could, if it wanted to, have sockets for those RAM packages, which add no latency, and would allow exchangeable RAM packages. Apple would just have to extend the maximum number of memory lines out to the socket. How many would get used would depend on the amount of RAM in the package. That’s nothing new. That’s how it’s done. Yes, under that scheme you would have to remove a smaller RAM package when getting a larger one, but that’s also normal. The iMac had limited RAM slots and we used to do that all the time. Apple could also add an extra two sockets, in addition to the RAM that comes with the machine. So possibly there would be two packages soldered to the substrate, and two more sockets for RAM expansion.

    Remember that Apple sometimes does something a specific way, not because that’s the way it has to be done, but because they decided that this was the way they were going to do it. We don’t know where Apple is going with this in the future. It’s possible that the M2, which is really just a bump from the M1, is something to fill in the time while we’re waiting for the M3, which, with the 3nm process it’s being built on, is expected to be more than just another bump in performance. Perhaps an extended RAM capability is part of that.

    Yes. It is common knowledge that the RAM is not part of the actual SoC; it's on-package with the SoC. People use the term "SoC" to describe the whole part ("M1", "M2"), which includes the RAM.

    Being able to control how much RAM is installed allows Apple to guarantee that all memory channels are filled and being utilized, maximizing performance.
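
    As a rough back-of-the-envelope illustration - assuming LPDDR5 at 6400 MT/s and a 128-bit combined bus width, figures used here for illustration rather than taken from this thread - the theoretical peak bandwidth works out to roughly the ~100 GB/s Apple quotes for the base M2:

        // Rough peak-bandwidth sketch for a unified-memory SoC.
        // Assumed figures for illustration only: LPDDR5 at 6400 MT/s, 128-bit total bus width.
        let transfersPerSecond = 6_400_000_000.0   // 6400 MT/s
        let busWidthBits = 128.0                   // total width across all populated channels
        let bytesPerTransfer = busWidthBits / 8.0
        let peakBytesPerSecond = transfersPerSecond * bytesPerTransfer
        print("Theoretical peak: \(peakBytesPerSecond / 1e9) GB/s")   // ~102.4 GB/s

    Leave a channel unpopulated and that peak drops proportionally, which is exactly the situation Apple avoids by fixing the memory configurations itself.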
  • Why Apple uses integrated memory in Apple Silicon -- and why it's both good and bad

    lam92103 said:
    So every single PC or computer manufacturer can use modular RAM, including servers, workstations, data centers, and supercomputers.

    But somehow Apple's chips cannot, and they're trying to convince us that it is not just plain & simple greed??

    Vertical system integration doesn't necessarily equal greed. It is just another way of doing things. As the article points out, there are many benefits to having a tightly integrated system, as well as trade-offs. It would be greedy if the systems were capable of supporting modular RAM and Apple was simply preventing it. They're not. Apple has stated that the design goal for these SoCs was mostly efficiency, and that the best way to achieve that was by integrating everything and not supporting off-SoC resources other than I/O.

    No one is losing or gaining money from Apple's systems having UMA. If you need more memory down the road, you can simply upgrade to a new system and sell the old one to recoup much of that cost. The vast majority of people never upgrade anything in their computers. When it gets old, they throw it out and get a new one. The PC market relies on this turnover.


    Food for thought: GPU cards have never had upgradable memory; you're stuck with what they came with. Why do you think that is?
  • Geekbench reveals M2 Ultra chip's massive performance leap in 2023 Mac Pro

    entropys said:
    If buying this kind of workstation, why on earth would I be interested in a comparison with a four year old machine? I would be comparing it with what else is currently in the market to perform similar tasks.

    Regardless, this machine is an expensive, crippled embarrassment, not able to do anything a Mac Studio can't do at much lower cost.

    It is not the Mac Pro anyone should be looking for.

    You're not Apple's target here. They're trying to get users of Intel Mac Pros to upgrade - hence the comparison to the previous Mac Pro. Anyone interested in spending this type of money on a system would do some research - I'd hope - and know there are Intel and AMD systems that are more powerful. But that's completely useless to someone whose workflow is based around the Mac and Apple.
  • Geekbench reveals M2 Ultra chip's massive performance leap in 2023 Mac Pro

    I would like to see a comparison where someone performs the same physics 3D simulation with a dataset larger than 192GB with a maxed out 2023 Mac Pro vs a maxed out 2019. Based on past experience I would expect the 2023 to crash once the memory usage gets close to the max.

    I'm sure there are many problems with much larger data sets, but those are the reason supercomputers exist - and even then, the data is spread across a network of nodes. You can't expect a single desktop system to chew through that much data. The main benefit of having such a huge amount of RAM in desktop systems in the past was that you could keep all your applications and data in memory without needing to load from and offload to a much slower hard disk. With SSDs, that latency has dropped dramatically and isn't the bottleneck it used to be.
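
    To put rough numbers on that gap - these are order-of-magnitude assumptions for illustration, not benchmarks from this discussion:

        // Ballpark access latencies; assumed, order-of-magnitude values only.
        let latencies: [(name: String, seconds: Double)] = [
            (name: "DRAM",      seconds: 100e-9),   // ~100 ns
            (name: "NVMe SSD",  seconds: 100e-6),   // ~100 µs
            (name: "Hard disk", seconds: 10e-3),    // ~10 ms
        ]
        let dram = latencies[0].seconds
        for entry in latencies {
            let ratio = (entry.seconds / dram).rounded()
            print("\(entry.name): ~\(Int(ratio))x DRAM latency")
        }

    An SSD is still orders of magnitude slower than DRAM, but it closes most of the gulf that spinning disks left, which is why stuffing a desktop with far more RAM than the working set matters less than it once did.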
  • Why Apple Vision Pro has a chance at being the future of work

    JP234 said:
    That's a way to look at it. But that's not the way Apple is currently marketing it. The WWDC promo was 95% focused on end-user entertainment. Of course, that will evolve when millions of VP purchasers find ways to use it that Apple never even imagined.

    Apple never even mentioned future possibilities in the healthcare, military, legal or financial sectors. They also never mentioned the integration possibilities with AI. That's where the real money is going to gush like a firehose.

    WWDC is a developers conference. This is a brand new platform, the promo was mainly directed at app developers and content creators to hopefully get them on board. I'd argue they spent more time driving the "Spatial Computing" paradigm than anything else.