Gaming and AI are in Mac's future, even with low memory capacities


Comments

  • Reply 21 of 30
    auxio Posts: 2,730 member

    jdw said:
    Sorry, but all the defenses of Apple on this are total poppycock.  I'm a huge defender of Apple, and have been since 1984.  People who go around doing nothing but trashing Apple are trash themselves.  But let's face it, folks, RAM is one area Apple has ALWAYS failed the consumer on.  Apple Silicon and RAM efficiency talk is garbage.  What matters is PRACTICAL USABILITY.  Does somebody online talking about efficiency magically make 8GB good enough for YOU?  No.  No, it does not.
    I just got into the technical details, but hey, stick with the "magical" thinking because you can't comprehend the actual engineering details.

    The big problem is that every application you use is web, web, web these days. That adds a bunch of extra overhead (a JavaScript runtime) and doesn't allow for proper hardware optimization like the RAM/VRAM optimization I described above. The Google heads would love for you to be on the web using Chrome for everything, because that's where they make their money (tracking everything you do). Second-rate software development companies love it too, because it saves them development costs by not needing to hire people who truly understand how operating systems and hardware work (so that things can be properly optimized for different systems). And unfortunately it leads to all the bloat and overhead, which requires more memory and CPU power.

    I guarantee that if you can stick to native applications outside of a web browser, you'll find you'll rarely (if ever) hit the limits of the average computer unless you're doing very specialized things like software engineering, scientific data analysis, CAD, video editing, etc. And if you're doing those, you should be looking at the pro models which have higher base specifications.
    edited April 14
  • Reply 22 of 30
    elijahg Posts: 2,764 member
    auxio said:
    elijahg said:
    auxio said:
    Unified memory IS more efficient than DDR. There's no 1:1 comparison between the two. 
    DDR (double data rate) just describes how the RAM transfers data, which doesn't say anything about how much memory applications require.

    However, the point about unified memory being more efficient (apps require less overall RAM) is correct. Without it apps which need to, for example, display an image on the screen need to store a copy of that image in both CPU memory (RAM) and GPU memory (VRAM). With unified memory they only need one copy because both the CPU and GPU can access the same memory. The same holds true for machine learning and the NPU (neural processing unit).

    All that said, people without critical thinking skills (i.e. the majority of the population) simply follow the "bigger is better" logic. And so if Apple hopes to sell to such people, they'll have to bump the specs, even if those people will never need that extra memory.
    The second half is not true. The same image data does not need to be in RAM and VRAM.
    Please enlighten me as to how the GPU can load an image from disk directly into VRAM when it has no ability to access the hard drive (GPU is just specialized graphics compute power with no device I/O capabilities).

    Sure, once the image has been loaded from disk by the CPU into RAM and then copied to a texture/buffer in VRAM, it can be purged from RAM. But that doesn't negate the fact that, at image load time, you'll have 2 copies of the image in memory (and thus require double the memory) unless the GPU and CPU can share the same memory (i.e. unified memory).

    I could go on about how, if you use a lot of images, you'll need to cache some in RAM since you typically have more RAM than VRAM (or you need VRAM for other things like vertex buffers, frame buffers, etc) but I think the original example is a good, simplified explanation. It's a similar situation with AI/ML and the NPU.
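    The load-time arithmetic above can be sketched in a few lines (a hypothetical 4096x4096 RGBA texture; the numbers are illustrative, not measured on any real system):

    ```python
    def image_bytes(width, height, bytes_per_pixel=4):
        """Raw size of an uncompressed RGBA image in bytes."""
        return width * height * bytes_per_pixel

    # Hypothetical 4096x4096 RGBA texture
    size = image_bytes(4096, 4096)      # 67,108,864 bytes = 64 MiB

    # Discrete GPU: at load time the pixels exist in RAM *and* in VRAM
    discrete_peak = 2 * size            # 128 MiB peak

    # Unified memory: CPU and GPU share the single allocation
    unified_peak = size                 # 64 MiB peak

    print(size // 2**20, discrete_peak // 2**20, unified_peak // 2**20)
    ```

    The point is only about the peak at load time; as noted above, the RAM copy can be purged afterwards on a discrete GPU, but the double footprint still has to fit while the upload happens.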

    The GPU is on the PCIe bus. What else is on the PCIe bus? Ah yes, the SSD. https://docs.nvidia.com/gpudirect-storage/overview-guide/index.html

    No, you don't need to load the entire image into RAM before transferring it to the GPU. It can be loaded n bytes at a time.
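    A minimal sketch of that staged, chunk-at-a-time idea, using in-memory stand-ins for the disk file and the VRAM buffer (the function and variable names here are illustrative; note that true GPUDirect Storage transfers bypass the CPU path entirely, which this sketch does not model):

    ```python
    import io

    def stream_to_gpu(src, upload, chunk_size=4 * 2**20):
        """Feed a file to the GPU through a fixed-size staging buffer.

        Peak CPU-side memory is roughly chunk_size, regardless of the
        total file size -- the "n bytes at a time" point.
        """
        total = 0
        while True:
            chunk = src.read(chunk_size)
            if not chunk:
                break
            upload(chunk)               # stand-in for a VRAM upload call
            total += len(chunk)
        return total

    # Usage with in-memory stand-ins for the file and the GPU buffer
    src = io.BytesIO(b"\x00" * (10 * 2**20))   # pretend 10 MiB image file
    vram = bytearray()
    copied = stream_to_gpu(src, vram.extend, chunk_size=2**20)
    ```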

    I just got into the technical details, but hey, stick with the "magical" thinking because you can't comprehend the actual engineering details.

    Maybe you should stick with magical thinking instead, since apparently you cannot do basic research.

    I guarantee that if you can stick to native applications outside of a web browser, you'll find you'll rarely (if ever) hit the limits of the average computer unless you're doing very specialized things like software engineering, scientific data analysis, CAD, video editing, etc. And if you're doing those, you should be looking at the pro models which have higher base specifications.

    You can "guarantee" all you want, which is completely worthless because there are plenty of situations where 8GB is not enough in non-specialised situations. What about games? Those are none too specialised and use all the resources available. And why dismiss the web browser? It's the most commonly used program, and it regularly eats all of the 8GB available. Someone producing a large document in Pages or Word, with Safari open for research? Eats all of the 8GB. I have Maps running and it is using 1GB alone. The WindowServer (this is with a dedicated GPU, so no RAM sharing) is using 1.6GB. I have 24GB and the Mac is using 1.8GB of swap. Imagine that with 8GB. Just the message above regarding direct storage access proves you know less about this stuff than you like to think, sorry.
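    Tallying the figures reported above against an 8GB machine (numbers taken directly from the post; GB used loosely, and nothing here accounts for macOS itself or the open apps):

    ```python
    # Memory figures as reported in the post above
    usage_gb = {
        "WindowServer": 1.6,
        "Maps": 1.0,
        "swap in use (on a 24GB machine)": 1.8,
    }

    reported_total = sum(usage_gb.values())   # before Safari, Pages/Word,
                                              # or the OS itself
    headroom_8gb = 8 - reported_total

    print(round(reported_total, 1), round(headroom_8gb, 1))
    ```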

    edited April 14
  • Reply 23 of 30
    jdw Posts: 1,349 member
    auxio said:

    jdw said:
    Sorry, but all the defenses of Apple on this are total poppycock.  I'm a huge defender of Apple, and have been since 1984.  People who go around doing nothing but trashing Apple are trash themselves.  But let's face it, folks, RAM is one area Apple has ALWAYS failed the consumer on.  Apple Silicon and RAM efficiency talk is garbage.  What matters is PRACTICAL USABILITY.  Does somebody online talking about efficiency magically make 8GB good enough for YOU?  No.  No, it does not.
    ...hey, stick with the "magical" thinking because you can't comprehend the actual engineering details.

    The big problem is that every application you use is web, web, web these days. ...it leads to all the bloat and overhead which requires more memory and CPU power.

    I guarantee that if you can stick to native applications outside of a web browser, you'll find you'll rarely (if ever) hit the limits of the average computer...
    You, sir, are the very type of person my previous post addresses.  You (1) trashed me by saying I cannot comprehend engineering details, (2) you somewhat admit the problem is not me but rather software requiring something beyond my control which is amusing because you just trashed me, and (3) you then give me a totally worthless guarantee.

    Folks, as I said in my previous post, people like "auxio" are spewing poppycock.  My understanding of engineering details, or lack thereof, does NOT magically make my daughter stop calling me for support of her 8GB RAM M1 MBP, as if after reading the "magical" post by auxio she now has a usable machine.  No, it's not a matter of "she's holding it wrong!"  It's only after talking to me and hearing my advice about running only a single app at a time, like we did in 1984, that she has a usable machine.  And no, she's not running Chrome or "web, web, web" either.  She DOES indeed hit the limits of the average baseline-RAM computer from Apple, because that RAM is only 8GB.  ENGINEERING DETAILS matter nothing.  What matters is PRACTICAL USABILITY FOR THE USER!  If you are satisfied with 8GB of RAM, great for YOU!  But your great experience doesn't magically translate into a great experience for others.  And to suggest we "hold it like you're holding it" (a throwback to the iPhone 4 days) is ridiculous.

    We should all be able to buy the baseline RAM and STORAGE and get practical usability out of the machine for at least a year without running up against out-of-memory errors, serious slowdowns, or the inability to run apps.  And to say the consumer is the one in the wrong because they didn't pay extra (above their budget) for overpriced extra RAM and overpriced extra STORAGE is, as I said in my earlier post, nothing more than a Cupertino-worshipper talking, not your friend.

    A Cupertino-worshipper is a largely mindless person who loves Apple and who preaches whatever Cupertino preaches at any given time.  To the worshipper, it's Gospel.  So if Apple is preaching butterfly keyboards, a Cupertino-worshipper defends that choice and bashes anyone who doesn't love butterfly key switches, playing down all problems associated with the tech.  When Apple removed the SD card slot, those same worshippers ran in droves to this very forum to defend that stupid choice.  Then when Apple changed out the keyboard and restored the slot, those worshippers fell silent on those issues, but they have now come roaring back defending 8GB of baseline RAM.  

    Cupertino-worshippers are crazy, and sadly, they are perpetually found in forums like this one.  So just be on your guard when you come across them.  They are in greater numbers than you think.   And even though they are "crazy ones," they do not change the world, as the Apple slogan goes.  If anything, they fight against change, and that's the very problem that keeps 8GB as the baseline RAM in current Macs, even now in 2024.


  • Reply 24 of 30
    dewme Posts: 5,391 member
    danvm said:
    elijahg said:

    loopless said:
    It is NOT BS.  Unified memory is a huge advantage.

    I have a 16GB 14" M1 MacBook Pro and a Dell 32GB Windows 11 Core i7 laptop, both with SSDs.  I use them for software development.
    The Windows 11 machine bumps up against its memory limits (at which point performance tanks) earlier than the MacBook does when doing a similar set of tasks. For example, using Qt Creator and Visual Studio Code, then building large code bases with lots of other apps open at the same time. 
    And let's not talk about the various "blue screens" that still seem to plague Windows.
    I looked at upgrading the Dell's memory, but it has CAMM memory that costs $1000 to upgrade, so don't be complaining about Apple's prices!
    Windows is hideously inefficient with RAM. That doesn't excuse Apple from still only supplying 8GB as standard though. If you need a VM, for example, that will eat all of the 8GB straight up. 
    I know that Windows and macOS work differently, but I've never seen a test showing that Windows is "hideously inefficient with RAM". At least among my customers working with heavy loads, there were no issues at all with memory management in Windows.  But maybe you had a different experience.  
    I'm in agreement regarding the efficiency of Windows memory management. I've worked on several industrial applications using Windows as clients and servers that had to stay up and running for extremely long periods of time. The main issues I've encountered with Windows memory usage have been (mostly multithreaded) programming errors in apps, user-mode services, and kernel services. Windows memory management can't save you from things like memory leaks or stupid exception handling, but Windows will try to keep everything running until all physical memory is exhausted and too much reliance on virtual memory brings everything to a screeching halt. 

    My main concerns around memory capacity are more along the lines of the number of threads and their working sets that have to be managed concurrently. Even a machine with a fairly small amount of physical memory can run one or two user-mode applications fine if their working sets and number of concurrent threads are not outrageous. Every thread gets a fairly large dedicated block of virtual memory address space, so having a lot of threads chews up a lot of virtual address space that has to be moved in and out of physical memory with every thread context switch. All of these kernel-level features have been part of Windows for decades. Having more execution cores helps with thread concurrency, but it's still all tied to a single source of physical memory. Multiple cores place more demands on memory management, not only by having more consumers of memory but by the necessity of keeping the shared caches coherent. 
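    The per-thread address-space cost can be sketched with rough arithmetic (assuming the common Windows default of about 1 MiB of *reserved*, not committed, stack per thread; the exact figure is set at link time and varies by application):

    ```python
    # Assumed default: ~1 MiB of address space reserved per thread stack
    DEFAULT_STACK_RESERVE_MIB = 1

    def stack_address_space_mib(threads, reserve_mib=DEFAULT_STACK_RESERVE_MIB):
        """Address space reserved for thread stacks alone."""
        return threads * reserve_mib

    def max_threads_by_stack(address_space_mib, reserve_mib=DEFAULT_STACK_RESERVE_MIB):
        """Upper bound on thread count imposed purely by stack reservation."""
        return address_space_mib // reserve_mib

    # A 500-thread service reserves ~500 MiB of address space for stacks;
    # a classic 32-bit process (2 GiB of user space) tops out around
    # 2000 threads from stack reservation alone.
    print(stack_address_space_mib(500), max_threads_by_stack(2048))
    ```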

    The biggest change I've seen over the last few decades is that everything has gotten so fast and large that application programmers (as opposed to system programmers) don't have to think as much about memory management at the programming level. Many of the newer programming languages like C# and Swift provide a very abstracted perspective of memory management, to the point where there's little motivation for programmers to be concerned about memory usage, at least outside of embedded and kernel-level code. These abstractions have reduced much of the need for programmers to worry about memory housekeeping and hygiene and do provide better programmer productivity, but at the expense of efficiency, some performance, and memory bloat. The compilers do a pretty good job of making things better, but the basic constructs that were used to prevent bloat, like passing by reference rather than copying big data structures, are no longer a concern of programmers. These beefy processors with massive amounts of memory can handle the new way of doing things. Additionally, some of the things that were done for efficiency also required a higher level of programmer memory awareness, a lack of which leads to catastrophic crashes.
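    The reference-versus-copy point can be demonstrated in a few lines of Python (standing in for the abstracted languages mentioned; `tracemalloc` measures only Python-level allocations, so the figures are illustrative):

    ```python
    import copy
    import tracemalloc

    # A few MB of nested lists standing in for a "big data structure"
    data = [list(range(200)) for _ in range(2000)]

    tracemalloc.start()

    alias = data                    # pass by reference: no new allocation
    ref_used = tracemalloc.get_traced_memory()[0]

    clone = copy.deepcopy(data)     # pass by value: duplicates everything
    copy_used = tracemalloc.get_traced_memory()[0]

    tracemalloc.stop()

    # The deep copy allocates megabytes; the alias allocates almost nothing.
    print(ref_used < copy_used)
    ```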
  • Reply 25 of 30
    blastdoor Posts: 3,320 member
    I agree that Apple needs more RAM in the base configuration for AI and games. Otherwise, I agree with Apple that 8GB is enough for many light workloads.

    Perhaps with the M4 we will finally see the base configuration go up to 16. 
  • Reply 26 of 30
    Meson Posts: 7 member
    jdw said:
    People who go around doing nothing but trashing Apple are trash themselves.  
    So basically anyone who doesn't share your opinion is trash. Got it!
  • Reply 27 of 30
    danvm Posts: 1,413 member
    dewme said:
    danvm said:
    elijahg said:

    loopless said:
    It is NOT BS.  Unified memory is a huge advantage.

    I have a 16GB 14" M1 MacBook Pro and a Dell 32GB Windows 11 Core i7 laptop, both with SSDs.  I use them for software development.
    The Windows 11 machine bumps up against its memory limits (at which point performance tanks) earlier than the MacBook does when doing a similar set of tasks. For example, using Qt Creator and Visual Studio Code, then building large code bases with lots of other apps open at the same time. 
    And let's not talk about the various "blue screens" that still seem to plague Windows.
    I looked at upgrading the Dell's memory, but it has CAMM memory that costs $1000 to upgrade, so don't be complaining about Apple's prices!
    Windows is hideously inefficient with RAM. That doesn't excuse Apple from still only supplying 8GB as standard though. If you need a VM, for example, that will eat all of the 8GB straight up. 
    I know that Windows and macOS work differently, but I've never seen a test showing that Windows is "hideously inefficient with RAM". At least among my customers working with heavy loads, there were no issues at all with memory management in Windows.  But maybe you had a different experience.  
    I'm in agreement regarding the efficiency of Windows memory management. I've worked on several industrial applications using Windows as clients and servers that had to stay up and running for extremely long periods of time. The main issues I've encountered with Windows memory usage have been (mostly multithreaded) programming errors in apps, user-mode services, and kernel services. Windows memory management can't save you from things like memory leaks or stupid exception handling, but Windows will try to keep everything running until all physical memory is exhausted and too much reliance on virtual memory brings everything to a screeching halt. 
    Your comment is in line with what MS describes in this link:
    Preventing Memory Leaks in Windows Applications - Win32 apps | Microsoft Learn

    I couldn't find an article on how macOS manages memory leaks.  What I did find was many cases of memory leaks in macOS, resulting in performance issues and memory errors. It looks like macOS has issues managing memory leaks too. From what I have seen, both environments manage memory differently.  At the same time, I cannot find an article that points out that Windows is "hideously inefficient with RAM".
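    For illustration, here is the kind of application-level leak that no OS memory manager, Windows or macOS, can fix, because it looks exactly like legitimate memory usage (the class names are hypothetical):

    ```python
    class LeakyCache:
        """Classic leak pattern: entries are inserted but never evicted,
        so the process grows for its whole lifetime, on any OS. The OS
        memory manager cannot tell this apart from legitimate usage."""
        def __init__(self):
            self._store = {}

        def get(self, key, compute):
            if key not in self._store:
                self._store[key] = compute(key)   # never evicted
            return self._store[key]

    class BoundedCache(LeakyCache):
        """Same cache with a crude size cap: evict the oldest entry
        before inserting a new one, keeping memory usage flat."""
        def __init__(self, max_entries=1000):
            super().__init__()
            self._max = max_entries

        def get(self, key, compute):
            if key not in self._store and len(self._store) >= self._max:
                # dicts keep insertion order, so this drops the oldest entry
                self._store.pop(next(iter(self._store)))
            return super().get(key, compute)
    ```

    The fix has to come from the application (the bounded variant here, or weak references, or explicit lifetime management); the OS can only keep paging until physical memory runs out.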
  • Reply 28 of 30
    StrangeDays Posts: 12,891 member
    Of course we all know this is BS. The real reason is that Tim Cook wants us to climb the spec ladder. A memory upgrade is probably a better choice versus upgrading cores or even moving from a Pro to Max.
    Sweet take. Where are you engineering hardware, my man?
  • Reply 29 of 30
    mattinoz Posts: 2,331 member
    blastdoor said:
    I agree that Apple needs more RAM in the base configuration for AI and games. Otherwise, I agree with Apple that 8GB is enough for many light workloads.

    Perhaps with the M4 we will finally see the base configuration go up to 16. 
    Then what is the problem with the current set-up?  

    Machines geared to light loads (the base Mx) start at 8GB but have 16GB models in stock for those with memory-constrained workloads, and the Mx Pro and up machines are 16GB+ as far as I can see. Sure, should they be charging no more than $100 per 8GB?

    That is a different question but one I'd agree with given how international pricing flows out from those US price points. 


  • Reply 30 of 30
    CheeseFreeze Posts: 1,254 member
    Johar said:
    Tim Cook is a bean counter who seems to really dislike AAA games, while favoring Apple Arcade fluff. A whole generation of gamer kids has been lost for Apple due to his deliberate choice to snub AAA game developers.
    This right here! I've seen both my kids grow up and move from iPads to Windows PCs and the PS5, because they can't play games on a Mac. 
    Apple Arcade and the Mac App Store for desktop are primarily filled with ported mobile games, and it's just not what they want. 
    If Apple would focus on acquiring a few good gaming studios and making exclusive AAA content (they're already making exclusive Apple TV content, so why not?), reposition their services accordingly, and make these easy hardware changes, they would reel in kids as their brand ambassadors. 