thadec
About
- Username: thadec
- Joined
- Visits: 18
- Last Active
- Roles: member
- Points: 469
- Badges: 0
- Posts: 97
Reactions
Apple insists 8GB unified memory equals 16GB regular RAM
DrDumDum said:
thadec said:
As someone who does a lot of virtualization (Linux and Windows virtual machines in VMware, Parallels, etc.) I can say that 8 GB of RAM on the latest, fastest Apple CPU definitely is not analogous to 16 GB of RAM on an 8th generation Core i3 from 2017.

I read both posts... but didn't see anywhere that you actually owned a base M1/M2 machine to test your "theory". A lot of people here are still stuck in the "Intel" mindset. I get it, it's hard to shake. I do tech consulting for graphic designers, and over the last 15 years if I had a nickel for every time I said "there's no such thing as too much RAM" I'd be a billionaire. Happily, I don't have to anymore. I moved 15% of my older basic Intel clients to M1 Minis with 8/512 setups... dual screens plus fast SSD external working drives... all loving the speed. Those with higher demands went to 16 GB... many came from 32 GB setups. No issues. Watch for refurbs. 16/512 M1 Minis were $700. M2 should be the same.
Second, what "theory"? Virtualization is when you use hypervisor software to run a virtual computer on top of your hardware and OS. No matter what CPU, architecture, OS or manufacturer you are talking about, it is the same, because the requirements of the guest operating system don't change. You are still going to need 4 GB of RAM to run a Windows 11 VM. You are still going to need 2 GB of RAM to run a Linux VM with a desktop. Period. Otherwise, your virtual computer's OS will run out of resources just as it would if you installed it on physical hardware that can't handle the specs. If you don't believe me, go to ServeTheHome.com. They promote low end - meaning slow - hardware that supports a lot of RAM for virtualization servers all the time. A lot of people buy old - meaning slow - machines off Newegg and eBay to use as virtualization servers because they have a ton of RAM. But hey, don't take my word for it. Install Windows on VirtualBox or VMware Player and configure it to use 2 GB of RAM instead of 4 GB, because "RAM on a Mac is 'effectively' twice that on a PC." See if your Windows VM even boots up.
Even better. Get a photo editor, whether Krita, GIMP, Inkscape or whatever (they are all free). Go download one of those high resolution images from NASA: https://photojournal.jpl.nasa.gov/target/Earth
Try to edit one of those bad boys with 8 GB RAM. Don't worry! Apple's executives claim that it is just as good as doing so with 16 GB RAM, so it will work out fine!
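To put rough numbers on that, here is a back-of-envelope sketch. The image dimensions below are an illustrative assumption, not the size of any particular NASA image:

```python
# Back-of-envelope: RAM needed to hold one uncompressed layer of a big image.
# The dimensions here are illustrative; NASA photojournal images vary in size.

def layer_bytes(width, height, channels=4, bytes_per_channel=1):
    """In-memory size of one uncompressed layer (default: 8-bit RGBA)."""
    return width * height * channels * bytes_per_channel

w, h = 21600, 10800                          # a plausible whole-Earth map size
gib = layer_bytes(w, h) / 2**30
print(f"{gib:.2f} GiB for a single layer")   # ~0.87 GiB

# An editor also keeps undo history, scratch buffers and working copies,
# so even a couple of layers plus history can swallow several GiB of RAM.
```

On an 8 GB machine that has to share the same pool with the OS, the browser and the GPU, there is not much headroom left.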
Apple insists 8GB unified memory equals 16GB regular RAM
netrox said:
Also, an interesting note that no one seems to mention is that when Apple was marketing how their MacBook Air M2 is faster than Intel laptops... if you look at how they conducted their tests... they literally compared the baseline Macs with 8GB of RAM and a 256GB SSD against PCs with 16GB of RAM and a 512GB SSD!

"Testing was conducted by Apple in April and May 2023 using production 13-inch MacBook Air systems with Apple M2, 8-core CPU, 8-core GPU, 8GB of RAM, and 256GB SSD, preproduction 15-inch MacBook Air systems with Apple M2, 8-core CPU, 10-core GPU, 8GB of RAM, and 256GB SSD as well as Intel Core i7-based PC systems with Intel Iris Xe Graphics, 16GB of RAM, 512GB SSD, and the latest version of Windows 11 available at the time of testing." https://www.apple.com/macbook-air-13-and-15-m2/#footnote-6

netrox said:
People complain about how Apple skimps on RAM on iPhones when real life usage shows no effect at all, despite Androids needing TWICE the RAM just to function.
Also, it isn't merely about speed. See above: lots of "pro" programs need 8 GB - or more - of RAM just to run. Which is why a lot of the people defending this focus on web browsing, basic productivity apps like Microsoft Office, and low-to-moderate RAM tasks like front-end scripting (HTML and JavaScript), app development and video editing for YouTube. No one is talking about 3D animation, software development/engineering or high resolution photo editing.
Another thing: that 16 GB RAM Core i7 you mentioned doesn't cost $1600. It actually costs about $900. And not a bargain-basement plastic deal either, but a serious device that companies buy for their workers, like a Lenovo ThinkBook. No, $1600 gets you an Intel Core i9 HP Envy with an Nvidia RTX 4060 card. And if you don't mind a little plastic and a lesser known brand, it gets you an Asus Creator Q with 24 GB RAM, a 1 TB SSD and an Nvidia RTX 3050 card. Personally I'm not a fan of that RTX 3050, but you get my drift.
Finally, Intel will - to a degree - emulate Apple's unified memory with their "tiles" concept. They floated a trial balloon in a press release and pulled it real fast when they got some pushback, but it is coming. Even if it doesn't arrive in December with 14th gen, you can bet they are working OEMs hard to sell the idea for 15th gen. It won't be as fast as unified memory, but it will mean a real performance boost.
Apple insists 8GB unified memory equals 16GB regular RAM
MauMan said:
This sounds like a benchmarking opportunity. I'd love to see someone come up with some "Pro" workflows and run them on the various M1/M2/M3 CPU and memory configurations.
Most benchmarking doesn't run truly heavy duty applications. They just run canned tests of routine user-friendly stuff in rapid succession. For example, to test graphics capability they'll run HandBrake on a short video clip. You won't see them render a one-hour cartoon in Maya or Synfig Studio.
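If someone did want to benchmark real "pro" workflows, the timing harness is the easy part; the hard part is picking genuinely heavy workloads. A minimal sketch - the workload below is a made-up stand-in, not a real render job:

```python
import time

def time_workload(fn, repeats=3):
    """Run fn several times and return the best wall-clock time in seconds."""
    best = float("inf")
    for _ in range(repeats):
        start = time.perf_counter()
        fn()
        best = min(best, time.perf_counter() - start)
    return best

# Stand-in workload; swap in a real export or batch render to make this a
# meaningful benchmark. Run the same call on each machine configuration.
elapsed = time_workload(lambda: sum(i * i for i in range(1_000_000)))
print(f"best of 3: {elapsed:.3f} s")
```

Running the same heavy job on 8 GB and 16 GB configurations of the same chip is exactly the comparison the canned suites never make.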
Apple insists 8GB unified memory equals 16GB regular RAM
As someone who does a lot of virtualization (Linux and Windows virtual machines in VMware, Parallels, etc.) I can say that 8 GB of RAM on the latest, fastest Apple CPU definitely is not analogous to 16 GB of RAM on an 8th generation Core i3 from 2017. The M3 would run A LOT faster, but I would still only be able to run 1 Windows VM (requires 4 GB RAM) and 1 Linux VM (requires 2 GB RAM if I want to use the desktop). Meanwhile the 6 year old Core i3 chip would be as slow as molasses in comparison, but the 16 GB of RAM lets me run 2 Windows VMs and 3 Linux VMs, which lets me do load balancing, clustering, pentesting, routing, DHCP, web and email client/server work and a bunch of other stuff. Meaning that for an IT professional, a 2017 MacBook Air with 16 GB RAM would do just fine: I could install VirtualBox on it and get right to work. But for me the new M3 MacBook Pro would be completely worthless. I wouldn't even be able to use it to take online bootcamp type courses at Linux Academy, Udemy or what have you.

And that is just a simple example, really. The absolute truth: for lots of tasks and applications, RAM is as important as speed, or even more so. There are plenty of servers in data centers - cloud data centers even - that the M2 Ultra and the M3 Max would crush in single core performance and even beat in multicore performance. The ones that perform front-end tasks for e-commerce sites are an example. But those servers have a ton of RAM, so no, you can't pull one of them out of the rack and plug in a 16" MacBook Pro to replace it.
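The arithmetic above can be sketched in a few lines. The per-guest minimums are the figures from this post, plus an assumed 2 GB reserve for the host OS; none of these are vendor-official numbers:

```python
# VM packing arithmetic: how much host RAM a given mix of guests needs.
WIN_GB, LINUX_GB = 4, 2   # per-guest minimums cited in the post (illustrative)
HOST_RESERVE_GB = 2       # assumed headroom for the host OS itself

def host_ram_needed(windows_vms, linux_vms):
    """Total host RAM (GB) needed for the guest mix plus host headroom."""
    return windows_vms * WIN_GB + linux_vms * LINUX_GB + HOST_RESERVE_GB

print(host_ram_needed(1, 1))   # 8  -> an 8 GB machine is already saturated
print(host_ram_needed(2, 3))   # 16 -> the 2-Windows/3-Linux lab needs 16 GB
```

No amount of memory compression or fast swap changes these minimums, because the guest OS sees only the RAM it was allocated.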
Honestly, what Apple needs to do is bring back the MacBook. Remember the 12" MacBook from 2015? Or the original MacBook from 10 years earlier? Bring back the MacBook as an entry level $700 device. Limit the entry level chip - the M4 and whatever comes after - and its 8 GB of RAM to the MacBook, MacBook Air, iMac and Mac Mini. The MacBook Pro and iMac Pro should only get the M4 Pro, M4 Max and M4 Ultra, meaning 16 GB RAM at minimum. The Mac Studio would have the M4 Max and M4 Ultra. The Mac Pro would have the M4 Ultra and M4 Extreme.
It isn't that hard, people. Apple just has to want to do it.
New Apple Silicon has arrived with M3, M3 Pro, and M3 Max chips
I have seen in forums that the M3 still only supports 2 displays, meaning that people are going to pay $1600 for an M3 MacBook Pro that can only drive a single external monitor. Granted, the Lenovo IdeaPad Duet 3 Chromebook can also only support a single external monitor, but that is because it is a $280 device that runs on a 32-bit SoC designed in 2017. (Moreover, it is technically a tablet and not a laptop.) So can someone please explain this limitation with Apple Silicon's base chips? Whatever it is, you can bet that the Qualcomm chips in 2024, as well as the Nvidia and AMD Arm chips in 2025, aren't going to have it.