Apple Silicon M1 Mac detection of Thunderbolt 3 eGPU gives hope for future support


Comments

  • Reply 41 of 50
melgross Posts: 33,600 member
    Supporting eGPUs here is not just a matter of writing a graphics driver. The SoC is so integrated that it’s inherently designed to work together. If you add an eGPU you are essentially ‘breaking’ this concept. 

I completely agree. Third-party graphics cards make less sense the more I think about it. The Intel chip isn't the only barn burner in the thermal envelope, and a third-party card would make the unified memory somewhat ineffective. The only model I could see retaining third-party card support is the Mac Pro, since it has PCIe slots. If Apple were to support an external GPU, it would make more sense for them to build it themselves, especially if they're going to replace third-party cards in the laptops and the iMac. Plus, they could provide rock-solid driver support.

I still suspect there will only be three chips: the M1, M1x and M1xe. It's the fastest and simplest route to completion in the next 18 months. If you do the math, a 16-core M1x should match or beat a 16-core Xeon in the Mac Pro, leaving only the 24- and 28-core models unmatched while waiting on the M1xe. Using the exact same chip in the laptops and desktops makes a lot of sense, since they can simply provide more wattage and cooling for greater sustained performance. Then, when they get around to the M2, they'll be able to provide more tiers, like a $599 Mac Mini and an $899 MBA. They've had those price points before, but they are always designed to up-sell.

M1 = Low-end laptops, low-end desktops, and the Mac Mini (4+4 cores with 8GB to 16GB)
M1x = High-end laptops and high-end desktops (8+8 cores with 16GB to 128GB memory)
M1xe = Mac Pro (???)
Don’t forget that Apple has plenty of experience with dual-socket computers. They could easily go back to a dual-socket design for a Mac Pro, and they wouldn’t need a true 16-core chip. Also, people keep mistakenly stating, as Apple themselves do, that the M1 is an 8-core chip, when it’s really a 4-high-performance-core design, with the other 4 cores together being the equivalent of just one more core. Thought of that way, it’s really a 5-core chip. That also makes its multi-core performance more impressive when it competes against x86 8-core chips, where all the cores are the same, than if it were a true 8-core chip.

If Apple has an M1 with 6 high-performance cores, equal to about 7 cores, and uses two of them, that would give a 14-core-equivalent product that performs closer to an 18-20-core x86 product. That’s assuming this generation’s chip performance; the next generation will be more performant.
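To put that math in code, here is a rough sketch; the ratio of four efficiency cores to roughly one performance core is the assumption above, not a benchmark:

```swift
// Back-of-the-envelope core math from the post above. The ratio (four
// efficiency cores together ≈ one performance core) is an assumption.
func coreEquivalent(performance: Int, efficiency: Int) -> Double {
    Double(performance) + Double(efficiency) / 4.0
}

let m1       = coreEquivalent(performance: 4, efficiency: 4)  // 5.0 "full" cores
let sixPCore = coreEquivalent(performance: 6, efficiency: 4)  // 7.0, the "about 7 cores" above
let dualChip = 2 * sixPCore                                   // 14.0-core equivalent
print(m1, sixPCore, dualChip)
```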
  • Reply 42 of 50
melgross Posts: 33,600 member


    melgross said:
    wizard69 said:
    melgross said:
    blastdoor said:
    The way Apple has been talking about the benefits of an integrated GPU and UMA makes me really question what eGPU support will look like if it happens at all. The points Apple makes about the value of integration and UMA strike me as much more than just marketing -- they are thoughtful, substantive points. It really is a big deal to have as a fundamental assumption of the entire hardware and software stack that there is one unified pool of memory that all execution units can access equally. 

If Apple offers support for eGPUs (or PCIe GPUs in a Mac Pro), I wonder if it will be only for compute purposes, not for display. And I further wonder if it will be a specialized, separate kind of thing, sort of like how Afterburner is in the current Mac Pro. In other words, it would be expensive, specialized hardware for a narrow purpose. It would not be a GPU that is in any way on equal footing with the integrated GPU. 
Apple has surprised us with this, and I think that if they wanted to support external GPUs, they could do it differently. They could somehow extend the fabric support that the built-in GPUs use now to support an external GPU. They might have to buffer them, which would add some minor latency, but it could work. On the other hand, they could come out with their own GPU module. We see how much performance that tiny bit of silicon has. What if they made a small plug-in module with another 16 cores or so, and had that plug into an extended substrate to just add to the cores already there? It could be small, contain a small heat sink of its own, and use just another 20 watts or so.

Actually, they could do this with RAM easily, as it’s on the substrate now in two separate modules. Extend the substrate outside of the package and pop two more modules in. They’re very small.

It is interesting that you brought up fabric support, because that immediately had me thinking of AMD's fabric implementation. In AMD's case, what I'd love to see from them is a fabric interface to the CDNA architecture. Now, could Apple do something similar with an off-die GPU? They certainly could, and they also have the option of adopting AMD's fabric interface. This would be fantastic in a Mac Pro-like machine, especially if a fabric-interfaced CDNA chip were supplied with every machine. You would see some delays going off chip, but overall it would turn the Mac Pro into a supercomputer of sorts. By the way, yes, I know this has little to do with the GPU side of things; however, it would be in Apple's best interest to go this route. Right now the Mac Pro is embarrassingly slow when put up against an AMD Threadripper system. While Apple has fast GPUs, they currently can't really compete against the CDNA accelerators.
Apparently Apple invented their own fabric implementation. We don’t know all that much about it yet, though AnandTech was able to glean some info. Apple of course has an accelerator for ProRes, and they are working with RED to enable at least one of the two RED RAW formats. It’s completely programmable and reconfigurable, so it could be turned into most anything with software.

When I think about the size of the GPU section of the M1, and see how small it is, I can imagine it being replicated outside the chip in multiples. It just needs to be connected to the fabric the same way as the GPU on the die. I can’t say it’s definitely possible, but looking at it, it seems possible. And as I mentioned earlier, RAM is far easier, as it’s already on the substrate and connected to the fabric. There’s really no difference, other than perhaps a small latency hit, but it would be worth it for the gain in performance. The latency between the CPU, memory and GPU in what I’m going to think of as “classic” designs is far worse.
After seeing some of the real-world reviews of the M1 performing tasks like playing back 4K/8K RAW video with filters in real time, Intel and AMD may be forced to rethink the whole PC component model. It's hard to see how they can compete when a $60 M1 can challenge an 8-core i9 with a Radeon RX 560 or better. Price-wise, I don't think you can get a PC laptop like that for the price of an M1 MBP, and even if you did, the battery life would be horrible unless the laptop were an inch thick to fit a bigger battery, with noisy fans to boot.

I wouldn't be surprised if Intel and AMD copy Apple's design. BYO people are not going to like that, but it's hard to see any other outcome at this point.
I agree. But let’s face it: these companies would be starting from scratch. AMD had an ARM server chip line for about two minutes, so they have some IP there, but Intel, while it kept some patents when it sold off its ARM chip division years ago, has no recent experience.

But of course, it’s worse than that, because the ARM cores are just part of it, and maybe not the most important part. How long would it take AMD and Intel to come out with chip architectures that are, for them, totally different? Years? Where will Apple be by then? 
  • Reply 43 of 50
Detnator
cloudguy said:
0. Mx Macs may indeed run Windows natively ... but it will be ARM Windows, which even Microsoft acknowledges is largely useless except for (some) first-party and web applications. It was why the Surface Duo uses Android instead of Windows. And unless I am wrong, Apple has closed the door on emulating Windows themselves. So a solution for x86 and x86-64 Windows applications would need to be via 32- and 64-bit virtualization.

    1. On the eGPU front it is not so much the hardware or even protocols. Instead, the APIs are different. When on Intel this wasn't an issue ... the APIs are the same. But with Apple Silicon it is Metal or nothing. While that is GREAT for iOS and Apple Arcade and other stuff developed for Apple using Apple provided tools, other stuff isn't going to be compatible unless third parties do it, and even if that happens Apple won't support it.

    Remember my previous rants. Apple never wanted to join the Intel platform and support their hardware and software stuff in the first place. They were PROUD of not doing so in the 80s, 90s and early 00s. They only went to Intel, put iTunes on Windows and things like that because it was necessary to survive. 

    Now Apple doesn't need that stuff anymore. Macs are going to sell regardless ... they have a very dedicated customer base especially among creatives and you can't even develop iOS, watchOS etc. apps any other way. So it is going to go back to how it was before. Apple doesn't want you running Windows. They don't want you using GPUs designed for Windows. They want you to rely on their hardware and software and have the opinion that between their first party stuff, the iPad apps coming over plus web and cloud tools, that is more than enough for anyone who wants to make it work. If you cannot or do not want to, it isn't as if Wintel is going anywhere.

    Convergence WITH their mobile hardware and apps. Divergence FROM the Windows stuff that they have always hated. And now that Microsoft is doing more stuff with Android and Samsung on Windows 10 and collaborating with Google in even more areas, even more so.

It is going to go back to when Apple people and Windows people were different populations who use computers to do different things, which is the way it has been for most of these companies' histories. The key difference is that thanks to Apple's prowess in mobile, there are far more Apple people than there were in 1983, 1993, 2003 or 2013.

Sigh… The amount of misinformation you're spreading here is concerning…

    "And unless I am wrong Apple has closed the door on emulating Windows".  Indeed, wrong.

Firstly, it's the wrong darn phrase!  There's no such thing as "emulating Windows". You don't emulate an OS; you virtualize or emulate a hardware machine (a computer) in software. The emulated or virtualized machine is the guest; the machine all of that runs on is the host. In simple terms, if the guest and host share the same hardware architecture, it's virtualization; if they're different architectures, it's emulation.
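If it helps, that guest/host rule boils down to a one-liner. A toy sketch, nothing more:

```swift
// The guest/host rule from the paragraph above, as a toy classifier:
// same instruction set -> virtualization; different -> emulation.
enum ISA { case x86_64, arm64, ppc }

func strategy(host: ISA, guest: ISA) -> String {
    host == guest ? "virtualization" : "emulation"
}

print(strategy(host: .x86_64, guest: .x86_64))  // "virtualization": Parallels/Fusion on Intel Macs
print(strategy(host: .ppc,    guest: .x86_64))  // "emulation": Virtual PC on PowerPC Macs
print(strategy(host: .arm64,  guest: .x86_64))  // "emulation": x86 Windows on an M1 Mac
```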

Since the Intel Macs, we've had Parallels Desktop, VMware Fusion and others. These allow for virtualizing a full computer in software: they virtualize the host (x86) architecture to let you run one or more virtual/guest (x86) machines running OSes and apps (macOS, Windows, and others) compiled for that Intel architecture.

In the PowerPC (PPC) days of the Mac, there were apps like Virtual PC that allowed x86 virtual (guest) machines running x86-compiled Windows to run on Motorola PPC host hardware (the PPC Macs of the day). The experience was more or less the same as with Parallels and Fusion, except that the software had to translate the x86 chip-level instructions running in the virtual machine into PPC instructions the host could understand and execute. (That created a huge performance hit, and it was impossibly slow for anything but the most basic tasks, though that detail isn't particularly relevant for this discussion.)

Now… Rosetta 2 doesn't do any of that. Rosetta 2 is not an emulator. The following is a tad oversimplified, but on a basic level it's a translator between x86-compiled macOS applications and the native macOS running on the ASi architecture. There are few if any calls to hardware through Rosetta 2; it's not emulating (nor virtualizing) anything. Part of the optimization is that a lot of the translation is done at INSTALL time, not runtime. Most of the optimization comes from targeting the small range of things it does (primarily translating apps to the OS, with careful choices about which parts to translate and when) and doing them extremely efficiently, compared with an emulator recreating an entire computer in software for a different architecture.
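To illustrate the install-time vs. runtime point, here is a contrived toy cost model; it is emphatically not how Rosetta 2 actually works internally:

```swift
// Toy cost model: an emulator pays a translation cost on every run,
// while an install-time translator pays it once and then runs natively.
let program = ["mov", "add", "cmp", "jne"]       // stand-ins for x86 instructions
let translateCost = 10, executeCost = 1          // arbitrary illustrative units

// Emulator: translate and execute every instruction, on every run.
func emulatedRun(_ p: [String]) -> Int { p.count * (translateCost + executeCost) }

// Rosetta-2 style: translate once, up front...
let translated = program.map { "arm64_" + $0 }   // one-time cost at install
// ...then every subsequent run pays only for execution.
func nativeRun(_ p: [String]) -> Int { p.count * executeCost }

print(emulatedRun(program), nativeRun(translated))  // 44 vs. 4 per run
```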

It's impossible for Rosetta 2 to "emulate Windows" (more accurately, to run Windows and Windows apps through x86 emulation) because Windows isn't a macOS app that runs under macOS. It's (obviously) an entirely separate operating system that requires an entire machine (virtual or physical) to run on, and Windows apps then need to talk to that.

So, with the terminology correction out of the way, here are some further corrections to your rant:

• Apple doesn't hate Windows enough to ever try to stop their users from using it. They didn't block the virtualization (on Intel Macs) or emulation (on PPC Macs) solutions I mentioned above. They went out of their way to build Boot Camp, an APPLE solution for booting APPLE computers into Windows. If anything, they're indifferent about Windows, although even that's a stretch, because Boot Camp proves they proactively support Windows on their Intel Macs. It's not like someone else came up with that and Apple shrugged their shoulders and said, "Oh well, we'll let it through and not block it."

• Apple hasn't closed the door on Windows in any form on the ASi Macs. They've simply said Rosetta 2 won't support it (for the reasons I've stated above: Rosetta 2 isn't an emulator), and they're not proactively pursuing it themselves, instead leaving it up to third parties. Apple has gone on record saying that "it's up to Microsoft". Parallels and VMware have both gone on record saying they're working closely with Apple to bring their solutions to ASi. They haven't specifically committed to Windows, though, and I'm guessing that's because they're not sure whether they can pull it off, for two reasons. The first is that to pull off x86 Windows they'll have to figure out emulating entire x86 computers in software, which is an altogether more complex task than the virtualizing their solutions currently do.

• And the second reason is this whole thing about Windows on ARM. Those who think that ASi is just an ARM chip in a Mac instead of an Intel one just don't get it. ASi isn't just another ARM chip. It's an entirely new, Apple-designed-and-built CUSTOM architecture (not just a custom CPU) where a small subset of the CPU part happens to implement the ARM instruction set. For Windows to run natively on ASi, either Microsoft will have to do a lot more than just tweak their ARM Windows for it, and/or Parallels, VMware, etc. will most likely have to produce some combination of virtualization and emulation between ASi and "standard" ARM chips like what's in the Surface, and that's if MS ever gets Windows working properly on standard ARM chips anyway.

• (Frankly, I don't think Windows on ARM will happen for a long time, if ever. In my opinion, Windows on ASi is almost certainly only going to happen if someone successfully emulates x86 hardware on ASi, at which point anything Intel Macs run that can't run natively on ASi Macs (including Windows, as well as older versions of macOS and their apps) will run in VMs on ASi. If nothing else, someone really needs to figure out x86 emulation on ASi for the sake of developers and others wanting to test on older macOS versions, not just Windows. I can run Mountain Lion on my Intel 16" MBP if I want, in a virtual machine. I won't be able to do that on an ASi Mac without x86 emulation.)

• eGPUs. Your paragraph on that is mostly reasonable except the end: "Apple won't support it". FFS, you don't know that. Again, such aggressive negativity towards Apple… and why? It's entirely because you just don't get Apple (more on that below). eGPU compatibility (and internal discrete GPUs too) may or may not come. If they're needed, Apple very likely won't proactively block them (corporate issues between NVIDIA and Apple aside). But I'm willing to bet they won't come, and it'll be because they won't be needed. Discrete GPUs (internal or external) will more than likely be a hindrance to ASi Macs. The notion that a GPU must be discrete to be performant is a fallacy. Apple has put an integrated GPU in their lowest-end Macs (and even in their iPads) that keeps up with the discrete GPUs in all but their highest-end pro Macs. And this is only their first iteration. So what's coming? It's not like they can't build their own extremely performant graphics silicon; just look at the Afterburner card. They've got the chops to do this. I'll bet dollars to donuts they already have integrated GPUs in M3 or M4 (or whatever they're calling them) chips in their super-secret labs that will go into future Mac Pros and eat all but the highest end of today's and tomorrow's desktop graphics cards for lunch.

I mentioned you "just don't get Apple (more on that below)". Here's what I'm referring to: these wild negative rants about Apple's decisions completely misunderstand the point of Apple. Apple is not interested in making "PCs". No, that's not about abandoning the Mac: Apple has (almost) NEVER been interested in making "PCs". Apple has never been interested in trying to make a better alternative to Windows PCs for PC users to buy (except in the '90s and early 2000s, when they nearly went broke and then tried to get back into the game). Steve specifically said that for Apple to win, Microsoft doesn't have to lose. They weren't and aren't trying to make better PCs than Windows machines, and they're not trying to make the best smartphones to pull people away from Android. You actually do get this on some level, though mostly only from the Windows and Android users' perspectives; you wrote about it in your post the other day about why ASi won't create a mass switch from Windows and Android to Apple. And you're right. PC people don't want Macs, because Macs are not PCs. By deliberate design, they don't come with all the flexibility, hackability, enterprise-friendly functionality, and whatever else Windows users have and want. And that's OK. Apple doesn't make PCs.

So what do they make? Apple makes appliances and devices specifically targeted at the things their target market wants to do: a limited number of things compared with what PCs and Android phones can do. They prefer to deliver a few things exceptionally well rather than everything "good enough".** That's always been their gig. That's the case with the Macs, their iDevices, and almost everything else they make. It's why the HomePod isn't just another Bluetooth smart speaker. It's why PC users won't flock to the Mac now just because it's faster. And it's why (most) Mac users don't care about eGPUs, core counts, GHz, etc.

(** It's impossible to do both, i.e. everything exceptionally well. You can either do a few things exceptionally well, or everything "good enough". MS/Intel chose the "everything good enough" route; Apple chose the "few things exceptionally well" route.)

The fact that these devices sometimes resemble PCs and other companies' smartphones has a lot more to do with everyone usually (not always, so don't throw exceptions at me) copying Apple than with Apple trying to compete in those markets. Apple created most of these markets. If you want to argue with that, then tell me what else on the market looked like or served the function of the iPhone before the iPhone, or the iPad before the iPad. What else on the market looked like or did what the original MacBook Air did before Apple made it? Same with each iMac generation. And even at the very start: what else on the market looked or worked anything like the Apple II before the Apple II?

So Apple doesn't want to actively block Windows or eGPUs or anything else from the Mac, unless doing so hurts the experience for other functionality, which is why I think eGPUs won't happen but Windows will. Apple wants to build devices that serve their relatively small number of users, and they'll include or exclude whatever serves that purpose best. The other companies look at and start with GHz, cores, discrete vs. integrated, NVIDIA vs. AMD, folding phones, NFC, legacy ports, replaceable components, and all the other specs and technology, and then try to figure out what to do with it all. Sometimes they do something useful with it; sometimes they come up with something mediocre that provides little by way of a good experience, merely bragging rights; or, if nothing else, they throw it out there and see what third-party devs might do with it, which sometimes works and sometimes doesn't (e.g. NFC before Apple Pay).

Apple doesn't care about any of that, and just as importantly, neither does the vast majority of Apple's users (excepting a few vocal nerds on Apple forums). Apple asks, "What do our users want to be able to DO?" The answer isn't something like "have an HDMI port for the rare few people who still might need one". It's "play and edit raw 8K footage with multiple effects on the fly" (or whatever), and then they invent new tech, and refine and combine existing tech, as needed, to DO THE JOB. The point: why should anyone care whether a GPU is discrete or integrated if it handles that footage well enough? And that's why Apple's stuff is SO much better, for the people who care about that stuff, than anything else. (And the people who care about GHz, cores, discrete GPUs, and replaceable components, and can figure out how to do useful stuff with those things, or even just want the "do everything" flexibility, buy PCs. And that's great for them. Yes, I'm oversimplifying, but in principle it's accurate.) Apple doesn't make PCs.

If/when you ever really get that, you'll be right a lot more often than you are in your rants about Apple today.


    (Edited for spelling, grammar and formatting).

  • Reply 44 of 50
melgross Posts: 33,600 member
    Detnator said:
…

    Excellent post. Thank you.
  • Reply 45 of 50
melgross Posts: 33,600 member
Just an interesting bit of info regarding our discussion about running Windows on ARM on these new machines. I don’t know where else to put this; maybe AI will put out an article on it shortly.

    https://www.macrumors.com/2020/11/27/developer-successfully-virtualizes-windows-on-m1/
  • Reply 46 of 50
DuhSesame
Like the two Thunderbolt lanes, I think this is a very good sign. My friend and I discussed the possibility of an Apple-designed eGPU, and things seem to be moving that way.
There may also be Apple-certified enclosures.
  • Reply 47 of 50
    wizard69 said:
…
It is interesting that you brought up fabric support, because that immediately had me thinking of AMD's fabric implementation. In AMD's case, what I'd love to see from them is a fabric interface to the CDNA architecture. Now, could Apple do something similar with an off-die GPU? They certainly could, and they also have the option of adopting AMD's fabric interface. This would be fantastic in a Mac Pro-like machine, especially if a fabric-interfaced CDNA chip were supplied with every machine. You would see some delays going off chip, but overall it would turn the Mac Pro into a supercomputer of sorts. By the way, yes, I know this has little to do with the GPU side of things; however, it would be in Apple's best interest to go this route. Right now the Mac Pro is embarrassingly slow when put up against an AMD Threadripper system. While Apple has fast GPUs, they currently can't really compete against the CDNA accelerators.
Selling a product isn't merely a performance war; it's about what you can do that others can't. There's no argument if Ryzen barely matches or exceeds the M1 while running burning hot with annoying blowers.

It won't take 64 cores for Apple Silicon to match a Threadripper, I can tell you that.


As for Intel vs. AMD: the 64-core Threadripper requires a water cooler as it exceeds 300W, while the 3275M is really a 280-watt chip. The Mac Pro already has one of the best air coolers out there, and it's good for about 300W.

    Remember the Power Mac G5.
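Putting those wattage claims side by side as plain arithmetic (a sketch; the Threadripper figure here is an assumption, since the post only says it exceeds 300W):

```swift
// The thermal comparison above as arithmetic. The Threadripper draw is
// an assumption; the claim above is only that it "exceeds 300W".
let airCoolerBudgetW = 300.0                      // claimed Mac Pro air-cooling budget
let chips = [("Xeon W-3275M", 280.0),
             ("64-core Threadripper (assumed)", 320.0)]
for (name, draw) in chips {
    let verdict = draw <= airCoolerBudgetW ? "fits air cooling" : "needs water cooling"
    print("\(name): \(Int(draw))W, \(verdict)")
}
```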
  • Reply 48 of 50
The Threadripper is almost 100W hotter, designed for water cooling from the start, less stable, and it took a year to get 2TiB of RAM support. None of those are great qualities for the Mac Pro.
  • Reply 49 of 50
mattinoz Posts: 2,448 member
    DuhSesame said:
Like the two Thunderbolt lanes, I think this is a very good sign. My friend and I discussed the possibility of an Apple-designed eGPU, and things seem to be moving that way.
There may also be Apple-certified enclosures.
A matching display, styled to the new range of iMacs, with an embedded eGPU as an option would be a very sweet addition to the lineup. It could be used to make a dual-screen iMac or laptop setup, or be teamed with a Mac Pro for people who don't need the colour spec of the Pro Display XDR.

  • Reply 50 of 50
No discrete GPU will ever be supported on the M-series chips, whether over Thunderbolt or PCI Express.

Apple fundamentally changed the architecture of the graphics frameworks; there is no longer a way to even reference a GPU with its own dedicated memory.
In other words, even IF AMD or Intel (or even Nvidia) wanted to make cards for the ARM kernel of macOS, they simply wouldn't be able to develop drivers for them.
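You can see what that looks like from an app's point of view with Metal's device-enumeration API. A minimal sketch; on an M1 it reportedly lists only the single integrated, unified-memory device:

```swift
import Metal

// List every GPU macOS exposes to Metal and whether it has its own VRAM.
// On Apple Silicon this prints a single unified-memory device.
for device in MTLCopyAllDevices() {
    let memory = device.hasUnifiedMemory
        ? "unified memory (shares RAM with the CPU)"
        : "dedicated VRAM (a discrete or external GPU)"
    print("\(device.name): \(memory)")
}
```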

    This is one of the reasons why I am probably going to end up going back to PC, sadly

Having the GPU as a modular component of the system makes the most sense both logistically and financially, given that there are giant generational leaps in graphics tech every two years or so, and that Apple is way behind the curve in terms of GPU features.

    Being required to upgrade your entire desktop computer just to get new graphics isn't really my idea of efficiency

It was fun while it lasted, though. Still running an Intel Mac Mini here, now with a 6900 XT, which is basically the best/last card it will ever support, on macOS anyway.
