Apple Silicon M1 Mac detection of Thunderbolt 3 eGPU gives hope for future support

Posted in Current Mac Hardware, edited November 2020
The lack of support for external GPUs in Apple Silicon M1 Macs may only be temporary, as a connected eGPU is still detected, but doesn't do anything.

A MacBook Pro with an eGPU.


The introduction of the M1-equipped Macs, including the Mac mini, MacBook Air, and 13-inch MacBook Pro, promoted the on-processor GPU, but signs indicated that support for eGPUs was on the way out. Developer support documentation and later experimentation confirmed that support for non-Apple GPUs wouldn't be enabled in hardware using Apple Silicon, effectively making eGPUs largely unusable.

However, testing by Mac4Ever, corroborated by AppleInsider's own trials, offers hope to eGPU owners. While eGPUs are not officially supported by the latest M1 MacBook Air, MacBook Pro, and Mac mini, macOS 11 still sees the enclosure and the PCI-E card inside.

When a Pro Display XDR was plugged into a Blackmagic eGPU connected to a Thunderbolt 3 port, the eGPU enclosure was still detected and functioned. The display communicates with the MacBook Pro as normal, complete with video playback.

Additionally, AppleInsider has seen the Razer Core X and the Sonnet eGFX Breakaway Box identify themselves properly to macOS with an assortment of cards, including the RX 590, Vega 64, and Radeon VII. However, monitors connected directly to the cards do nothing.

An M1 MacBook Pro detects the eGPU and the Pro Display XDR fine, but doesn't use GPU acceleration [via Mac4ever]


The Radeon Pro 580 GPU included in the above enclosure isn't actively being used for graphics acceleration, nor are any of the other cards we've been testing. A lack of drivers is the most obvious reason for the lack of support, but oddities in how Apple has implemented Thunderbolt or external PCI-E addressing may also be contributing factors.
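The "detected but driverless" state described above is something you can probe for: `system_profiler` on macOS can dump the PCI tree as JSON, and a card that is enumerated without a bound driver shows up accordingly. A minimal sketch of parsing such output; the field names and sample record here are illustrative assumptions, not captured from a real machine:

```python
import json

# Hypothetical, trimmed stand-in for `system_profiler SPPCIDataType -json`
# output. The key names below are assumptions for illustration only.
sample = json.dumps({
    "SPPCIDataType": [
        {"_name": "ATY,RadeonRX580",
         "sppci_vendor_id": "0x1002",
         "driver_installed": "No"},
    ]
})

def undriven_devices(report_json: str) -> list:
    """Return names of PCI devices that are enumerated but report no driver."""
    report = json.loads(report_json)
    return [dev["_name"]
            for dev in report.get("SPPCIDataType", [])
            if dev.get("driver_installed") == "No"]

print(undriven_devices(sample))  # -> ['ATY,RadeonRX580']
```

On a real machine you would feed this the actual `system_profiler` JSON rather than the canned sample.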

This discovery leads to the possibility that Apple may reintroduce eGPU support in the future. It would have to arrive as an update to macOS Big Sur, a new M-series chip launch, or possibly both.

Apple's support for eGPUs came with a spring update to macOS Mojave -- with a complete lack of support for Nvidia cards. The core technology functions properly in macOS Big Sur with an assortment of Intel Macs.

Comments

  • Reply 1 of 50
    lkrupp Posts: 10,557 member
    Interesting how first generation hardware features are assumed to be etched in stone and immutable. It couldn't possibly be that Apple wanted to get these entry level machines into the hands of users as soon as possible to gauge usage and performance. No, no, assume the worst and start ranting and raving about how ALL ASi (current and future) Macs are useless pieces of garbage because these ENTRY LEVEL machines can't run Windows natively and don't support eGPUs. It's happening on all the major Apple-centric tech blogs.

    I’m always ranting myself about how negativity and assuming the worst rule these forums. No room for positive thinking about the future of these machines and what the next generation might bring. Nope, the first ASi machines don't do this and don't support that, so they are DOA.
  • Reply 2 of 50
    dewme Posts: 5,362 member
    My guess is that the TB spec provides a device enumeration scheme to identify all connected devices on the TB bus, whether or not they support graphics acceleration, and that GPU cards support a “pass through” mode to bypass the acceleration processing. There is likely some sort of handshake that needs to take place to allow the Mac and GPU to agree to use accelerated graphics processing, in addition to the cpu directing its video output to the GPU with the proper format and timing. Apple either hasn’t implemented this in M1 or hasn’t activated it. Hopefully the silicon has the required hardware and microcode and Apple can turn it on later. I’m not holding my breath but hope springs eternal.
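The commenter's guess about the handshake can be sketched as a toy state machine: enumeration and identification succeed regardless, but the accelerated state is only reachable if the OS holds up its end. This is purely a model of the speculation above, not of any documented Thunderbolt or macOS mechanism:

```python
from enum import Enum, auto

class Link(Enum):
    DETECTED = auto()     # enclosure enumerated on the Thunderbolt bus
    IDENTIFIED = auto()   # the PCI-E card identified itself to the OS
    ACCELERATED = auto()  # OS and GPU agreed on accelerated graphics

def negotiate(os_supports_acceleration: bool) -> Link:
    """Toy model: enumeration and identification always succeed;
    acceleration requires OS-side support for the final handshake."""
    state = Link.DETECTED
    state = Link.IDENTIFIED  # the card answers enumeration
    if os_supports_acceleration:
        state = Link.ACCELERATED
    return state

print(negotiate(False).name)  # -> IDENTIFIED (the observed M1 behavior)
print(negotiate(True).name)   # -> ACCELERATED (Intel Macs with drivers)
```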
    edited November 2020
  • Reply 3 of 50
    crowley Posts: 10,453 member
    lkrupp said:
    Interesting how first generation hardware features are assumed to be etched in stone and immutable. It couldn't possibly be that Apple wanted to get these entry level machines into the hands of users as soon as possible to gauge usage and performance. No, no, assume the worst and start ranting and raving about how ALL ASi (current and future) Macs are useless pieces of garbage because these ENTRY LEVEL machines can't run Windows natively and don't support eGPUs. It's happening on all the major Apple-centric tech blogs.

    I’m always ranting myself about how negativity and assuming the worst rule these forums. No room for positive thinking about the future of these machines and what the next generation might bring. Nope, the first ASi machines don't do this and don't support that, so they are DOA.
    Not sure why you thought it would be useful to bring that negativity here. No one was ranting.
  • Reply 4 of 50
    My guess is that Apple may not plan to offer that on entry level machines, or, as mentioned, simply hasn't dealt with it in M1, so may introduce it in M2 or M3. Or, if they are going to make their own dedicated graphics chip, they may make their own eGPU.
    It may also be true that M2 could leapfrog other low and midrange cards, and allowing eGPUs could sink sales of the remaining Intel Macs, which will be low anyway, I guess.
    Just my 2 cents.
  • Reply 5 of 50
    netrox Posts: 1,421 member
    This is so hilarious. It means absolutely nothing. All TB and USB devices get detected if they are connected. It doesn't mean it's gonna do anything. 
  • Reply 6 of 50
    Mike Wuerthele Posts: 6,861 administrator
    netrox said:
    This is so hilarious. It means absolutely nothing. All TB and USB devices get detected if they are connected. It doesn't mean it's gonna do anything. 
    The bolded is an over-simplification of the handshake and identification process. Devices have to handshake properly to get this identification, and they can't always do that if the operating system or kernel in question doesn't have "hooks" for it. The fact that it's identifying properly in the chain is a positive.

    So, while I agree that it may not ultimately result in anything, the statement that "it means absolutely nothing" isn't true.
    edited November 2020
  • Reply 7 of 50
    mjtomlin Posts: 2,673 member
    lkrupp said:
    Interesting how first generation hardware features are assumed to be etched in stone and immutable. It couldn't possibly be that Apple wanted to get these entry level machines into the hands of users as soon as possible to gauge usage and performance. No, no, assume the worst and start ranting and raving about how ALL ASi (current and future) Macs are useless pieces of garbage because these ENTRY LEVEL machines can't run Windows natively and don't support eGPUs. It's happening on all the major Apple-centric tech blogs.

    Well, didn't we hear the same ranting and ravings over Apple dropping support for Thunderbolt because the DTK didn't support it?

    Developer support documentation and later experimentation confirmed that support for non-Apple GPUs wouldn't be enabled in hardware using Apple Silicon, effectively making eGPUs largely unusable.

    Yeah, not completely true. The developer documentation does not say discrete (or external) GPUs would not be supported. All it said was to not underestimate the integrated GPU in Apple Silicon. Even the chart (Apple Silicon Mac GPU) these sites included as proof shows that Apple Silicon Macs will still support "Metal GPU Family Mac 2" APIs, which are used to interface with non-Apple GPUs.
    edited November 2020
  • Reply 8 of 50
    Mike or Malcolm, could you please clarify this statement?
    In plugging a Pro Display XDR into a Blackmagic eGPU inserted into a Thunderbolt 3 port, it was found the eGPU enclosure was still detected and functions. The display communicates with the MacBook Pro as normal, complete with video playback.
    What exactly do you mean by "complete with normal video playback"? Do you mean that you can get a video stream to play on a monitor connected to the eGPU? Does this mean you can have a third monitor connected as long as all you're doing is playing video? If so, what about other display output (though unaccelerated)? Can you get a normal, if slow, desktop?

    Thanks.

    BTW, either way I find the 10GbE thing more interesting, as it strongly suggests there's more I/O on the M1 than we'd otherwise have any reason to assume. I haven't seen anyone talking about that, but it's big news - both because it implies more lanes of PCIe support, and because that's still within their insanely small power envelope.

    edited November 2020
  • Reply 9 of 50
    Mike Wuerthele Posts: 6,861 administrator
    Mike or Malcolm, could you please clarify this statement?
    In plugging a Pro Display XDR into a Blackmagic eGPU inserted into a Thunderbolt 3 port, it was found the eGPU enclosure was still detected and functions. The display communicates with the MacBook Pro as normal, complete with video playback.
    What exactly do you mean by "complete with normal video playback"? Do you mean that you can get a video stream to play on a monitor connected to the eGPU? Does this mean you can have a third monitor connected as long as all you're doing is playing video? If so, what about other display output (though unaccelerated)? Can you get a normal, if slow, desktop?

    Thanks.

    BTW, either way I find the 10GbE thing more interesting, as it strongly suggests there's more I/O on the M1 than we'd otherwise have any reason to assume. I haven't seen anyone talking about that, but it's big news - both because it implies more lanes of PCIe support, and because that's still within their insanely small power envelope.

    On the Blackmagic, the connected monitor on USB-C works. It's not doing anything with the video card at all, and is instead connecting with the daisy-chained signal in a USB-C alt mode.

    On eGPUs without a pass-through, like the Razer, a monitor attached to the PCI-E card does nothing.
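The two cases described here can be summarized as a tiny decision tree; the function and its labels are ours, purely illustrative:

```python
def display_route(enclosure_has_passthrough: bool, gpu_driver_loaded: bool) -> str:
    """Where the video signal for a connected monitor comes from."""
    if gpu_driver_loaded:
        return "gpu"        # accelerated output through the card
    if enclosure_has_passthrough:
        return "alt-mode"   # daisy-chained USB-C DisplayPort alt mode
    return "none"           # monitor on the card stays dark

# Blackmagic (has pass-through) vs. Razer Core X (no pass-through) on M1:
print(display_route(True, False))   # -> alt-mode
print(display_route(False, False))  # -> none
```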
    edited November 2020
  • Reply 10 of 50
    Apple has been adding silicon to their ARM implementation since .... forever.
    If there is a market for eGPU in higher end Macs the functionality will be added if not already present.

    After all, it is not as if Apple has never heard of eGPU or that the ARM spec gets in the way.

    So what are we left with? A lot of concern trolling, and perhaps a sliver of a percent of wannabe Mac mini + eGPU users.
  • Reply 11 of 50
    MplsP Posts: 3,925 member
    I have to agree with other commenters above. These machines are basically entry level machines with a first generation processor and graphics. Even the MacBook ‘Pro’ is a low-level device. There’s nothing wrong with that. I fully expect future processors to have expanded capabilities like the ability to handle more memory, multiple displays, etc. eGPUs may well come with that expansion. For the time being, if you are doing work that requires an eGPU, you won’t be doing it with these machines anyway. 


  • Reply 12 of 50
    netrox said:
    This is so hilarious. It means absolutely nothing. All TB and USB devices get detected if they are connected. It doesn't mean it's gonna do anything. 
    The bolded is an over-simplification of the handshake and identification process. Devices have to handshake properly to get this identification, and they can't always do that if the operating system or kernel in question doesn't have "hooks" for it. The fact that it's identifying properly in the chain is a positive.

    So, while I agree that it may not ultimately result in anything, the statement that "it means absolutely nothing" isn't true.
    It could simply mean that, like the enclosures, they simply left that functionality in since it's used for the Intel Macs. Or, like someone else said, maybe Apple is planning to offer their own eGPUs without supporting 3rd-party GPUs. It would probably also be more efficient and effective, since an Apple GPU would probably be a much smaller 5nm part tailored to support the M1. That would be Apple style, with no time or money wasted having to support 3rd-party cards, plus more profit!
  • Reply 13 of 50
    0. Mx Macs may indeed run Windows natively ... but it will be ARM Windows, which even Microsoft acknowledges is largely useless except for (some) first party and web applications. That is why the Surface Duo uses Android instead of Windows. And unless I am wrong, Apple has closed the door on emulating Windows themselves. So a solution for x86 and x86-64 Windows applications would need to be via 32- and 64-bit virtualization.

    1. On the eGPU front it is not so much the hardware or even protocols. Instead, the APIs are different. When on Intel this wasn't an issue ... the APIs are the same. But with Apple Silicon it is Metal or nothing. While that is GREAT for iOS and Apple Arcade and other stuff developed for Apple using Apple provided tools, other stuff isn't going to be compatible unless third parties do it, and even if that happens Apple won't support it.

    Remember my previous rants. Apple never wanted to join the Intel platform and support their hardware and software stuff in the first place. They were PROUD of not doing so in the 80s, 90s and early 00s. They only went to Intel, put iTunes on Windows and things like that because it was necessary to survive. 

    Now Apple doesn't need that stuff anymore. Macs are going to sell regardless ... they have a very dedicated customer base especially among creatives and you can't even develop iOS, watchOS etc. apps any other way. So it is going to go back to how it was before. Apple doesn't want you running Windows. They don't want you using GPUs designed for Windows. They want you to rely on their hardware and software and have the opinion that between their first party stuff, the iPad apps coming over plus web and cloud tools, that is more than enough for anyone who wants to make it work. If you cannot or do not want to, it isn't as if Wintel is going anywhere.

    Convergence WITH their mobile hardware and apps. Divergence FROM the Windows stuff that they have always hated. And now that Microsoft is doing more stuff with Android and Samsung on Windows 10 and collaborating with Google in even more areas, even more so.

    It is going to go back to when Apple people and Windows people were different populations who use computers to do different things, which is the way it has been for most of these companies' histories. The key difference is that thanks to Apple's prowess in mobile, there are far more Apple people than there were in 1983, 1993, 2003 or 2013.
  • Reply 14 of 50
    eriamjh Posts: 1,642 member
    While there is no guarantee that eGPUs will be supported, Larry Fine is right.   Some support for eGPU “may” be coming.  

    We sit back and watch.  Email Apple and tell them you want it (if you do).   I don’t know what drives Apple’s addition of any support but it’s usually related to money.  

    I look forward to the Mx-powered Mac Pro in 2022 or ‘23 or ‘24. I won’t buy one, but I’m sure it will be fast.

    iMac 30” running on an M2X for me, please.  
  • Reply 15 of 50
    cloudguy said:
    0. Mx Macs may indeed run Windows natively ... but it will be ARM Windows which even Microsoft acknowledges is largely useless except for (some) first party and web applications. It was why the Surface Duo uses Android instead of Windows. And unless I am wrong Apple has closed the door on emulating Windows themselves. So a solution for x86 and x86-84 Windows applications would need to be via 32 and 64 bit virtualization.
    Microsoft tried to use x86 emulation for its ARM Windows and the result was horrid. The only thing that has worked has been CrossOver, with its own fork of WINE, which for various practical reasons will not make its way into the general WINE builds, though it does appear in Wineskin wrappers.
  • Reply 16 of 50
    My thoughts: Apple is going to use this opportunity to kick third party kernel level drivers (common with graphics cards) to the curb. For both security and stability reasons. Nobody expects to be able to install drivers on their iPhone or iPad nor should we on M1 systems.  Apple will have to add the functionality for the accessory to work.

    Other thoughts: Apple used the same enclosure hardware and laptops without touchscreens to quickly get the hardware into people’s hands and to enable people to make direct comparisons between the Intel and M1 hardware with the same size of battery, etc. New form factors that differ from the Intel-based Macs are undoubtedly in the next update of the M1 systems. I am waiting for the next gen M1 for this reason, but also to allow it to mature a bit more.
  • Reply 17 of 50
    Honestly, guys, I'm a little confused as to why you would want an eGPU at all.

    The performance we have with the M1 chip alone crushes the performance of previous Intel models with discrete chips. Just the demo in another thread, where they change resolution in the blink of an eye instead of watching the display blank through interminable delays, shows this. I have this experience every time I plug in my external monitor with my top-of-the-line 16" MacBook Pro with maxed out memory and GPU. I can't wait to have my next-generation 16" MacBook Pro where it will be beautiful and seamless.

    According to Apple, the reason for the increased performance is that the processor and graphics modules share the same memory and so avoid endless copying to and fro. So if we put in an eGPU, wouldn't we have the same problems and sluggish performance by definition?
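For a rough sense of the copying cost in question, here is a back-of-the-envelope sketch (all numbers illustrative) of the bandwidth consumed just moving a 4K framebuffer to a discrete GPU every frame, traffic a unified memory pool never generates:

```python
# Bandwidth spent shuttling a 4K framebuffer across the bus at 60 fps.
# With unified memory, this copy simply doesn't happen.
width, height, bytes_per_pixel, fps = 3840, 2160, 4, 60

bytes_per_frame = width * height * bytes_per_pixel       # ~33.2 MB per frame
copy_bandwidth_gb_s = bytes_per_frame * fps / 1e9

print(f"{copy_bandwidth_gb_s:.2f} GB/s just to move frames")  # -> 1.99 GB/s
```

Real workloads move textures and geometry too, so the actual traffic can be far higher; the point is that even the framebuffer alone is substantial.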


  • Reply 18 of 50
    melgross Posts: 33,510 member
    Honestly, guys, I'm a little confused as to why you would want an eGPU at all.

    Performance we have with the M1 chip alone crushes the performance of previous Intel models with discrete chips.  Just the demo in another thread where they change resolution in a blink of an eye instead of watching the display blank and interminable delays shows this.  I have this experience every time I plug in my external monitor with my top-line 16" MacBook Pro with maxed out memory and GPU.  I can't wait to have my next-generation 16" MacBook Pro where it will be beautiful and seamless.

    According to Apple, the reason for the increased performance is that the processor and graphics modules share the same memory and so they avoid endless copying to and fro.  So if we put in an eGPU wouldn't we have the same problems and sluggish performance by definition?


    To a great extent, I agree. I’ve been watching videos on YouTube, some by very experienced pros, that show these machines punching way above their weight, even with just 8GB of RAM, which seems impossible to me, and to them, except that it’s true.

    But there’s also no doubt that higher performance is needed. These machines are surprisingly powerful, but they have limits too. If Apple wants these to supplant the really high end machines out there, they need graphics performance well in advance of the 2.6TF these machines are capable of. And I expect Apple will deliver it. The 16” machines will get a significant improvement. The higher 13” machine will, and so will the pro Mini.

    Certainly higher end iMacs will, and then we have the iMac Pro and the Mac Pro. There’s simply no way that the graphics performance in these M1 chips would be even close for those machines. So Apple will have to do something. Making the actual chip larger is possible, but only by so much. Smaller process sizes will help, but that’s two years off.

    None of those things, however, will give a boost that matches today’s highest performing cards, much less the cards a year or two from now. I believe Apple is keeping some options open, but nobody knows what just yet.
  • Reply 19 of 50
    blastdoor Posts: 3,282 member
    The way Apple has been talking about the benefits of an integrated GPU and UMA makes me really question what eGPU support will look like if it happens at all. The points Apple makes about the value of integration and UMA strike me as much more than just marketing -- they are thoughtful, substantive points. It really is a big deal to have as a fundamental assumption of the entire hardware and software stack that there is one unified pool of memory that all execution units can access equally. 

    If Apple offers support for eGPUs (or PCIe GPUs in a Mac Pro), I wonder if it will be only for compute purposes, not for display. And I further wonder if it will be a specialized, separate kind of thing, sort of like how Afterburner is in the current Mac Pro. In other words, it would be expensive, specialized hardware for a narrow purpose. It would not be a GPU that is in any way on equal footing with the integrated GPU. 
  • Reply 20 of 50
    melgross Posts: 33,510 member
    blastdoor said:
    The way Apple has been talking about the benefits of an integrated GPU and UMA makes me really question what eGPU support will look like if it happens at all. The points Apple makes about the value of integration and UMA strike me as much more than just marketing -- they are thoughtful, substantive points. It really is a big deal to have as a fundamental assumption of the entire hardware and software stack that there is one unified pool of memory that all execution units can access equally. 

    If Apple offers supports for eGPUs (or PCIe GPUs in a Mac Pro), I wonder if it will be only for compute purposes, not for display. And I further wonder if it will be a specialized, separate kind of thing, sort of like how Afterburner is in the current Mac Pro. In other words, it would be expensive, specialized hardware for a narrow purpose. It would not be a GPU that is in any way on equal footing with the integrated GPU. 
    Apple has surprised us with this, and I think that if they wanted to support external GPUs, they could do it differently. They could somehow extend the fabric support that the built-in GPUs use now to support an external GPU. They might have to buffer them, which would add some minor latency, but it could work. On the other hand, they could come out with their own GPU module. We see how that tiny bit of silicon can have good performance. What if they have a small plug-in module with another 16 cores or so, and have that plug into an extended substrate to just add to the cores now there? It could be small, contain a small heat sink of its own, and use just another 20 watts or so.

    Actually, they could do this for RAM easily, as it’s on the substrate now, in two separate modules. Extend the substrate outside of the package and pop two more modules in. They’re very small.