Apple Silicon M1 Mac detection of Thunderbolt 3 eGPU gives hope for future support


Comments

  • Reply 21 of 50
    blastdoor Posts: 3,529 member
    melgross said:
    blastdoor said:
    The way Apple has been talking about the benefits of an integrated GPU and UMA makes me really question what eGPU support will look like if it happens at all. The points Apple makes about the value of integration and UMA strike me as much more than just marketing -- they are thoughtful, substantive points. It really is a big deal to have as a fundamental assumption of the entire hardware and software stack that there is one unified pool of memory that all execution units can access equally. 

    If Apple offers support for eGPUs (or PCIe GPUs in a Mac Pro), I wonder if it will be only for compute purposes, not for display. And I further wonder if it will be a specialized, separate kind of thing, sort of like how Afterburner is in the current Mac Pro. In other words, it would be expensive, specialized hardware for a narrow purpose. It would not be a GPU that is in any way on equal footing with the integrated GPU.
    Apple has surprised us with this, and I think that if they wanted to support external GPUs, they could do it differently. They could somehow extend the fabric support that the built-in GPUs use now to support an external GPU. They might have to buffer them, which would add some minor latency, but it could work. On the other hand, they could come out with their own GPU module. We see how that tiny bit of silicon can have good performance. What if they have a small plug-in module with another 16 cores, or so, and have that plug into an extended substrate to just add to the cores now there? It could be small, contain a small heat sink of its own, and use just another 20 watts, or so.

    Actually, they could do this with RAM easily, as it’s on the substrate now, in two separate modules. Extend the substrate outside of the package and pop two more modules in. They’re very small.
    It sounds like maybe you're describing a chiplet approach? I agree that makes sense, especially for the lower-volume desktop machines. I imagine something about the size of a Threadripper, with mix-and-match CPU, GPU, and ML chiplets depending on user needs. Maybe HBM stacks encircling the logic chips.
  • Reply 22 of 50
    wizard69 Posts: 13,377 member
    lkrupp said:
    Interesting how first-generation hardware features are assumed to be etched in stone and immutable. It couldn’t possibly be that Apple wanted to get these entry-level machines into the hands of users as soon as possible to gauge usage and performance. No, no, assume the worst and start ranting and raving about how ALL ASi (current and future) Macs are useless pieces of garbage because these ENTRY LEVEL machines can’t run Windows natively and don't support eGPUs. It’s happening on all the major Apple-centric tech blogs.

    I’m always ranting myself about how negativity and assuming the worst rule these forums. No room for positive thinking about the future of these machines and what the next generation might bring. Nope, the first ASi machines don't do this and don't support that, so they are DOA.

    While I agree with this for the most part, the problem I have is that detection of an eGPU chassis indicates nothing! It is pretty irresponsible of AI to publish such nonsense without labeling it as speculation. I'm of the opinion that Apple is about to give the eGPU world a smart kick towards the trash can, but I have to acknowledge that that is speculation. eGPU, in my mind, is one of those technologies that sounds so good on the surface but ends up looking idiotic to the vast majority of users. This leads to abysmal sales, and likely has Apple considering whether it is even worth supporting.

    On the other hand, I have to completely agree with the idea that these are the very first AS machines and are only targeting the low end. Why people can't grasp this is beyond me. Further, they follow a very familiar pattern when working with Apple: new features or concepts are often brought to market on old platforms to smooth transitions. I look at it this way: even the MBA was good enough to get me to buy new Apple Mac hardware again. I did that knowing full well that buying introductory hardware often isn't that smart with Apple, but I've wanted a well-done ARM laptop for years now and Apple was the first to deliver a decent machine. As much as I've hated their management of the Mac line over the last 5 years, to the point of not buying the hardware, these new machines have gotten a lot right for being ENTRY LEVEL.

  • Reply 23 of 50
    wizard69 Posts: 13,377 member
    netrox said:
    This is so hilarious. It means absolutely nothing. All TB and USB devices get detected if they are connected. It doesn't mean it's gonna do anything. 
    The bolded is an over-simplification of the handshake and identification process. Devices have to handshake properly to get this identification, and they can't always do that if the operating system or kernel in question doesn't have "hooks" for it. The fact that it's identifying properly in the chain is a positive.

    So, while I agree that it may not ultimately result in anything, the statement that "it means absolutely nothing" isn't true.

    I have to disagree here. Maybe "absolutely nothing" is a stretch, but it is pretty close. Fundamentally, for an eGPU to work there would need to be graphics drivers. The fact that there are none at the moment is not something we can know the whys of. It could be that the drivers are not ready, or that Apple has deemed them not worth the effort. The third possibility, especially in the case of Apple, is that they don't want to lay all their cards on the table just yet. That is, they don't want to reveal the GPU hardware of the coming version 2 of the Apple Silicon machines.

    So if not absolutely nothing, detection means little, as it is a simple result of how bus connections are handled.
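
    To make the distinction concrete, here is a minimal sketch (in Swift) of what "detection" amounts to: walking the I/O Registry for Thunderbolt entries. The matching class name ("IOThunderboltSwitch") and the "Device Model" property key are assumptions on my part and vary by controller generation and OS; an entry showing up here only proves the device handshook and enumerated on the bus, not that any GPU driver will ever attach to it.

    ```swift
    import Foundation
    import IOKit

    // Hedged sketch: enumerate Thunderbolt entries in the I/O Registry.
    // "IOThunderboltSwitch" and "Device Model" are assumed names; they
    // differ across controller generations and on Apple Silicon.
    var iterator: io_iterator_t = 0
    let matching = IOServiceMatching("IOThunderboltSwitch")
    guard IOServiceGetMatchingServices(kIOMasterPortDefault, matching, &iterator) == KERN_SUCCESS else {
        fatalError("I/O Registry query failed")
    }
    var entry = IOIteratorNext(iterator)
    while entry != 0 {
        // Enumeration is not driver support: this property existing only
        // means the device identified itself during the handshake.
        if let model = IORegistryEntryCreateCFProperty(entry, "Device Model" as CFString,
                                                       kCFAllocatorDefault, 0)?.takeRetainedValue() {
            print("Enumerated Thunderbolt device:", model)
        }
        IOObjectRelease(entry)
        entry = IOIteratorNext(iterator)
    }
    IOObjectRelease(iterator)
    ```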
  • Reply 24 of 50
    wizard69 Posts: 13,377 member
    blastdoor said:
    The way Apple has been talking about the benefits of an integrated GPU and UMA makes me really question what eGPU support will look like if it happens at all. The points Apple makes about the value of integration and UMA strike me as much more than just marketing -- they are thoughtful, substantive points. It really is a big deal to have as a fundamental assumption of the entire hardware and software stack that there is one unified pool of memory that all execution units can access equally. 

    If Apple offers support for eGPUs (or PCIe GPUs in a Mac Pro), I wonder if it will be only for compute purposes, not for display. And I further wonder if it will be a specialized, separate kind of thing, sort of like how Afterburner is in the current Mac Pro. In other words, it would be expensive, specialized hardware for a narrow purpose. It would not be a GPU that is in any way on equal footing with the integrated GPU.

    First, there will always be a need for higher-performance graphics. I really can't see anybody denying this, but the need will diminish quickly for mainstream users.

    Second, the value of UMA has been known for some time. The advantage Apple has here is that they are all on board with it. AMD was promoting such systems a few years ago, but it never really caught on at MS. One big advantage is that your GPU can be used much more, as the copy penalty doesn't show up, so even small routines can be accelerated on a GPU. UMA is a good technology, but it isn't a miracle. It will be very interesting to see how Apple approaches this on the high-performance machines, because some tasks simply need more RAM.

    Personally, I don't see a good reason for Apple to support eGPUs. I know that rubs a lot of people the wrong way; so be it, I see it as a waste of time. It would be a waste of time on new Intel machines too, so this has little to do with the M1. In a nutshell: should Apple deploy massive resources to serve 0.00001% of its market?
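
    The copy-penalty point above is easy to make concrete. A minimal Metal sketch, assuming an Apple Silicon Mac (the compute pass itself is elided): one buffer allocated in shared storage is visible to CPU and GPU alike, so dispatching even a tiny routine to the GPU involves no upload or readback copy.

    ```swift
    import Metal

    // Minimal UMA sketch: one allocation, shared by CPU and GPU.
    guard let device = MTLCreateSystemDefaultDevice() else {
        fatalError("Metal not available")
    }

    var input = [Float](repeating: 1.0, count: 1_024)

    // .storageModeShared places the buffer in the single unified pool --
    // no staging buffer and no blit to separate "GPU memory", unlike a
    // discrete card.
    let buffer = device.makeBuffer(bytes: &input,
                                   length: input.count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!

    // ... encode and run a compute pass against `buffer` here ...

    // Afterwards the CPU reads the very same memory directly:
    let view = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
    print(view[0])
    ```

    On a discrete eGPU the same pattern would force explicit copies across Thunderbolt, which is exactly why small kernels stop being worth offloading there.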
  • Reply 25 of 50
    Seems to me the main reason that so many were interested in using an external GPU on the Intel Mini is the poor quality of the onboard integrated graphics. The M1 Mini ships with a dramatically better integrated GPU solution which, for many, means there is no need to go to the trouble of adding an external graphics card. If you need a more capable GPU, I wonder if the Mac Mini is the right product to even be considering. 

    The question, I suppose, is just exactly how many Mac Mini buyers will find that the integrated graphics provide the performance needed for the Mini to be a good option. If there is a small percentage who need more power, a better approach than making an external GPU usable is for Apple to release an upgraded version - let’s call it the Mac Mini Pro - and slot that between the Mini and the Mac Pro. If that upgraded version is not enough, then clearly we are talking about a genuine pro user requiring the ability to do more customizing of the hardware.

    So the M1 Mini is offered to meet the needs of a large percentage of non-professionals, an upgraded variant is offered to meet the needs of more demanding non-professionals and finally a model designed for the pro customer is developed. Make the pro version the one that can accommodate customization and deliver enough performance in the two non-pro offerings to meet the needs of a substantial number of consumers.

    Really, having to add an external GPU to the Mini in order to get acceptable performance is a flaw of the Intel version that Apple seems to have addressed with the M1 version. Given a choice between a Mini that does not allow for a graphics upgrade yet serves up sufficient graphics muscle and one that can be upgraded yet comes with poor graphics performance baked in, the former clearly is an improvement over the latter. Sure, the graphics performance on the base M1 Mini might be a drawback for some, but a Pro version of the Mini serving up even more performance would go a long way towards satisfying a good many consumers, and at a cost that would be an improvement over having to pay for an external GPU to plug into the Mini. As well, my guess is that Apple would sell a ton of the beefed-up Mini to consumers who don’t even need the extra horsepower yet want it anyway. The only way this approach would fail would be if the resulting performance was too anemic to truly meet the needs of the buyer. Based on a good many reviews to date, it appears that the level of performance of even the just-released M1 Mini is good enough to please a large percentage of Mini buyers. If the machine comfortably handles what you regularly throw at it, upgradability is rendered irrelevant.
  • Reply 26 of 50
    Mike Wuerthele Posts: 6,919 administrator
    wizard69 said:
    blastdoor said:
    The way Apple has been talking about the benefits of an integrated GPU and UMA makes me really question what eGPU support will look like if it happens at all. The points Apple makes about the value of integration and UMA strike me as much more than just marketing -- they are thoughtful, substantive points. It really is a big deal to have as a fundamental assumption of the entire hardware and software stack that there is one unified pool of memory that all execution units can access equally. 

    If Apple offers support for eGPUs (or PCIe GPUs in a Mac Pro), I wonder if it will be only for compute purposes, not for display. And I further wonder if it will be a specialized, separate kind of thing, sort of like how Afterburner is in the current Mac Pro. In other words, it would be expensive, specialized hardware for a narrow purpose. It would not be a GPU that is in any way on equal footing with the integrated GPU.

    First, there will always be a need for higher-performance graphics. I really can't see anybody denying this, but the need will diminish quickly for mainstream users.

    Second, the value of UMA has been known for some time. The advantage Apple has here is that they are all on board with it. AMD was promoting such systems a few years ago, but it never really caught on at MS. One big advantage is that your GPU can be used much more, as the copy penalty doesn't show up, so even small routines can be accelerated on a GPU. UMA is a good technology, but it isn't a miracle. It will be very interesting to see how Apple approaches this on the high-performance machines, because some tasks simply need more RAM.

    Personally, I don't see a good reason for Apple to support eGPUs. I know that rubs a lot of people the wrong way; so be it, I see it as a waste of time. It would be a waste of time on new Intel machines too, so this has little to do with the M1. In a nutshell: should Apple deploy massive resources to serve 0.00001% of its market?
    I guarantee that there are more than ten eGPUs connected to Macs today. There are at least a thousand in the hands of the DOD that I am aware of, and that itself may be enough to justify future support. There will always be a demand for more. In this case, more performance than the integrated GPU provides. A Vega 56 outperforms the M1's existing GPU.

    Given that the existing Mac Pro (and presumably, the next generation one) will have PCI-E, I'm not sure I agree with your "massive resources" assessment.
  • Reply 27 of 50
    wizard69 Posts: 13,377 member
    melgross said:
    blastdoor said:
    The way Apple has been talking about the benefits of an integrated GPU and UMA makes me really question what eGPU support will look like if it happens at all. The points Apple makes about the value of integration and UMA strike me as much more than just marketing -- they are thoughtful, substantive points. It really is a big deal to have as a fundamental assumption of the entire hardware and software stack that there is one unified pool of memory that all execution units can access equally. 

    If Apple offers support for eGPUs (or PCIe GPUs in a Mac Pro), I wonder if it will be only for compute purposes, not for display. And I further wonder if it will be a specialized, separate kind of thing, sort of like how Afterburner is in the current Mac Pro. In other words, it would be expensive, specialized hardware for a narrow purpose. It would not be a GPU that is in any way on equal footing with the integrated GPU.
    Apple has surprised us with this, and I think that if they wanted to support external GPUs, they could do it differently. They could somehow extend the fabric support that the built-in GPUs use now to support an external GPU. They might have to buffer them, which would add some minor latency, but it could work. On the other hand, they could come out with their own GPU module. We see how that tiny bit of silicon can have good performance. What if they have a small plug-in module with another 16 cores, or so, and have that plug into an extended substrate to just add to the cores now there? It could be small, contain a small heat sink of its own, and use just another 20 watts, or so.

    Actually, they could do this with RAM easily, as it’s on the substrate now, in two separate modules. Extend the substrate outside of the package and pop two more modules in. They’re very small.

    It is interesting that you brought up fabric support, because that immediately had me thinking of AMD's fabric implementation. In the case of AMD, what I'd love to see from them is a fabric interface to the CDNA architecture. Now, could Apple do something similar with an off-die GPU? They certainly could, and they also have the option of adopting AMD's fabric interface. This would be fantastic in a Mac Pro-like machine, especially if a fabric-interfaced CDNA chip were supplied with every machine. You would see some delays going off-chip, but overall it would turn the Mac Pro into a supercomputer of sorts. By the way, yes, I know this has little to do with the GPU side of things; however, it would be in Apple's best interest to go this route. Right now the Mac Pro is embarrassingly slow when put up against an AMD Threadripper system. While Apple has fast GPUs, they currently can't really compete against the CDNA accelerators.
  • Reply 28 of 50
    wizard69 Posts: 13,377 member
    CarmB said:
    Seems to me the main reason that so many were interested in using an external GPU on the Intel Mini is the poor quality of the onboard integrated graphics. The M1 Mini ships with a dramatically better integrated GPU solution which, for many, means there is no need to go to the trouble of adding an external graphics card. If you need a more capable GPU, I wonder if the Mac Mini is the right product to even be considering. 

    The question, I suppose, is just exactly how many Mac Mini buyers will find that the integrated graphics provide the performance needed for the Mini to be a good option. If there is a small percentage who need more power, a better approach than making an external GPU usable is for Apple to release an upgraded version - let’s call it the Mac Mini Pro - and slot that between the Mini and the Mac Pro. If that upgraded version is not enough, then clearly we are talking about a genuine pro user requiring the ability to do more customizing of the hardware.

    So the M1 Mini is offered to meet the needs of a large percentage of non-professionals, an upgraded variant is offered to meet the needs of more demanding non-professionals and finally a model designed for the pro customer is developed. Make the pro version the one that can accommodate customization and deliver enough performance in the two non-pro offerings to meet the needs of a substantial number of consumers.

    Really, having to add an external GPU to the Mini in order to get acceptable performance is a flaw of the Intel version that Apple seems to have addressed with the M1 version. Given a choice between a Mini that does not allow for a graphics upgrade yet serves up sufficient graphics muscle and one that can be upgraded yet comes with poor graphics performance baked in, the former clearly is an improvement over the latter. Sure, the graphics performance on the base M1 Mini might be a drawback for some, but a Pro version of the Mini serving up even more performance would go a long way towards satisfying a good many consumers, and at a cost that would be an improvement over having to pay for an external GPU to plug into the Mini. As well, my guess is that Apple would sell a ton of the beefed-up Mini to consumers who don’t even need the extra horsepower yet want it anyway. The only way this approach would fail would be if the resulting performance was too anemic to truly meet the needs of the buyer. Based on a good many reviews to date, it appears that the level of performance of even the just-released M1 Mini is good enough to please a large percentage of Mini buyers. If the machine comfortably handles what you regularly throw at it, upgradability is rendered irrelevant.

    The problem I see is that if you think an eGPU is the right solution, then it is very likely that you purchased the wrong hardware in the first place. It perplexes me to no end that somebody would buy a Mac Mini and then attach an eGPU to it. I can almost understand it with a laptop, especially if the eGPU box can also fill the role of a dock, but why do that with a desktop? Makes no sense to me.

    The only argument that really holds with me is that Apple doesn't make a mid-range desktop machine. That sorta makes sense, as Apple has this silly belief that the iMac fits every need on the desktop. In my case, when Apple doesn't have what I need I just build a Linux box that fits the purpose. That way I don't have to compromise with hardware that only marginally meets a need.
  • Reply 29 of 50
    wizard69 Posts: 13,377 member
    wizard69 said:
    blastdoor said:
    The way Apple has been talking about the benefits of an integrated GPU and UMA makes me really question what eGPU support will look like if it happens at all. The points Apple makes about the value of integration and UMA strike me as much more than just marketing -- they are thoughtful, substantive points. It really is a big deal to have as a fundamental assumption of the entire hardware and software stack that there is one unified pool of memory that all execution units can access equally. 

    If Apple offers support for eGPUs (or PCIe GPUs in a Mac Pro), I wonder if it will be only for compute purposes, not for display. And I further wonder if it will be a specialized, separate kind of thing, sort of like how Afterburner is in the current Mac Pro. In other words, it would be expensive, specialized hardware for a narrow purpose. It would not be a GPU that is in any way on equal footing with the integrated GPU.

    First, there will always be a need for higher-performance graphics. I really can't see anybody denying this, but the need will diminish quickly for mainstream users.

    Second, the value of UMA has been known for some time. The advantage Apple has here is that they are all on board with it. AMD was promoting such systems a few years ago, but it never really caught on at MS. One big advantage is that your GPU can be used much more, as the copy penalty doesn't show up, so even small routines can be accelerated on a GPU. UMA is a good technology, but it isn't a miracle. It will be very interesting to see how Apple approaches this on the high-performance machines, because some tasks simply need more RAM.

    Personally, I don't see a good reason for Apple to support eGPUs. I know that rubs a lot of people the wrong way; so be it, I see it as a waste of time. It would be a waste of time on new Intel machines too, so this has little to do with the M1. In a nutshell: should Apple deploy massive resources to serve 0.00001% of its market?
    I guarantee that there are more than ten eGPUs connected to Macs today. There are at least a thousand in the hands of the DOD that I am aware of, and that itself may be enough to justify future support. There will always be a demand for more. In this case, more performance than the integrated GPU provides. A Vega 56 outperforms the M1's existing GPU.

    Given that the existing Mac Pro (and presumably, the next generation one) will have PCI-E, I'm not sure I agree with your "massive resources" assessment.

    1,000 isn't a lot of systems, and yes, there are likely more elsewhere. The problem is, I've yet to see any indication sales-wise that eGPUs have been a massive success for the vendors selling them. As for the Mac Pro replacement, I'm not sure what Apple will do here. PCI-E is highly likely, but that doesn't mean they will support GPUs in the slots. I would hope so, but I'm not making the decisions at Apple.

    There has been a lot of noise in the community about Apple dropping support for GPUs in favor of their in-house designs. I'm not sure I believe those rumors, but stranger things have happened. The trick for Apple would be to get GPU performance similar to current-generation Nvidia and AMD GPUs. I say similar because Apple is obviously using the GPU differently and is relying on hardware encoders/decoders to a greater extent than can currently be done on Intel/AMD hardware. That is to say, the GPU cores are not everything, and all Apple needs to accomplish here is "good enough."

    It will be very interesting to see what the new machines in 2021 look like. Right now I'm extremely pleased with my MBA buy, and frankly didn't think I would be buying such a Mac anytime soon. One driver for that purchase was simply the ARM hardware onboard and Apple being the first to offer it. In my case the purchase is more about being a better machine than an iPad for my travel needs. The MBA with M1 has that down pat. Could I want more? Certainly, but I don't see an eGPU in my future.
  • Reply 30 of 50
    wizard69 said:


    The problem I see is that if you think an eGPU is the right solution, then it is very likely that you purchased the wrong hardware in the first place. It perplexes me to no end that somebody would buy a Mac Mini and then attach an eGPU to it. I can almost understand it with a laptop, especially if the eGPU box can also fill the role of a dock, but why do that with a desktop? Makes no sense to me.

    The only argument that really holds with me is that Apple doesn't make a mid-range desktop machine. That sorta makes sense, as Apple has this silly belief that the iMac fits every need on the desktop. In my case, when Apple doesn't have what I need I just build a Linux box that fits the purpose. That way I don't have to compromise with hardware that only marginally meets a need.
    Clearly, it makes more sense for Apple to go back to offering a mid-range desktop machine rather than compromise the unified approach that allows the M1 to deliver some rather surprising performance. My guess is that with the direction it’s going in with its own processor technology, delivering that mid-range desktop option would be easier than ever. To me, there’s profit to be made in offering an upgrade option to the base Mini, because consumers who really don’t need the extra power will spend more to acquire it anyway. I’m guilty of opting for overkill on more than a few occasions. It’s less about how many customers need more performance than the new M1 Mini offers and more about how many customers would gladly pay more for a more powerful option. There is a difference.
  • Reply 31 of 50
    melgross Posts: 33,600 member
    wizard69 said:
    melgross said:
    blastdoor said:
    The way Apple has been talking about the benefits of an integrated GPU and UMA makes me really question what eGPU support will look like if it happens at all. The points Apple makes about the value of integration and UMA strike me as much more than just marketing -- they are thoughtful, substantive points. It really is a big deal to have as a fundamental assumption of the entire hardware and software stack that there is one unified pool of memory that all execution units can access equally. 

    If Apple offers support for eGPUs (or PCIe GPUs in a Mac Pro), I wonder if it will be only for compute purposes, not for display. And I further wonder if it will be a specialized, separate kind of thing, sort of like how Afterburner is in the current Mac Pro. In other words, it would be expensive, specialized hardware for a narrow purpose. It would not be a GPU that is in any way on equal footing with the integrated GPU.
    Apple has surprised us with this, and I think that if they wanted to support external GPUs, they could do it differently. They could somehow extend the fabric support that the built-in GPUs use now to support an external GPU. They might have to buffer them, which would add some minor latency, but it could work. On the other hand, they could come out with their own GPU module. We see how that tiny bit of silicon can have good performance. What if they have a small plug-in module with another 16 cores, or so, and have that plug into an extended substrate to just add to the cores now there? It could be small, contain a small heat sink of its own, and use just another 20 watts, or so.

    Actually, they could do this with RAM easily, as it’s on the substrate now, in two separate modules. Extend the substrate outside of the package and pop two more modules in. They’re very small.

    It is interesting that you brought up fabric support, because that immediately had me thinking of AMD's fabric implementation. In the case of AMD, what I'd love to see from them is a fabric interface to the CDNA architecture. Now, could Apple do something similar with an off-die GPU? They certainly could, and they also have the option of adopting AMD's fabric interface. This would be fantastic in a Mac Pro-like machine, especially if a fabric-interfaced CDNA chip were supplied with every machine. You would see some delays going off-chip, but overall it would turn the Mac Pro into a supercomputer of sorts. By the way, yes, I know this has little to do with the GPU side of things; however, it would be in Apple's best interest to go this route. Right now the Mac Pro is embarrassingly slow when put up against an AMD Threadripper system. While Apple has fast GPUs, they currently can't really compete against the CDNA accelerators.
    Apparently Apple invented their own fabric implementation. We don’t know all that much about it yet, though AnandTech was able to glean some info about it. Apple of course has an accelerator for ProRes, and is working with Red to enable at least one of the two Red RAW formats. It’s completely programmable and reconfigurable, so it could be turned into most anything with software.

    When I think about the size of the GPU section of the M1, and see how small it is, I can imagine it being replicated outside the chip in multiples. It just needs to be connected to the fabric the same way the GPU on the die is. I can’t say it’s definitely possible, but looking at it, it seems possible. And as I mentioned earlier, RAM is far easier, as it’s already on the substrate and connected to the fabric. There’s really no difference, other than perhaps a small latency hit, but it would be worth it for the gain in performance. The latency between the CPU, memory, and GPU in what I’m going to think of as “classic” designs is far worse.
  • Reply 32 of 50
    melgross Posts: 33,600 member

    CarmB said:
    wizard69 said:


    The problem I see is that if you think an eGPU is the right solution, then it is very likely that you purchased the wrong hardware in the first place. It perplexes me to no end that somebody would buy a Mac Mini and then attach an eGPU to it. I can almost understand it with a laptop, especially if the eGPU box can also fill the role of a dock, but why do that with a desktop? Makes no sense to me.

    The only argument that really holds with me is that Apple doesn't make a mid-range desktop machine. That sorta makes sense, as Apple has this silly belief that the iMac fits every need on the desktop. In my case, when Apple doesn't have what I need I just build a Linux box that fits the purpose. That way I don't have to compromise with hardware that only marginally meets a need.
    Clearly, it makes more sense for Apple to go back to offering a mid-range desktop machine rather than compromise the unified approach that allows the M1 to deliver some rather surprising performance. My guess is that with the direction it’s going in with its own processor technology, delivering that mid-range desktop option would be easier than ever. To me, there’s profit to be made in offering an upgrade option to the base Mini, because consumers who really don’t need the extra power will spend more to acquire it anyway. I’m guilty of opting for overkill on more than a few occasions. It’s less about how many customers need more performance than the new M1 Mini offers and more about how many customers would gladly pay more for a more powerful option. There is a difference.
    It’s not that simple. We don’t know how far, or in what direction(s), Apple has designed this concept to expand. This first, least expensive chip is likely the simplest one Apple will make.
  • Reply 33 of 50
    Supporting eGPUs here is not just a matter of writing a graphics driver. The SoC is so integrated that its components are inherently designed to work together. If you add an eGPU, you are essentially ‘breaking’ this concept.

    Only Apple knows for real, but my guess is that Apple is working on powerful alternatives as part of their SoC roadmap for 2021 that simply remove the desire for eGPUs for the vast majority of users, except maybe at the very high end, which for now means sticking multiple $3,000 GPUs in the Mac Pro.

    One thing they need to address is more recent advancements like hardware-accelerated ray tracing. They also need to make sure to have at least one solution for high-end needs around the 10-teraflop range.
    I think they’ll probably end up with a 4-5 teraflop solution at the mid-range and 8-10 teraflops at the high end.
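
    For scale, the usual fused-multiply-add throughput formula applied to the M1's published GPU configuration (8 cores × 128 ALUs at roughly 1.28 GHz) reproduces its ~2.6-teraflop figure; scaling it linearly is my assumption, not a confirmed roadmap, but it lands right in those ranges:

    ```latex
    \mathrm{FLOPS_{FP32}} \approx 2 \times N_{\mathrm{ALU}} \times f
    \quad\Rightarrow\quad
    2 \times (8 \times 128) \times 1.278\,\mathrm{GHz} \approx 2.6\ \mathrm{TFLOPS}
    ```

    By the same arithmetic, a 16-core GPU would sit near 5.2 teraflops and a 32-core one near 10.5.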
  • Reply 34 of 50
    I keep thinking that the third-party GPU is over, at all levels.
  • Reply 35 of 50
    The M1x chip will beat all but the 24/28-core models with whatever GPUs they have. Then the last chip, for the Mac Pro, will end Intel/AMD superiority.
  • Reply 36 of 50
    melgross said:

    CarmB said:
    wizard69 said:


    The problem I see is that if you think an eGPU is the right solution, then it is very likely that you purchased the wrong hardware in the first place. It perplexes me to no end that somebody would buy a Mac Mini and then attach an eGPU to it. I can almost understand it with a laptop, especially if the eGPU box can also fill the role of a dock, but why do that with a desktop? Makes no sense to me.

    The only argument that really holds with me is that Apple doesn't make a mid-range desktop machine. That sorta makes sense, as Apple has this silly belief that the iMac fits every need on the desktop. In my case, when Apple doesn't have what I need I just build a Linux box that fits the purpose. That way I don't have to compromise with hardware that only marginally meets a need.
    Clearly, it makes more sense for Apple to go back to offering a mid-range desktop machine rather than compromise the unified approach that allows the M1 to deliver some rather surprising performance. My guess is that with the direction it’s going in with its own processor technology, delivering that mid-range desktop option would be easier than ever. To me, there’s profit to be made in offering an upgrade option to the base Mini, because consumers who really don’t need the extra power will spend more to acquire it anyway. I’m guilty of opting for overkill on more than a few occasions. It’s less about how many customers need more performance than the new M1 Mini offers and more about how many customers would gladly pay more for a more powerful option. There is a difference.
    It’s not that simple. We don’t know how far, or in what direction(s), Apple has designed this concept to expand. This first, least expensive chip is likely the simplest one Apple will make.
    Apple plays the long game. As such, it’s pretty much impossible that the M1 is where this ends. If it were, you can be guaranteed that Apple would not have committed to transitioning away from Intel. It’s a safe bet that Apple will advance its hardware pretty much annually, as will the rest of the industry. Apple has nothing to prove in regards to its ability to do this sort of thing. Certainly Apple would not leave to chance its ability to take this where it needs to go for quite a few years out. It’s not as if they just decided to start transitioning away from Intel and then play it day by day, hoping it works out. They’ve planned this out, years in advance. They always do.
  • Reply 37 of 50
    melgross said:

    CarmB said:
    wizard69 said:


    The problem I see is that if you think an eGPU is the right solution, then it is very likely that you purchased the wrong hardware in the first place. It perplexes me to no end that somebody would buy a Mac Mini and then attach an eGPU to it. I can almost understand it with a laptop, especially if the eGPU box can also fill the role of a dock, but why do that with a desktop? Makes no sense to me.

    The only argument that really holds with me is that Apple doesn't make a mid-range desktop machine. That sorta makes sense, as Apple has this silly belief that the iMac fits every need on the desktop. In my case, when Apple doesn't have what I need I just build a Linux box that fits the purpose. That way I don't have to compromise with hardware that only marginally meets a need.
    Clearly, it makes more sense for Apple to go back to offering a mid-range desktop machine rather than compromise the unified approach that allows the M1 to deliver some rather surprising performance. My guess is that with the direction it’s going in with its own processor technology, delivering that mid-range desktop option would be easier than ever. To me, there’s profit to be made in offering an upgrade option to the base Mini, because consumers who really don’t need the extra power will spend more to acquire it anyway. I’m guilty of opting for overkill on more than a few occasions. It’s less about how many customers need more performance than the new M1 Mini offers and more about how many customers would gladly pay more for a more powerful option. There is a difference.
    It’s not that simple. We don’t know how far, or in what direction(s), Apple has designed this concept to expand. This first, least expensive chip is likely the simplest one Apple will make.
    If the basic concept has been demonstrated to work beyond Apple’s wildest expectations, it’s obvious that Apple will go on to develop more powerful variations to offer to more demanding users. It’s not as if Apple is likely to develop a different operating system and a different basic approach to system design to deliver more performance. Rather, it’s going to expand on the starting point that is the M1. The more capable variants will use the same software, so the basic architecture will likely be employed across the product range, since the software is optimized for a particular system design approach. To then veer off and alter the basic approach, an approach that seems to be rather effective, makes no sense. It’s also improbable that, as it launches the M1, Apple is in danger of running out of runway. Apple plays the long game, always.
  • Reply 38 of 50
    melgross Posts: 33,600 member
    CarmB said:
    melgross said:

    CarmB said:
    wizard69 said:


    The problem I see is that if you think an eGPU is the right solution, then it is very likely that you purchased the wrong hardware in the first place. It perplexes me to no end that somebody would buy a Mac Mini and then attach an eGPU to it. I can almost understand it with a laptop, especially if the eGPU box can also fill the role of a dock, but why do that with a desktop? Makes no sense to me.

    The only argument that really holds with me is that Apple doesn't make a mid-range desktop machine. That sorta makes sense, as Apple has this silly belief that the iMac fits every need on the desktop. In my case, when Apple doesn't have what I need I just build a Linux box that fits the purpose. That way I don't have to compromise with hardware that only marginally meets a need.
    Clearly, it makes more sense for Apple to go back to offering a mid-range desktop machine rather than compromise the unified approach that allows the M1 to deliver some rather surprising performance. My guess is that with the direction it’s going in with its own processor technology, delivering that mid-range desktop option would be easier than ever. To me, there’s profit to be made in offering an upgrade option to the base Mini, because consumers who really don’t need the extra power will spend more to acquire it anyway. I’m guilty of opting for overkill on more than a few occasions. It’s less about how many customers need more performance than the new M1 Mini offers and more about how many customers would gladly pay more for a more powerful option. There is a difference.
    It’s not that simple. We don’t know how far, or in what direction(s), Apple has designed this concept to expand. This first, least expensive chip is likely the simplest one Apple will make.
    If the basic concept has been demonstrated to work beyond Apple’s wildest expectations, it’s obvious that Apple will go on to develop more powerful variations to offer to more demanding users. It’s not as if Apple is likely to develop a different operating system and a different basic approach to system design to deliver more performance. Rather, it’s going to expand on the starting point that is the M1. The more capable variants will use the same software, so the basic architecture will likely be employed across the product range, since the software is optimized for a particular system design approach. To then veer off and alter the basic approach, an approach that seems to be rather effective, makes no sense. It’s also improbable that, as it launches the M1, Apple is in danger of running out of runway. Apple plays the long game, always.
    I don’t see Apple spending a lot of time and resources on developing IP here that isn’t extensible. It can’t be so rigidly designed that Apple would have to redesign the architecture every time they need a more powerful and capable machine.
  • Reply 39 of 50
    Supporting eGPUs here is not just a matter of writing a graphics driver. The SoC is so integrated that its components are inherently designed to work together. If you add an eGPU, you are essentially ‘breaking’ this concept.

    I completely agree. Third-party graphics cards seem to make less sense the more I think about it. The Intel chip isn't the only barn burner in the thermal envelope, and a third-party card would make the unified memory somewhat ineffective. The only model I could see retaining third-party card support is the Mac Pro, since it has PCIe slots. If Apple were to support an external GPU, it makes more sense for them to make it themselves, if they are going to replace third-party cards in the laptops and the iMac. Plus, they could provide rock-solid driver support.

    I still suspect there will only be three chips: the M1, M1x, and M1xe. It's the fastest and simplest route to completion in the next 18 months. If you do the math (a rough sketch follows the list below), a 16-core M1x should match or beat a 16-core Xeon in the Mac Pro, leaving only the 24- and 28-core models unmatched while waiting on the M1xe. Using the exact same chip in the laptops and desktops makes a lot of sense, since they can simply provide more wattage and cooling to add value for greater sustained performance. Then when they get around to the M2 they'll be able to provide more tiers, like a $599 Mac Mini and an $899 MBA. They've had those price points before, but they are always designed to up-sell.

    M1 = Low-end laptops, low-end desktops, and the Mac Mini (4+4 cores, with 8GB to 16GB of memory)
    M1x = High-end laptops and high-end desktops (8+8 cores, with 16GB to 128GB of memory)
    M1xe = Mac Pro (???)
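
    For what it's worth, here is the back-of-the-envelope math behind that 16-core claim. Every input is an assumption: approximate Geekbench 5 multi-core figures as publicly listed in late 2020, a guessed performance-core share, and a guessed scaling efficiency; "M1x" is this post's own hypothetical name.

    ```swift
    // All inputs are assumptions, not measurements of real chips.
    let m1Multi = 7_500.0            // M1 (4P + 4E), approximate GB5 multi-core
    let perfShare = 0.85             // assume P-cores contribute ~85% of that
    let perPCore = m1Multi * perfShare / 4.0

    let xeon16 = 17_000.0            // Mac Pro 16-core Xeon W, approximate GB5
    let m1x16 = perPCore * 16 * 0.9  // hypothetical 16 P-core "M1x", ~90% scaling

    print(m1x16, m1x16 > xeon16)     // ~22950.0 true
    ```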
  • Reply 40 of 50

    melgross said:
    wizard69 said:
    melgross said:
    blastdoor said:
    The way Apple has been talking about the benefits of an integrated GPU and UMA makes me really question what eGPU support will look like if it happens at all. The points Apple makes about the value of integration and UMA strike me as much more than just marketing -- they are thoughtful, substantive points. It really is a big deal to have as a fundamental assumption of the entire hardware and software stack that there is one unified pool of memory that all execution units can access equally. 

    If Apple offers support for eGPUs (or PCIe GPUs in a Mac Pro), I wonder if it will be only for compute purposes, not for display. And I further wonder if it will be a specialized, separate kind of thing, sort of like how Afterburner is in the current Mac Pro. In other words, it would be expensive, specialized hardware for a narrow purpose. It would not be a GPU that is in any way on equal footing with the integrated GPU.
    Apple has surprised us with this, and I think that if they wanted to support external GPUs, they could do it differently. They could somehow extend the fabric support that the built-in GPUs use now to support an external GPU. They might have to buffer them, which would add some minor latency, but it could work. On the other hand, they could come out with their own GPU module. We see how that tiny bit of silicon can have good performance. What if they have a small plug-in module with another 16 cores, or so, and have that plug into an extended substrate to just add to the cores now there? It could be small, contain a small heat sink of its own, and use just another 20 watts, or so.

    Actually, they could do this with RAM easily, as it’s on the substrate now, in two separate modules. Extend the substrate outside of the package and pop two more modules in. They’re very small.

    It is interesting that you brought up fabric support, because that immediately had me thinking of AMD's fabric implementation. In the case of AMD, what I'd love to see from them is a fabric interface to the CDNA architecture. Now, could Apple do something similar with an off-die GPU? They certainly could, and they also have the option of adopting AMD's fabric interface. This would be fantastic in a Mac Pro-like machine, especially if a fabric-interfaced CDNA chip were supplied with every machine. You would see some delays going off-chip, but overall it would turn the Mac Pro into a supercomputer of sorts. By the way, yes, I know this has little to do with the GPU side of things; however, it would be in Apple's best interest to go this route. Right now the Mac Pro is embarrassingly slow when put up against an AMD Threadripper system. While Apple has fast GPUs, they currently can't really compete against the CDNA accelerators.
    Apparently Apple invented their own fabric implementation. We don’t know all that much about it yet, though AnandTech was able to glean some info about it. Apple of course has an accelerator for ProRes, and is working with Red to enable at least one of the two Red RAW formats. It’s completely programmable and reconfigurable, so it could be turned into most anything with software.

    When I think about the size of the GPU section of the M1, and see how small it is, I can imagine it being replicated outside the chip in multiples. It just needs to be connected to the fabric the same way the GPU on the die is. I can’t say it’s definitely possible, but looking at it, it seems possible. And as I mentioned earlier, RAM is far easier, as it’s already on the substrate and connected to the fabric. There’s really no difference, other than perhaps a small latency hit, but it would be worth it for the gain in performance. The latency between the CPU, memory, and GPU in what I’m going to think of as “classic” designs is far worse.
    After seeing some of the real-world reviews of the M1 performing tasks like playing back 4K/8K RAW video with filters in real time, Intel and AMD may be forced to rethink the whole PC component model. It's hard to see how they can compete when a $60 M1 can challenge an 8-core i9 with a Radeon RX 560 or better. Price-wise, I don't think you can get a PC laptop like that for the price of an M1 MBP, and even if you did, the battery life would be horrible unless the laptop were an inch thick for a bigger battery, with noisy fans to boot.

    I wouldn't be surprised if Intel and AMD copy Apple's design. BYO people are not going to like that, but it's hard to see any other outcome at this point.