Video: Nvidia support was abandoned in macOS Mojave, and here's why

Comments

  • Reply 21 of 42
    The abandonment of Nvidia is a two-way street. All those creative professionals who require Nvidia GPUs to run CUDA-based applications are being forced to abandon the Mac platform. Windows 10 is pretty usable nowadays. macOS has remained pretty much the same for the past decade or so, and Windows has caught up. Many still prefer the Mac, but once they are forced off the platform, it is going to be really hard for Apple to win them back. On Windows they can use whatever PC build they need. They can tailor the RAM, storage, CPU(s), and GPU(s) to their needs. They can use a touch screen if they want. They can do all of this at a fraction of the cost of a "Pro" Macintosh. Once you go Windows, you don't go back.
    So why are you here?
    Because I still like iOS devices. They run rings around Android because they fully support native code compilation.
  • Reply 22 of 42
    Nvidia's statement is that Apple controls who can make drivers for Mojave. Is that really the case? I'm pretty sure any hardware developer can develop drivers for the Mac, get them signed with a developer certificate, and then package them up into an installer if Apple doesn't want to include them in the operating system. Has something changed with Mojave that isn't being explained here or in the first article?
  • Reply 23 of 42
    Nvidia could have always released the drivers as a beta... it is really annoying...
  • Reply 24 of 42
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    Whatever "technology" is used (and iOS device GPUs have nothing to do with ARM core technology), it's all about the languages and APIs they use: OpenGL, Metal, Vulkan, or DirectX. So it's going to "translate" into GPUs for Macs quite easily.
  • Reply 25 of 42
    Mike Wuerthele Posts: 6,858, administrator
    Nvidia's statement is that Apple controls who can make drivers for Mojave. Is that really the case? I'm pretty sure any hardware developer can develop drivers for the Mac, get them signed with a developer certificate, and then package them up into an installer if Apple doesn't want to include them in the operating system. Has something changed with Mojave that isn't being explained here or in the first article?
    Drivers close to the iron, like video cards, now need to be explicitly signed by Apple. It isn't the same process as for an app or a printer driver. This was talked about in the first piece about the matter, linked in the editor's note section.
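
    For anyone who wants to poke at this on their own machine, here is a minimal sketch of how you might inspect a graphics kext's signature with Python wrapping the standard codesign and kextstat tools. The kext path is a hypothetical example, and the script only reports what's there; it cannot work around Mojave's requirement that low-level drivers be signed off by Apple.
```python
#!/usr/bin/env python3
"""Minimal sketch: report how a third-party graphics kext is signed on macOS.

The kext path below is a hypothetical example; adjust it for your system.
This only inspects signatures -- it cannot bypass Mojave's requirement that
low-level drivers be explicitly signed/approved by Apple.
"""
import subprocess

KEXT_PATH = "/Library/Extensions/NVDAStartupWeb.kext"  # hypothetical example path


def run(cmd):
    """Run a command and return combined stdout/stderr, even on failure."""
    result = subprocess.run(cmd, capture_output=True, text=True)
    return (result.stdout + result.stderr).strip()


# Show the signing authority chain for the kext bundle.
print(run(["codesign", "--display", "--verbose=4", KEXT_PATH]))

# Verify the signature (any verification failure shows up in the output).
print(run(["codesign", "--verify", "--verbose", KEXT_PATH]))

# List currently loaded kexts that are not Apple's, for comparison.
non_apple = [line for line in run(["kextstat"]).splitlines() if "com.apple" not in line]
print("\n".join(non_apple))
```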
  • Reply 26 of 42
    davgreg Posts: 1,036, member
    The whole eGPU thing is interesting, but in its current form it is very overpriced and the performance is just not there.

    Take a walk over to the Apple website and take a look at the ratings given to the Blackmagic units. The eGPU that costs almost as much as a Mac mini is rated 2.5 stars: https://www.apple.com/shop/product/HM8Y2VC/A/blackmagic-egpu
    The one that costs as much as a MacBook Air has 3 stars.
    https://www.apple.com/shop/product/HMQT2/blackmagic-egpu-pro?fnode=4c

    Not exactly promising and very expensive.
  • Reply 27 of 42
    keithw Posts: 140, member
    dysamoria said:
    Is this relevant to a gamer who, hypothetically, wants to play Windows games on a Windows partition on their Mac? If you have an eGPU supported by Windows... DOES Windows support eGPU?
    It does, but poorly. I've had the most success booting to a non-accelerated display and playing the game on a second, accelerated one.

  • Reply 28 of 42
    keithw Posts: 140, member
    I have an iMac Pro, and the Nvidia eGPU works just fine under Windows 10. I have a 1080 Ti in the eGPU enclosure, and it consistently out-benchmarks the internal AMD chip on equivalent workloads. I just wish I didn't have to run Windows to get this functionality.
  • Reply 29 of 42
    sean911 Posts: 1, unconfirmed member
    I don't get it: if their relationship is so fractured, why wouldn't Nvidia just release an unsigned or dev version of the drivers? I run plenty of unsigned drivers, and when I eventually outgrow High Sierra, I'll probably just mod the HS drivers for Mojave so I can use CUDA if a current signed driver isn't available...
  • Reply 30 of 42
    Mike Wuerthele Posts: 6,858, administrator
    keithw said:
    I have an iMac Pro, and the Nvidia eGPU works just fine under Windows 10. I have a 1080 Ti in the eGPU enclosure, and it consistently out-benchmarks the internal AMD chip on equivalent workloads. I just wish I didn't have to run Windows to get this functionality.
    I wasn't commenting on speed or on Nvidia support in Windows. Windows 10's support for eGPUs in general is pretty bad.
  • Reply 31 of 42
    Mike Wuerthele Posts: 6,858, administrator

    sean911 said:
    I don't get it: if their relationship is so fractured, why wouldn't Nvidia just release an unsigned or dev version of the drivers? I run plenty of unsigned drivers, and when I eventually outgrow High Sierra, I'll probably just mod the HS drivers for Mojave so I can use CUDA if a current signed driver isn't available...
    You can run unsigned, low-level drivers on High Sierra; it's not a big deal. You can't on Mojave. There are other issues about differences in PCI-E identification in Mojave versus High Sierra, but the former issue is the big-ticket one.
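
    If you want to see the PCI identifiers your GPU actually reports (the information a driver has to match against), here's a small sketch using the standard system_profiler tool. The field names matched below assume its human-readable output format, so treat this as illustrative rather than an official API.
```python
#!/usr/bin/env python3
"""Sketch: print GPU vendor/device IDs as macOS reports them.

Relies on the standard `system_profiler` tool; the field names matched
below assume its human-readable output format, so this is illustrative
rather than an official API.
"""
import subprocess

report = subprocess.run(
    ["system_profiler", "SPDisplaysDataType"],
    capture_output=True, text=True, check=True,
).stdout

# Print the fields a GPU driver has to match on.
for line in report.splitlines():
    stripped = line.strip()
    if stripped.startswith(("Chipset Model:", "Vendor:", "Device ID:")):
        print(stripped)

# For reference: Nvidia's PCI vendor ID is 0x10de, AMD's is 0x1002.
```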
  • Reply 32 of 42
    Hi all, I'm new here :-).
    I must say that, IMO, this year Apple is going to release the new Mac Pro with AMD's 7nm "Threadripper" variant, but custom made to Apple's specs (8-channel memory / different clocks?), along with a 16GB/32GB Radeon VII.
    Apple has waited long enough for Intel, and with the up-to-64-core/128-thread 7nm Rome CPU there is finally a massive performance improvement that would justify releasing an all-new Mac Pro.
    On the mobile side, Apple will start moving to its own in-house-developed CPU/GPU.

    I am sure Apple monitors how many Hackintosh PCs are being built while there is no new Mac Pro, and Apple could make really nice profits on 32-core/48-core/64-core Mac Pros with a 16GB/32GB Vega VII.
    This new 64-core/128-thread Rome-based Mac Pro could deliver up to 4 times the processing power of a top-spec 18-core iMac Pro!

    Since Nvidia support was abandoned in macOS Mojave, that only makes this clearer.

  • Reply 33 of 42
    melgross Posts: 33,510, member
    mcdave said:
    ksec said:
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    I don't see any part of the A11 GPU developed by ARM. 
    The A12 and A11 chips use an ARMv8-A-compatible microarchitecture. https://en.wikipedia.org/wiki/Apple_A12 and https://en.wikipedia.org/wiki/Apple_A11
    The CPU ISA is developed by ARM and pretty much nothing else.  The GPU, ISP, Neural Engine and most other silicon is using Apple’s proprietary ISA (OK imgtec may disagree with that).
    There’s also a good chance that Apple has its own micro architecture which emulates Aarch64 (like Tegra does).  Perhaps one day this will also emulate x86/64 (leveraging a modem deal with Intel) as a migratory step to its own full ISA.
    In the meantime, yes an Apple GPU with a 30W envelope would smash both Nvidia & AMD in perf/watt. Oh and hopefully hardware RayIntersectors to reflect (sorry!) Metal2 too.
    It’s not that simple, though. In order to fit within a 30-watt envelope, there would need to be a massive chip redesign from the ground up. It would be a different chip, and we really don’t know how that would turn out, or what it would be.
  • Reply 34 of 42
    Mike Wuerthele Posts: 6,858, administrator
    HwGeek said:
    Hi all, I'm new here :-).
    I must say that, IMO, this year Apple is going to release the new Mac Pro with AMD's 7nm "Threadripper" variant, but custom made to Apple's specs (8-channel memory / different clocks?), along with a 16GB/32GB Radeon VII.
    Apple has waited long enough for Intel, and with the up-to-64-core/128-thread 7nm Rome CPU there is finally a massive performance improvement that would justify releasing an all-new Mac Pro.
    On the mobile side, Apple will start moving to its own in-house-developed CPU/GPU.

    I am sure Apple monitors how many Hackintosh PCs are being built while there is no new Mac Pro, and Apple could make really nice profits on 32-core/48-core/64-core Mac Pros with a 16GB/32GB Vega VII.
    This new 64-core/128-thread Rome-based Mac Pro could deliver up to 4 times the processing power of a top-spec 18-core iMac Pro!

    Since Nvidia support was abandoned in macOS Mojave, that only makes this clearer.

    I'd be completely on board with that, but I suspect a shift to an AMD CPU isn't going to happen. 
  • Reply 35 of 42
    mcdave Posts: 1,927, member
    melgross said:
    mcdave said:
    ksec said:
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    I don't see any part of the A11 GPU developed by ARM. 
    The A12 and A11 chips use an ARMv8-A-compatible microarchitecture. https://en.wikipedia.org/wiki/Apple_A12 and https://en.wikipedia.org/wiki/Apple_A11
    The CPU ISA is developed by ARM and pretty much nothing else.  The GPU, ISP, Neural Engine and most other silicon is using Apple’s proprietary ISA (OK imgtec may disagree with that).
    There’s also a good chance that Apple has its own micro architecture which emulates Aarch64 (like Tegra does).  Perhaps one day this will also emulate x86/64 (leveraging a modem deal with Intel) as a migratory step to its own full ISA.
    In the meantime, yes an Apple GPU with a 30W envelope would smash both Nvidia & AMD in perf/watt. Oh and hopefully hardware RayIntersectors to reflect (sorry!) Metal2 too.
    It’s not that simple, though. In order to fit within a 30-watt envelope, there would need to be a massive chip redesign from the ground up. It would be a different chip, and we really don’t know how that would turn out, or what it would be.
    It does assume they could scale the GPU beyond 16 clusters/cores and move to a wide memory architecture well beyond 128-bits, possibly HBM2.  Why do you think these changes are beyond possibility?
  • Reply 36 of 42
    I'd be completely on board with that, but I suspect a shift to an AMD CPU isn't going to happen. 
    If not, what could Apple use for their Mac Pro? Until Q2 2020, Intel isn't going to have a real 10nm competitor even to their current top Xeon lineup (the current 14nm+++ parts are much better than the first/second-iteration 10nm parts).
    Sorry for going OT.

    P.S. It's interesting why Nvidia's CEO could not convince Apple to use their products; there must be a better reason why Apple chose to prevent their Mac users from using any Quadro/Titan/RTX cards rather than just bad blood between them.
  • Reply 37 of 42
    melgross Posts: 33,510, member
    mcdave said:
    melgross said:
    mcdave said:
    ksec said:
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    I don't see any part of the A11 GPU developed by ARM. 
    The A12 and A11 chips use an ARMv8-A-compatible microarchitecture. https://en.wikipedia.org/wiki/Apple_A12 and https://en.wikipedia.org/wiki/Apple_A11
    The CPU ISA is developed by ARM and pretty much nothing else.  The GPU, ISP, Neural Engine and most other silicon is using Apple’s proprietary ISA (OK imgtec may disagree with that).
    There’s also a good chance that Apple has its own micro architecture which emulates Aarch64 (like Tegra does).  Perhaps one day this will also emulate x86/64 (leveraging a modem deal with Intel) as a migratory step to its own full ISA.
    In the meantime, yes an Apple GPU with a 30W envelope would smash both Nvidia & AMD in perf/watt. Oh and hopefully hardware RayIntersectors to reflect (sorry!) Metal2 too.
    It’s not that simple, though. In order to fit within a 30-watt envelope, there would need to be a massive chip redesign from the ground up. It would be a different chip, and we really don’t know how that would turn out, or what it would be.
    It does assume they could scale the GPU beyond 16 clusters/cores and move to a wide memory architecture well beyond 128-bits, possibly HBM2.  Why do you think these changes are beyond possibility?
    I’m not saying that. I’m saying that low power chips are very different from high power chips. Even you are changing what the chip is in this comment. So you realize that you can’t simply take a chip and pump ten, or more, times as much power in and expect it won’t melt down. Making just the changes you suggest results in an entirely different chip.
  • Reply 38 of 42
    mcdave Posts: 1,927, member
    melgross said:
    mcdave said:
    melgross said:
    mcdave said:
    ksec said:
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    I don't see any part of the A11 GPU developed by ARM. 
    The A12 and A11 chips use an ARMv8-A-compatible microarchitecture. https://en.wikipedia.org/wiki/Apple_A12 and https://en.wikipedia.org/wiki/Apple_A11
    The CPU ISA is developed by ARM and pretty much nothing else.  The GPU, ISP, Neural Engine and most other silicon is using Apple’s proprietary ISA (OK imgtec may disagree with that).
    There’s also a good chance that Apple has its own micro architecture which emulates Aarch64 (like Tegra does).  Perhaps one day this will also emulate x86/64 (leveraging a modem deal with Intel) as a migratory step to its own full ISA.
    In the meantime, yes an Apple GPU with a 30W envelope would smash both Nvidia & AMD in perf/watt. Oh and hopefully hardware RayIntersectors to reflect (sorry!) Metal2 too.
    It’s not that simple, though. In order to fit within a 30-watt envelope, there would need to be a massive chip redesign from the ground up. It would be a different chip, and we really don’t know how that would turn out, or what it would be.
    It does assume they could scale the GPU beyond 16 clusters/cores and move to a wide memory architecture well beyond 128-bits, possibly HBM2.  Why do you think these changes are beyond possibility?
    I’m not saying that. I’m saying that low power chips are very different from high power chips. Even you are changing what the chip is in this comment. So you realize that you can’t simply take a chip and pump ten, or more, times as much power in and expect it won’t melt down. Making just the changes you suggest results in an entirely different chip.
    My thoughts were that they would use the power budget to significantly increase the GPU core/cluster count. My understanding is that typical GPU workloads scale well across ‘cores’, so the return should be good as long as the supporting architecture is in place. The original PowerVR reference designs maxed out at 16 clusters, but I’m sure the years of effort could break that limit. Their 7-core GPU on the A12X is matching Radeon Pro 555 performance with a few watts of power; 16 cores would match Vega 20 and 32 cores should match Vega 56.

    Extrapolating AnandTech’s A12X commentary, at full tilt it never broke 8 watts; a 64-watt budget could theoretically give us 32 performance cores + 64 GPU cores, though there would be sacrifices for wide memory.
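
    Just to make the back-of-the-envelope math explicit, here is a quick sketch of that linear extrapolation. All of the figures are the assumptions from the post above (roughly 8 W at full tilt, 7 GPU cores and 4 performance cores on the A12X), not measurements, and linear scaling is only a ceiling.
```python
#!/usr/bin/env python3
"""Back-of-the-envelope scaling sketch for the estimate above.

All inputs are the assumptions from the forum post (A12X at roughly
8 W full tilt, 7 GPU cores), not measured values. Real chips don't
scale linearly: wider memory, interconnect, and thermals eat headroom.
"""

a12x_watts = 8.0      # assumed full-tilt package power of the A12X
a12x_gpu_cores = 7    # GPU core count of the A12X
a12x_perf_cores = 4   # big (performance) CPU cores of the A12X
budget_watts = 64.0   # hypothetical desktop-class power budget

# Naive linear scaling: how many A12X-sized blocks fit in the budget?
scale = budget_watts / a12x_watts
print(f"Power headroom: {scale:.0f}x")                                # 8x
print(f"Naive CPU performance cores: {scale * a12x_perf_cores:.0f}")  # 32
print(f"Naive GPU cores: {scale * a12x_gpu_cores:.0f}")               # 56 (the post rounds up to 64)
```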
  • Reply 39 of 42
    melgross Posts: 33,510, member
    mcdave said:
    melgross said:
    mcdave said:
    melgross said:
    mcdave said:
    ksec said:
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    I don't see any part of the A11 GPU developed by ARM. 
    The A12 and A11 chips use an ARMv8-A-compatible microarchitecture. https://en.wikipedia.org/wiki/Apple_A12 and https://en.wikipedia.org/wiki/Apple_A11
    The CPU ISA is developed by ARM and pretty much nothing else.  The GPU, ISP, Neural Engine and most other silicon is using Apple’s proprietary ISA (OK imgtec may disagree with that).
    There’s also a good chance that Apple has its own micro architecture which emulates Aarch64 (like Tegra does).  Perhaps one day this will also emulate x86/64 (leveraging a modem deal with Intel) as a migratory step to its own full ISA.
    In the meantime, yes an Apple GPU with a 30W envelope would smash both Nvidia & AMD in perf/watt. Oh and hopefully hardware RayIntersectors to reflect (sorry!) Metal2 too.
    It’s not that simple, though. In order to fit within a 30-watt envelope, there would need to be a massive chip redesign from the ground up. It would be a different chip, and we really don’t know how that would turn out, or what it would be.
    It does assume they could scale the GPU beyond 16 clusters/cores and move to a wide memory architecture well beyond 128-bits, possibly HBM2.  Why do you think these changes are beyond possibility?
    I’m not saying that. I’m saying that low power chips are very different from high power chips. Even you are changing what the chip is in this comment. So you realize that you can’t simply take a chip and pump ten, or more, times as much power in and expect it won’t melt down. Making just the changes you suggest results in an entirely different chip.
    My thoughts were that they would use the power budget to significantly increase the GPU core/cluster count. My understanding is that typical GPU workloads scale well across ‘cores’, so the return should be good as long as the supporting architecture is in place. The original PowerVR reference designs maxed out at 16 clusters, but I’m sure the years of effort could break that limit. Their 7-core GPU on the A12X is matching Radeon Pro 555 performance with a few watts of power; 16 cores would match Vega 20 and 32 cores should match Vega 56.

    Extrapolating AnandTech’s A12X commentary, at full tilt it never broke 8 watts; a 64-watt budget could theoretically give us 32 performance cores + 64 GPU cores, though there would be sacrifices for wide memory.
    I understand what you’re saying. But the chip can’t simply be added to. You can’t just multiply elements. The chip is of a certain size; to get where you want would result in a chip with several times the area. That would be the least of the problems, but it would increase the cost by a very large factor.

    Apple's SoCs are fairly small, and have a major performance-per-watt advantage over x86 with integrated graphics. I'd like to see Apple make a standalone GPU for the Mac line too. Very much so, in fact. But this would have to be designed almost from scratch. I wouldn't be surprised if Apple has been working on that for some time, particularly since they left Imagination. Most people don't know that some time ago, Apple bought a small company that designed GPUs. Nothing ever came of it, so it was forgotten. But maybe that design work is now bearing fruit in Apple's GPU in the SoC.
  • Reply 40 of 42
    Ali Noori Posts: 1, unconfirmed member
    Apple is out of leaders. Tim is a salesman, J. Ive is, as usual, just a designer, and P. Schiller is a technical stitcher; they need a leader who combines these skills. As mentioned in the comments, CUDA is the leader in computational science, and Apple is missing that part today. CUDA is a powerful tool that is widely used in creative video production as well. It's not easy to leave a quite friendly, powerful OS once built by Steve J. and presented for just a few. I'm looking at a Windows/Linux alternative.