Video: Nvidia support was abandoned in macOS Mojave, and here's why

Posted in macOS
Apple's macOS Mojave is a great software update for most users. But it isn't for those using an Nvidia graphics card in their Mac Pro or in an external GPU enclosure, so let's talk about why.

Nvidia Titan Xp for a PCI-E Mac Pro, supported through High Sierra


Editor's note: This is a complex topic, and AppleInsider's YouTube audience asked for a condensed version of the editorial written in January about the topic. We aren't going to rehash the entire editorial here, but the transcript for the video follows.





The beauty of the modular Mac Pro, sold up until 2012, was that you could swap out the graphics card to keep the machine current with the latest graphics rendering technology and performance. But those who opted for Nvidia cards are now stuck on old macOS releases, and that can be infuriating.

macOS Mojave dropped support for new Nvidia graphics drivers, except for a couple of chipsets included in still-supported Apple laptops, all of them outdated.

In the past couple of years, external GPUs have been on the rise, helping Macs with otherwise low graphics performance get a boost for things like video rendering and gaming. For example, the 2018 Mac Mini has serious performance potential with a 6-core i7 processor that outperforms even the best CPU in the 2018 MacBook Pro.

However, its diminutive size means it doesn't house a dedicated graphics card, so those who need more graphics performance have to resort to an eGPU. And with Nvidia drivers unsupported in macOS Mojave, those who already own Nvidia cards are out of luck.

So why is there no support for Nvidia drivers? What caused this and what can you do about it? We'll tell you what you can do in just a minute, but let's go back in time and see how Apple and Nvidia's relationship fell apart.

The first Mac to include an Nvidia graphics processor was released in 2001, but Apple was still using chips made by ATI, the company that AMD eventually acquired in 2006.

In 2004, the Apple Cinema Display was delayed, reportedly because of Nvidia's inability to produce the required graphics card, the GeForce 6800 Ultra DDL.

Then in 2008, Apple's MacBook Pro shipped with Nvidia graphics chips that revolutionized the MacBook line by taking over the functions of the Northbridge and Southbridge controllers alongside actual graphics rendering. That chipset work prompted Intel to file a lawsuit against Nvidia, making things a bit complicated for Apple.

Nvidia processor in a 2008 MacBook Pro


Not only that, but Apple had to admit that some 2008 MacBook Pros had faulty Nvidia processors, which led to a class-action lawsuit against Nvidia and lost profits for Apple due to MacBook Pro repairs.

Around the same time, the iPhone transformed the mobile computing market and meant phones now needed GPUs, and Apple decided to go with Samsung-built chips instead. Nvidia believed its patents also applied to mobile GPUs, so it filed patent infringement suits against Qualcomm and Samsung, trying to get them, and possibly Apple, to pay license fees.

In 2016, Apple said no to putting Nvidia processors in the 15-inch MacBook Pro and instead went with AMD, publicly citing performance-per-watt concerns.

Nvidia competitor AMD recently launched the Radeon VII, a high-performance 7-nanometer GPU that reportedly has drivers on the way for macOS Mojave


And now, in 2019, there aren't any functional drivers for modern Nvidia cards in Mojave at all. In October 2018, Nvidia issued a public statement saying that Apple fully controls drivers for macOS, and that it can't release a driver unless Apple approves it.

Basically, there's no giant technical limitation that makes macOS Mojave incompatible with Nvidia graphics cards. Someone at Apple simply doesn't want to support Nvidia drivers, possibly because of the troubled relationship between the two companies over the years.

For the longest time, Apple's professional apps were optimized for OpenCL, which AMD cards run efficiently, and not CUDA, the proprietary framework that Nvidia focuses on. Apple wants Apple apps to run better; it's as simple as that.

On an Apple support page for installing Mojave on older Mac Pros, Apple states that Mojave requires a graphics card that supports Metal. Within the list of compatible graphics cards, you'll see two legacy Nvidia cards and quite a few new options from AMD.
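
As a rough illustration of that Metal requirement, here's a minimal Swift sketch (ours, not from Apple's support page) that asks the Metal framework which GPUs the system can actually use for acceleration. Supported AMD cards, Intel integrated graphics and connected eGPUs all show up in this list; an Nvidia card with no Metal driver under Mojave simply won't.

    import Metal

    // List every Metal-capable GPU the system can see -- the integrated GPU,
    // any discrete card, and any eGPU connected over Thunderbolt 3.
    let devices = MTLCopyAllDevices()

    if devices.isEmpty {
        print("No Metal-capable GPU found; Mojave's graphics stack won't run here.")
    } else {
        for device in devices {
            // isRemovable is true for eGPUs (macOS 10.13 and later).
            print("\(device.name) -- low power: \(device.isLowPower), eGPU: \(device.isRemovable)")
        }
    }

Save that as something like check_metal.swift (the file name is just for illustration) and run it with "swift check_metal.swift" on a Mac running High Sierra or later.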

Blackmagic eGPU Pro beside a MacBook Pro


So, from the looks of it, until Nvidia and Apple both relent and decide to meet each other halfway, there won't be Nvidia support for eGPUs, or for any Mac Pro that may or may not have PCI-E slots. And with AMD's Vega 56 and 64 already supporting Metal, and Radeon VII support coming soon to macOS, it doesn't look like Apple is in a negotiating mood.

Further complicating matters, Apple is working on its own GPU technology. It's already in the iPhone, and it's likely just a matter of time until it makes its way to the Mac.

If you need Nvidia, you can always downgrade to High Sierra and hope that the two companies come to their senses, for users' sake.

Comments

  • Reply 1 of 42
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    edited February 2019 muthuk_vanalingam
  • Reply 2 of 42
    ksec Posts: 1,569 member
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    I don't see any part of the A11 GPU developed by ARM. 
    mcdave watto_cobra
  • Reply 3 of 42
    dysamoria Posts: 3,430 member
    Is this relevant to a gamer who, hypothetically, wants to play Windows games on a Windows partition on their Mac? If you have an eGPU supported by Windows... DOES Windows support eGPU?
  • Reply 4 of 42
    Apple seems to just use what's best for them at the time, and AMD has the better performance to power ratio at this time, and Nvidia hasn't offered Apple anything to convince them otherwise (price is still slightly higher on Nvidia).
    It's weird that Apple is trying to avoid the external GPU support side of it, though. Then again, they only added eGPU support less than a year ago, so it's possible they will in the next year, especially if they want the new Mac Pro to be completely modular, but I wouldn't bet on it.
    watto_cobra
  • Reply 5 of 42
    Mike Wuerthele Posts: 6,930 administrator
    dysamoria said:
    Is this relevant to a gamer who, hypothetically, wants to play Windows games on a Windows partition on their Mac? If you have an eGPU supported by Windows... DOES Windows support eGPU?
    It does, but poorly. I've had the most success booting to a non-accelerated display and playing the game on a second, accelerated one.
    watto_cobra
  • Reply 6 of 42
    The abandonment of NVIDIA is a two-way street. All those creative professionals who require NVIDIA GPUs to run CUDA-based applications are being forced to abandon the Mac platform. Windows 10 is pretty usable nowadays. Mac OS has remained pretty much the same for the past decade or so and Windows caught up. Many still prefer the Mac, but once they are forced off the platform, it is going to be really hard for Apple to win them back. On Windows they can use whatever PC build they need. They can tailor the RAM, storage, CPU(s), and GPU(s) to their needs. They can use a touch screen if they want. They can do all this at a fraction of the cost of a "Pro" Macintosh. Once you go Windows, you don't go back.
  • Reply 7 of 42
    jjo Posts: 4 member
    Apple seems to just use what's best for them at the time, and AMD has the better performance to power ratio at this time, and Nvidia hasn't offered Apple anything to convince them otherwise (price is still slightly higher on Nvidia).
    It's weird that Apple is trying to avoid the external GPU support side of it, though. Then again, they only added eGPU support less than a year ago, so it's possible they will in the next year, especially if they want the new Mac Pro to be completely modular, but I wouldn't bet on it.
    AMD definitely does not have the best performance per watt right now. Not by a long shot. This was true 10-12 years ago, right now it's not even close.
    derekcurrie
  • Reply 8 of 42
    ksec said:
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    I don't see any part of the A11 GPU developed by ARM. 
    The A12 and A11 chips use an ARMv8-A-compatible microarchitecture. https://en.wikipedia.org/wiki/Apple_A12 and https://en.wikipedia.org/wiki/Apple_A11
    edited February 2019
  • Reply 9 of 42
    tipoo Posts: 1,158 member
    Like it or not, CUDA rules scientific compute, while AMD picks up the stragglers with cheaper performance per dollar in OpenCL. 

    The Mac Pro not supporting Nvidia and CUDA would be a big miss, if it's going to be more than a high end Final Cut box. 
  • Reply 10 of 42
    ksec said:
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    I don't see any part of the A11 GPU developed by ARM. 
    I certainly do.
  • Reply 11 of 42
    I could have sworn there was some point in the mid-to-late 2000s where nVidia did a very reckless leak of an unreleased Apple laptop featuring a new nVidia chip, but googling "apple leak [anything]" reveals too much stuff to dig through… anyway, if my memory is correct, that's just about the dumbest move an Apple components supplier could make and certainly explains a lot.
    watto_cobra cornchip
  • Reply 12 of 42
    Mike Wuerthele Posts: 6,930 administrator
    Eric_WVGG said:
    I could have sworn there was some point in the mid-to-late 2000s where nVidia did a very reckless leak of an unreleased Apple laptop featuring a new nVidia chip, but googling "apple leak [anything]" reveals too much stuff to dig through… anyway, if my memory is correct, that's just about the dumbest move an Apple components supplier could make and certainly explains a lot.
    More of the history is covered in the first article about it, linked in the paragraph under the first image in this piece.
    Eric_WVGG watto_cobra cornchip
  • Reply 13 of 42
    That has become a particular sore spot for me: I love the Mac, and I really go out of my way not to touch lesser hardware. Trouble is, my research heavy lifting is done through Matlab, and Matlab GPGPU is CUDA only.

    As it stands, I can't enjoy the full benefits of an expensive toolbox from Mathworks, because of vendor policies.

    There is a lot of blame to share: Mathworks for only supporting CUDA, Apple for being strong-headed (sometimes), Nvidia for dropping the ball all those years ago (souring the relationship), and myself for being prejudiced about crapware...
    raulcristian watto_cobra
  • Reply 14 of 42
    sflocal Posts: 6,136 member
    They can do all this at a fraction of the cost of a "Pro" Macintosh. Once you go Windows, you don't go back.
    Wrong... I was in that Windows camp, tried a Mac, and never went back. I've supported Windows machines for many of my clients for decades, and to this day it reaffirms that I made the right choice.

    Sure, Macs suck at games, but then again... many Mac users actually use their machines to get actual work done.

    I've read many of your posts and know pretty much that you're simply here trolling.  So at least try doing a better job at it.
    foljs raulcristian watto_cobra
  • Reply 15 of 42
    The abandonment of NVIDIA is a two-way street. All those creative professionals who require NVIDIA GPUs to run CUDA-based applications are being forced to abandon the Mac platform. Windows 10 is pretty usable nowadays. Mac OS has remained pretty much the same for the past decade or so and Windows caught up. Many still prefer the Mac, but once they are forced off the platform, it is going to be really hard for Apple to win them back. On Windows they can use whatever PC build they need. They can tailor the RAM, storage, CPU(s), and GPU(s) to their needs. They can use a touch screen if they want. They can do all this at a fraction of the cost of a "Pro" Macintosh. Once you go Windows, you don't go back.
    So why are you here?
    raulcristian MacPro watto_cobra
  • Reply 16 of 42
    melgross Posts: 33,644 member
    Apple developed OpenCL and then gave the license to it away. It became a bit of a standard itself, and is actually very good, and AMD rushed to support it because they're in the trailing position. Nvidia hates to support anything it hasn't developed, and also refuses to license its own tech out to AMD. So that's a bit of a problem. But Apple's development of Metal has made both of these technologies less important, where Metal is supported.

    nvidia’s cards in some Apple iMacs have also failed over time. Both my wife and daughters 2008 iMacs failed because of the screwed up solder ball problem Nvidia had. I’m not so sure Apple lost much over this as Nvidia had to put over $500 million in escro for manufacturers doing repairs, or replacements, because of it. Apple wasn’t the only one hit by that.

    I still think it's a mistake by Apple not to support the drivers, though.
    edited February 2019 muthuk_vanalingam watto_cobra
  • Reply 17 of 42
    So the answer to the article’s title is - Because Apple says so?
  • Reply 18 of 42
    mcdave Posts: 1,927 member
    ksec said:
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    I don't see any part of the A11 GPU developed by ARM. 
    The A12 and A11 chips use an ARMv8-A-compatible microarchitecture. https://en.wikipedia.org/wiki/Apple_A12 and https://en.wikipedia.org/wiki/Apple_A11
    The CPU ISA is developed by ARM and pretty much nothing else.  The GPU, ISP, Neural Engine and most other silicon is using Apple’s proprietary ISA (OK imgtec may disagree with that).
    There’s also a good chance that Apple has its own microarchitecture which emulates AArch64 (like Tegra does). Perhaps one day this will also emulate x86/64 (leveraging a modem deal with Intel) as a migratory step to its own full ISA.
    In the meantime, yes an Apple GPU with a 30W envelope would smash both Nvidia & AMD in perf/watt. Oh and hopefully hardware RayIntersectors to reflect (sorry!) Metal2 too.
    watto_cobra
  • Reply 19 of 42
    mcdave Posts: 1,927 member

    ksec said:
    Recent iOS devices take advantage of the GPU technology recently developed by ARM for their chip designs. That's not going to translate into GPUs for Macs, which are going to continue to use Intel CPU technology (despite uninformed rumors to the contrary). But that doesn't mean Apple can't create their own GPU tech for Macs. I personally would be surprised if Apple bothered, seeing as the company has blatantly suffered from what I call Mac Malaise for over three years. (Kick 🦵 Boot 👢 Prod ⚡). Staying out of GPU patent lawsuits is of course a further concern.
    I don't see any part of the A11 GPU developed by ARM. 
    I certainly do.
    Oh dear, time to visit the optometrist. The CPU is based on the ARM ISA.
    watto_cobraderekcurrie
  • Reply 20 of 42
    sflocal said:
    They can do all this at a fraction of the cost of a "Pro" Macintosh. Once you go Windows, you don't go back.
    I was in that Windows camp, tried a Mac and never went back
    Me too but that was years ago and then I was forced to go back. My career depends on NVIDIA and CUDA and no amount of hand waving by Apple fans will change that simple fact. There are a large number of graphics pros in the same boat as me.