I'm not a big Apple fan, but I totally understand why Apple is doing this. Nvidia is very hard to work with and throws its weight around with partners even worse than Intel does. I'm sure there's a personal grudge between Jensen and someone at Apple. Nvidia treats its OEM customers very poorly. Jensen needs to learn how to be a true partner and not just a technobully. He hasn't made many friends outside of the people he pays.
Thanks for a fine article. One point: the 680 is not the biggest NVIDIA card you can use in a 5,1.
My firm runs two maxed-out 12-core 5,1 machines, one with multiple 4K-class monitors, and several lesser machines. We had to stop using our GTX 980 cards when Mojave came out; we needed Mojave because our key software wanted it. It turns out that a 780 will work perfectly under Mojave, and provides a bit more kick than a 680. We miss the 980 cards, and their fine performance on multiple monitors, of course. The 780 is competent.
I share the fatigue, disappointment, and frustration expressed by others. My first Apple purchase was an Apple ][ with — count ‘em — 48k RAM and a special chip that allowed lower case letters. This $2500 beauty (with Olympus electric typewriter attached) out-competed a $25,000 DEC dedicated word processor in 1982. I’ve moved along with Apple ever since... through the ///, the Macs, the laptops... we have a museum of old machines. The struggle between open and closed hardware is as old as the struggle between the ][ and the Mac. This struggle is unresolved and eternal. If I may offer a slight variation on the “pro vs. non-pro” argument based on longevity, it would be this. I don’t think Apple is against “pro” users. Apple has lost enthusiasm for hardware tinkerers, customizers, optimizers, experimenters, and the guys who keep trying to squeeze another percent out of their gear. We have grease under our fingernails, scars, spare parts lying around, and a picture of some chrome extravagance hanging on the shop wall, over the carcass of a customized Triumph Bonneville (real carburetors). Everything is just so clean and glossy now. The happy young people in the ads live in a hyper-bright colorful world supervised by “thrilled” people with carefully-arranged casually-downdressed outfits. That’s fine, for them. But if you miss welding and banging, pull up a gas can, have a seat, and let’s toast Woz together.
Aloha, and stay warm.
Gorgeously written. I wholeheartedly agree with you. Part of my experience of this world is about understanding processes and machines. One likes to retrace the paths that genius engineers have found. I love to tinker for this reason. You will not see me own a modern "closed shop" car; if anything, my next one will be older than the '90s version I drive now (I have motorcycles too). But it appears we have lost the battle to inspire; the "more is better" principle is easier to sell and fuels the lauded race for economic growth. Meanwhile the world economy is indebted to triple its income by now, with no end in sight. It's the age of the Merchants.
Huh. Apple doesn't list 780 support. Interesting find.
We already have a solution for this problem. Apple's bean-counter mentality with regard to the Mac has simply become unacceptable. This year, in lieu of upgrading desktop iMacs, we will be swapping them out for PC workstations with Nvidia graphics. Tim Cook is no Steve Jobs. He appears to be willing to force Mac users into accepting lower-quality hardware to make a miserable few more dollars. Every single Mac that has failed in our shop over the last few years had an AMD graphics solution. In addition, the state of AMD drivers on the Mac is pathetic at best; we have code that works great even on Intel graphics but fails on machines with AMD graphics. We are simply done with this.
As opposed to what? This is the same, logically, as saying that every Mac that's failed in your shop over the last few years had an Intel processor.
But that is not the case. We have not had ANY Macs fail that used Nvidia graphics or had only Intel Iris Pro graphics, as on the 13" MacBook Pro. The ONLY commonality among the failed machines was that they had AMD graphics, and in every case it was the GPU that had failed. In addition, we are experiencing bizarre WebGL rendering artifacts on macOS machines with AMD graphics, even in the Google Chrome web browser. You can install Windows (using Boot Camp) on these same machines and the WebGL rendering artifacts do NOT occur. Clearly the issue is with BUGGY AMD graphics drivers for macOS that Apple has completely failed to address. The inability to adhere to web standards is a complete DEAL-BREAKER.
Driver issues are a different matter, and I don’t disagree with you there. However, the service data we have doesn’t bear out your presumption that the AMD GPU is failing at a high rate.
I’m not saying that it isn’t happening to you, though. It just isn’t a sign of anything beyond your own use case.
The most expensive Macs (iMac, iMac Pro, and 15" MacBook Pro) were "designed for professionals" (I have to put that in quotes), but all use AMD graphics. It's sad, because Apple had a chance to retain professional users and they are BLOWING IT!!
Ask yourself this question: how long must WebGL 2 stay an "Experimental Feature" in Safari? We have been testing with it, and haven't seen them move the needle on quality ... one ... tiny ... bit in the last year. How long are we supposed to wait? Crappy two-year-old Android tablets can provide WebGL 2 support, but a $2800 15" MacBook Pro cannot. THAT IS BLOWING IT!!
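For readers hitting the same wall: apps that want WebGL 2 typically feature-detect it and fall back to WebGL 1 where a browser (like Safari here) only exposes it experimentally. A minimal sketch of that check; the canvas parameter is abstracted behind an interface so the logic can run outside a browser, and in a real page you would pass an actual `HTMLCanvasElement`:

```typescript
// Minimal WebGL feature detection. The CanvasLike interface stands in
// for an HTMLCanvasElement so the logic is testable outside a browser.
interface CanvasLike {
  getContext(name: string): unknown;
}

function detectWebGL(canvas: CanvasLike): { version: number } {
  // Probe "webgl2" first; degrade to WebGL 1 (or the legacy
  // "experimental-webgl" alias) if it is unavailable.
  if (canvas.getContext("webgl2")) return { version: 2 };
  if (canvas.getContext("webgl") || canvas.getContext("experimental-webgl")) {
    return { version: 1 };
  }
  return { version: 0 }; // no WebGL at all
}
```

In a browser you'd call `detectWebGL(document.createElement("canvas"))` and pick shaders accordingly, which is exactly the double code path Safari's lag forces on developers.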
So Apple creating their own hardware isn't an option?
Given the deprecation of OpenGL, it is more or less a given that any Apple GPU will have zero support for it, meaning a lot of software titles and niche, in-house-developed applications will never be ported.
Cool. Maybe I'll have to take a look at some of the more popular OpenGL-based apps that will no longer work on the Mac (whose developers are too lazy to port them) and get together with some engineers and UX people to create Metal versions of them. All this arrogance is going to do is create new opportunities for small dev shops (which become tomorrow's big shops).
The number of people who want to use an Nvidia card as an eGPU or in an old Mac Pro has to be minuscule, right? Certainly smaller than, say, the number of people who want a headphone port on their iPhone.
No matter how minuscule, they should allow it, especially since eGPU support came around.
This is not the headphone jack: it is neither a physical port that has to fit into a form factor, nor is it in any way old tech that has to be replaced by "something better".
It's just plain stupid not to give us support, because it would help a lot of people in the scientific and content-creation industries stay on, or switch back to, a Mac. CUDA is, for better or worse, a standard in a lot of this stuff.
Scientific and high-performance compute go hand in hand with CUDA. Nvidia won, and AMD picks up the stragglers by offering cheaper solutions if you can run OpenCL.
But if the Mac Pro wants to be worthy of the name and attract the right crowd, it really does need Nvidia. These interviews are disheartening to hear. I guess it'll be a Final Cut Pro X box and not very interesting for the rest of us.
Apple must have something up its sleeve. Wait until the Mac Pro is out; Apple will surprise us with its own "G1" graphics card that runs on a Metal driver.
Too little, too late. Every GPU-accelerated renderer (Redshift, V-Ray, Iray, Octane, etc.) requires CUDA. Unless Apple is also ready to drop real-time ray tracing in there, on top of as-yet-unannounced third-party support, nothing will come of any move in that direction. The pro subset that needed Macs and would pay the premium for a Mac Pro abandoned Apple two years ago.
CUDA will never be opened to other GPU manufacturers; it is Nvidia's crown jewel. Opening up CUDA is the absolute last thing Nvidia would do. And I don't see these CUDA workflows changing in the next 3-4 years.
And that's the problem. People have gone down a vendor-specific path and now they're complaining when they can't use it everywhere. In the same way they're now going to lobby Apple to support it and/or switch to Windows, they could also have lobbied against NVIDIA by choosing a cross-platform alternative in the first place (or switch to that now). They're just being myopic about the companies they choose to lobby against.
I mean, if I designed an app using a proprietary Microsoft SDK, I'm certainly not going to lobby Apple to support that SDK. Conversely, if I design an app using Metal, I can't complain that it doesn't work on Windows. These are the results of my software design choices.
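The design choice being argued here, coding against a thin backend-neutral interface rather than a vendor API, can be sketched as follows. The `Renderer` interface and backend classes are purely illustrative, not any real SDK:

```typescript
// Illustrative sketch of the cross-platform argument: application code
// talks to a neutral Renderer interface, and vendor specifics live in
// swappable backends. All names here are hypothetical.
interface Renderer {
  readonly name: string;
  drawFrame(vertices: number[]): string;
}

class MetalRenderer implements Renderer {
  readonly name = "metal";
  drawFrame(vertices: number[]): string {
    // A real backend would issue Metal API calls here.
    return `metal: drew ${vertices.length / 3} vertices`;
  }
}

class VulkanRenderer implements Renderer {
  readonly name = "vulkan";
  drawFrame(vertices: number[]): string {
    // A real backend would issue Vulkan API calls here.
    return `vulkan: drew ${vertices.length / 3} vertices`;
  }
}

// The application depends only on the interface, so a platform change
// means adding a backend, not rewriting the app.
function render(r: Renderer, vertices: number[]): string {
  return r.drawFrame(vertices);
}
```

The same shape applies to the CUDA debate: an app written against an abstraction like this can swap compute backends, while an app written directly against CUDA cannot.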
What lobbying? This isn't Congress. I have projects with deadlines. GPU-accelerated renderers require CUDA, and they're light-years faster than CPU renderers; time is money. That was years ago. Just now Apple is realizing it screwed the pooch? Oh, let me come running back to the platform that abandoned me and forced me into an expensive shift to Windows after years of waiting for them to put out an actual workstation with user-serviceable parts and some decent longevity.
Go ahead and rewrite DAZ Studio for instance. Including all the NVIDIA Iray content created for it. It should keep you busy and away from this forum for quite a while.
You're missing the point. The point is that, underneath all of the bluster, it's just sheer arrogance and blind hatred for a company (emotional decisions) which are driving all of this (and yes, that might be happening on the Apple side too).
I worked in a company which was blindly/emotionally Microsoft-centric for years (in the pre-mobile era), and had to fight management hard to ensure they didn't put all their eggs in one platform. Eventually I was vindicated when customers started asking for mobile versions of their software. Surprise! Suddenly all that hard work to make the software work well on Mac made it possible to port it to iOS in a few months. And all the abstractions made it easy to port the core of it to Android (though the Java bridging made it take much longer than iOS). Keeping an open mind and adapting to change is the only way to survive in the tech industry.
I think you miss the point. Apple used to DEFINE the change; now it is peddling its shit on Samsung gear and "recommending" half-baked solutions to customers, including a broken ecosystem.
The Mac used to be the system of choice for creators of all genres; now it only feels stifling and laggard. How can you even find excuses for not updating PRO systems for years? For years!
"Amazingly Great" used to be associated with Apple computers; now "Amazingly Late" seems more appropriate for when they finally get around to updating a machine, whether Mac Pro, Mac mini, or MacBook Air. For the amount of revenue that computers and laptops bring in, there is no reason why Apple can't update all these machines at least once every two years (maybe every three years for something like the Mac Pro, but no more than that).
Where the pros go, the masses follow.
Apple is Christopher Walken in The Deer Hunter - except there are 4 bullets in the gun.
The fact that Apple's biggest and most loyal fanatics of 25 years are teetering on the edge, or leaving in droves, is cause for massive concern. If the new Mac Pro is an enclosed, soldered, T2-chipped, insanely overpriced proprietary nightmare... 🔫
The bolded part hasn't been accurate in 20 years, since computing device use spread beyond the tech savvy. I'm with you about concern, but it isn't because of this statement.
It's really a shame this is happening. Nvidia's GPUs are so far ahead of AMD's on the performance-per-watt curve that the MacBook Pro would be a much better notebook with an Nvidia GPU than with its current options.
I'm one of the few who have been speccing out ultrabook+eGPU options, and the current status of Nvidia driver support has removed Apple from consideration. There's a wealth of attractive Windows options with TB3 either already out or coming soon (LG Gram 17, looking at you).
I wish they would let Nvidia create drivers. It seems like such a petty, anti-consumer move.
Looking forward to getting that eGPU+PC solution online. I can just dual-boot Linux and Windows to satisfy both dev and gaming, and have something super portable as well. Apple needs to wake up.
Apple just doesn't allow modern Nvidia GPUs on macOS Mojave, and this is a dramatic change from only six months ago. Given that a new Mac Pro is coming that could support Nvidia cards, and there are already eGPUs that should, it's time that Apple did.
Can I give this article a bazillion upvotes somehow? Or, should we all tweet and email it to Tim?
The main reason I’m looking forward to the new Mac Pro is the end of posts like these.
Otoy is working on AMD/Intel compatibility for Octane, so is Redshift, I’m sure others are as well. Just saying.