Apple silicon Mac documentation suggests third-party GPU support in danger

Comments

  • Reply 61 of 86
    avon b7 Posts: 8,324 member
    Rayz2016 said:
    avon b7 said:
    Beats said:
    jonahlee said:
    If they come up with some new port instead of PCI, it will likely mean only Apple cards work with it, and the expansion will be a joke. And while an Apple GPU might be decent, it is not going to compete with AMD and NVIDIA, mainly because of driver support and their smaller market share, unless Apple can magically beat the GPU giants at their own game without making something super hot, without a huge power draw, and while also being so much easier to program for.

    Oh goody this again. Happens every time.

    "Apple is not going to beat out market leaders."

    You clearly didn't understand what he was saying and just went on the defensive. 

    His comments are more than valid given current market realities and Apple's own experience and history. 

    We don't know where this will go. Many things are possible and nothing is sure to be a guaranteed success.
    @Beats didn’t say anything about it being a success or not. He merely pointed out that “Apple can’t compete with /*insert industry stalwart that Apple will run roughshod over in a market they couldn’t possibly think they could compete in*/” statement has been heard a number of times before. 

    Most famously with the iPhone, and then again with the Apple Watch. 

    Saying they cannot compete with AMD and NVIDIA is even more nonsensical because they’re not even going to try. Apple is not aiming to produce a graphics subsystem that has to deliver world-class performance on an unlimited range of machines that they have no access to. 

    They are trying to produce world-class performance on a graphics system they’ve designed, running on an architecture they designed,  running an operating system optimised for the chips they designed, and running software built using a framework optimised for the operating system optimised for the chipset they designed – all from the ground up. That’s why it would be unwise to count them out. 

    What the AMD fans will do is post loads of charts and calculations and satisfy themselves that the AMD solution is faster on paper. 

    Apple will just run a Windows machine and an Apple Silicon Mac side-by-side and say, “There ya go.”




    I didn't say he mentioned success or not. That was my observation. He just went on the defensive. 

    The problem was he didn't really offer anything at all. The OP laid out the reasoning behind his thoughts. You can agree with them or not, but they are perfectly valid observations. 


     0Likes 0Dislikes 0Informatives
  • Reply 62 of 86
    KITA Posts: 410 member
    KITA said:
    rob53 said:
    The A12Z Bionic is up to 8 GPU cores. The most powerful and expensive GPUs have cores in the thousands. What would it take for Apple to create its own separate 500-core GPU SoC, or maybe only a 100-core GPU with the ability to use several of them in a blade setup? There's nothing stopping Apple, other than patents, from making whatever they want any way they want to. Look at the Mac Pro. It's a fantastic workstation. 
    The Mac Pro is not a fantastic workstation,


    Of course you don’t have to spend $14k on one. I purchased one for my studio for about $7,500 including some aftermarket upgrades. I’ve owned many Macs since the time they used to be beige, and the new Mac Pro is basically the absolute best Mac ever built. The performance is top notch (it’s not always just about benchmarks), and there are a slew of other reasons you failed to even include in your “analysis”, like noise level (which is basically zero), top-notch components, build quality and materials, overall design (the slip-cover chassis is brilliant), expansion options, and aesthetics.
    "Top notch performance", but it's getting far outperformed by parts in systems at lower prices in most cases. The point of Puget's analysis was to show that not only are you spending more on your initial investment, but if your time is money, you're also losing out considerably. Other workstations are expandable (and actually work with off-the-shelf components), use high-quality parts, and are also near silent. Of course, there are different use cases for different machines; if you require software that's only on macOS, alas, you have no other options.
    muthuk_vanalingam
     0Likes 0Dislikes 1Informative
  • Reply 63 of 86
    mjtomlin Posts: 2,699 member
    Good lord! Apple really needs to be much clearer about this stuff, or people start flying off the rails with speculation.

    What the chart is showing is target GPUs for each platform. Apple (and developers) currently have to target three different GPU architectures. On Apple Silicon based Macs, Apple and developers can always count on there being an Apple GPU, and thus will only need to support that one if desired. This does not rule out expansion to other GPU architectures should a developer choose to support them.

    What I read from that table is that the "Metal GPU Family Mac 2" APIs will target ALL GPU architectures and the "Metal GPU Family Apple" APIs will only support Apple GPUs.


    Update after watching the dev session...
    It is specifically mentioned that Apple Silicon based Macs can and will support discrete GPUs. That slide image was taken out of context. The session is mainly about optimizing your code when running on Apple Silicon GPUs.

    It is also mentioned that the GPUs on Mac SoCs are not the same as those found on the A-series - they will be much more powerful.
    edited July 2020
    chia fastasleep watto_cobra
     1Like 0Dislikes 2Informatives
  • Reply 64 of 86
    Rayz2016 Posts: 6,957 member
    mjtomlin said:
    Good lord! Apple really needs to be much clearer about this stuff, or people start flying off the rails with speculation.

    What the chart is showing is target GPUs for each platform. Apple (and developers) currently have to target three different GPU architectures. On Apple Silicon based Macs, Apple and developers can always count on there being an Apple GPU, and thus will only need to support that one if desired. This does not rule out expansion to other GPU architectures should a developer choose to support them.

    What I read from that table is that the "Metal GPU Family Mac 2" APIs will target ALL GPU architectures and the "Metal GPU Family Apple" APIs will only support Apple GPUs.


    Update after watching the dev session...
    It is specifically mentioned that Apple Silicon based Macs can and will support discrete GPUs. That slide image was taken out of context. The session is mainly about optimizing your code when running on Apple Silicon GPUs.

    It is also mentioned that the GPUs on Mac SoCs are not the same as those found on the A-series - they will be much more powerful.
    I think the problem is that folk want to know if they’re going to support third-party GPUs. 
    watto_cobra
     1Like 0Dislikes 0Informatives
  • Reply 65 of 86
    jidojido Posts: 129member
    KITA said:
    rob53 said:
    The A12Z Bionic is up to 8 GPU cores. The most powerful and expensive GPUs have cores in the thousands. What would it take for Apple to create its own separate 500-core GPU SoC, or maybe only a 100-core GPU with the ability to use several of them in a blade setup? There's nothing stopping Apple, other than patents, from making whatever they want any way they want to. Look at the Mac Pro. It's a fantastic workstation. 
    The Mac Pro is not a fantastic workstation, it's actually extremely overpriced and underpowered. It might only be good in certain applications optimized under macOS, outside of that, nope.

    Repost from another thread:

    After Effects

    While Macs often perform fairly well, in After Effects there is simply no argument that a PC workstation is both faster and significantly less expensive. Compared to the $20k Mac Pro we tested, a $4k PC using an Intel Core i9 9900K and NVIDIA GeForce 2080 Ti ended up being about 5% faster overall, while a $5.5k PC using an AMD Threadripper 3960X is about 18% faster. Even compared to the much better priced iMac Pro, a PC that costs $1K less is going to be about 35% faster.
    ...

    https://www.pugetsystems.com/labs/articles/Premiere-Pro-performance-PC-Workstation-vs-Mac-Pro-2019-1719/

    Are these numbers trustworthy?

    It seems strange that the Mac Pro performs poorly in comparison to others with similar configuration.

    This user is not seeing any slowness on his Mac Pro:
    http://www.reduser.net/forum/showthread.php?176610-The-New-Mac-Pro-is-Here!!!&p=1881993&viewfull=1#post1881993
    watto_cobra
     1Like 0Dislikes 0Informatives
  • Reply 66 of 86
    dysamoria said:
    jonahlee said:
    See, the thing that worries me most is the little things in Apple's actions already. If they have behind-the-scenes test hardware for all their configurations already, and yet they couldn't manage to get Thunderbolt into their developer machine, does that mean Thunderbolt is gone for them, or that they couldn't get it working? Either way it's not a good thing considering how much they have leaned on Thunderbolt for pro work over the past years.

    And the whole idea of no external graphics card, with everything on one chip, again means less chance of expansion and more chance of just having to keep upgrading the machine every few years. The thing that was so great about the old Mac Pro was the ability to upgrade the graphics card, and graphics on the chip are not going to be upgradeable.

    And then there is the whole matter of getting companies to support it. It is great that Adobe is working on it, but what about Avid? It took them this long to support Catalina, and that is with AMD graphics.
    The A12Z already worked with USB-C, so it was a no-brainer for the developer hardware, and there was no point adding TB since I'm sure they're going to use a new chip for the production unit. I'm guessing eGPUs will work over TB with drivers once they are available, but that's anybody's guess.

    I'm sure Adobe and Avid will come along for the ride once they see how this new hardware outperforms x86. Technically all of their software will already run with a few days of modifications, so they'll already be there. Xcode is processor agnostic, so it's not like they have to rewrite it from the ground up, and they'll be able to optimize over time as new versions come out.
    How are developers/manufacturers of thunderbolt audio interfaces supposed to test supporting Apple Silicon?
    I suspect Apple is holding back for the big reveal. Plus a two-year transition should be more than enough time for niche developers to get up to speed.
    watto_cobra
     1Like 0Dislikes 0Informatives
  • Reply 67 of 86
    keithw said:
    I'm finding it extraordinarily difficult to believe that Apple could replace the existing Xeon-based Mac Pro (or even iMac Pro) with comparable performance and expandability in only two years. Low end, no problem at all. 
    You seem to assume that they just started working on this yesterday, instead of over the last decade. The current hardware was probably designed with this transition in mind. I seriously doubt they are starting from scratch instead of building on the current products. 
    watto_cobra
     1Like 0Dislikes 0Informatives
  • Reply 68 of 86

    tmay said:
    Rayz2016 said:
    Mmm. I’d sort of assumed they’d be doing their own GPUs. 
    Kind of makes me wonder if Apple is even going to use PCI express in their desktop models, instead creating their own expansion bus.
    I wouldn't be surprised if the current hardware was designed with a logic board swap in mind.
    watto_cobra
     1Like 0Dislikes 0Informatives
  • Reply 69 of 86
    KITA Posts: 410 member
    jido said:
    KITA said:
    rob53 said:
    The A12Z Bionic is up to 8 GPU cores. The most powerful and expensive GPUs have cores in the thousands. What would it take for Apple to create its own separate 500-core GPU SoC, or maybe only a 100-core GPU with the ability to use several of them in a blade setup? There's nothing stopping Apple, other than patents, from making whatever they want any way they want to. Look at the Mac Pro. It's a fantastic workstation. 
    The Mac Pro is not a fantastic workstation, it's actually extremely overpriced and underpowered. It might only be good in certain applications optimized under macOS, outside of that, nope.

    Repost from another thread:

    After Effects

    While Macs often perform fairly well, in After Effects there is simply no argument that a PC workstation is both faster and significantly less expensive. Compared to the $20k Mac Pro we tested, a $4k PC using an Intel Core i9 9900K and NVIDIA GeForce 2080 Ti ended up being about 5% faster overall, while a $5.5k PC using an AMD Threadripper 3960X is about 18% faster. Even compared to the much better priced iMac Pro, a PC that costs $1K less is going to be about 35% faster.
    ...

    https://www.pugetsystems.com/labs/articles/Premiere-Pro-performance-PC-Workstation-vs-Mac-Pro-2019-1719/

    Are these numbers trustworthy?

    It seems strange that the Mac Pro performs poorly in comparison to others with similar configuration.

    This user is not seeing any slowness on his Mac Pro:
    http://www.reduser.net/forum/showthread.php?176610-The-New-Mac-Pro-is-Here!!!&p=1881993&viewfull=1#post1881993
    Yes, they're trustworthy.

    The post you linked is comparing DaVinci Resolve (which benefits considerably from GPU), not the Adobe applications listed above. The user had added a Radeon VII to their machine (using the 580X for GUI and the Radeon VII 16 GB for compute). Even without the 580X managing the GUI, that GPU is already quite competitive with NVIDIA GPUs for 4K tasks. With the 580X + Radeon VII, it's not surprising their 16-core Intel machine outperformed a 24-core Threadripper with only one NVIDIA GPU, especially when this application benefits so greatly from the GPU.





    It's not that the Mac Pro is always a poor performer, but that its price is typically much higher - and in this example, the owner had to spend extra to add another GPU (not offered by Apple) to his machine, something that could be added to any workstation with a lower base price.
    muthuk_vanalingam
     0Likes 0Dislikes 1Informative
  • Reply 70 of 86
    mjtomlin Posts: 2,699 member
    JinTech said:
    Or you could read it as that this is what the Apple Silicon processor is directly compatible with. It doesn't say they will not support third-party. I could see Apple using their own GPU for primary tasks but third-party for more beefy tasks. Do we really think that Apple could compete with a GPU like the AMD Radeon Pro Vega II for professionals?
    That chart is explicit.  I take it at face value that Apple Silicon Macs will not support 3rd party GPU's.  Anything otherwise is an assumption.

    Uh, not when it's taken out of context; no one seems to understand what they are looking at, so they are making assumptions.

    The title of the chart is "Apple Silicon Mac GPU", not "Apple Silicon Mac GPU Support".
    The 1st row shows the GPUs being compared... 1. Apple GPU vs. 2. Intel, AMD, and NVIDIA
    The 2nd row shows rendering methods... 1. TBDR vs. 2. IMR
    The 3rd row shows which APIs support each GPU architecture... 1. Both Metal Mac 2 and Metal Apple vs. 2. Just Metal Mac 2

    There is nothing in this table or in the session that states anything about what GPUs Apple Silicon based Macs will and will not support! This table is ONLY telling us the differences between Apple GPU architecture and the others.
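    This reading lines up with how apps actually branch on these families at run time: Metal exposes MTLDevice.supportsFamily(_:) with cases such as .apple6 and .mac2, and a renderer picks a TBDR- or IMR-oriented path accordingly. Below is a minimal sketch of that branching; the chooseRenderPath helper and the plain Bool flags are illustrative stand-ins (on a real Mac you would feed it device.supportsFamily(.apple6) and device.supportsFamily(.mac2)), not anything from the WWDC session itself.

    ```swift
    // Sketch: pick a Metal render path from GPU-family support flags.
    // On macOS you would obtain the flags from a real device, e.g.:
    //   let device = MTLCreateSystemDefaultDevice()!
    //   let isAppleGPU = device.supportsFamily(.apple6)
    //   let isMac2     = device.supportsFamily(.mac2)
    // Here they are plain Bools so the decision logic runs anywhere.

    func chooseRenderPath(supportsAppleFamily: Bool, supportsMac2: Bool) -> String {
        if supportsAppleFamily {
            // Apple GPUs are tile-based deferred renderers (TBDR):
            // prefer tile shading and memoryless render targets.
            return "tbdr"
        } else if supportsMac2 {
            // Intel/AMD/NVIDIA GPUs are immediate-mode renderers (IMR):
            // fall back to the common Mac 2 feature set.
            return "imr"
        }
        return "unsupported"
    }

    print(chooseRenderPath(supportsAppleFamily: true, supportsMac2: true))   // Apple Silicon Mac
    print(chooseRenderPath(supportsAppleFamily: false, supportsMac2: true))  // Intel Mac with dGPU
    ```

    Note that nothing in a query like this requires or forbids any particular vendor's GPU; it reports what the device in front of you supports, not what the platform will ever allow.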
    jdb8167 mattinoz watto_cobra
     3Likes 0Dislikes 0Informatives
  • Reply 71 of 86
    Beats Posts: 3,073 member
    avon b7 said:
    Beats said:
    jonahlee said:
    If they come up with some new port instead of PCI, it will likely mean only Apple cards work with it, and the expansion will be a joke. And while an Apple GPU might be decent, it is not going to compete with AMD and NVIDIA, mainly because of driver support and their smaller market share, unless Apple can magically beat the GPU giants at their own game without making something super hot, without a huge power draw, and while also being so much easier to program for.

    Oh goody this again. Happens every time.

    "Apple is not going to beat out market leaders."

    You clearly didn't understand what he was saying and just went on the defensive. 

    His comments are more than valid given current market realities and Apple's own experience and history. 

    We don't know where this will go. Many things are possible and nothing is sure to be a guaranteed success.

    Rayz2016 said:
    avon b7 said:
    Beats said:
    jonahlee said:
    If they come up with some new port instead of PCI, it will likely mean only Apple cards work with it, and the expansion will be a joke. And while an Apple GPU might be decent, it is not going to compete with AMD and NVIDIA, mainly because of driver support and their smaller market share, unless Apple can magically beat the GPU giants at their own game without making something super hot, without a huge power draw, and while also being so much easier to program for.

    Oh goody this again. Happens every time.

    "Apple is not going to beat out market leaders."

    You clearly didn't understand what he was saying and just went on the defensive. 

    His comments are more than valid given current market realities and Apple's own experience and history. 

    We don't know where this will go. Many things are possible and nothing is sure to be a guaranteed success.
    @Beats didn’t say anything about it being a success or not. He merely pointed out that “Apple can’t compete with /*insert industry stalwart that Apple will run roughshod over in a market they couldn’t possibly think they could compete in*/” statement has been heard a number of times before. 

    Most famously with the iPhone, and then again with the Apple Watch. 

    Saying they cannot compete with AMD and NVIDIA is even more nonsensical because they’re not even going to try. Apple is not aiming to produce a graphics subsystem that has to deliver world-class performance on an unlimited range of machines that they have no access to. 

    They are trying to produce world-class performance on a graphics system they’ve designed, running on an architecture they designed,  running an operating system optimised for the chips they designed, and running software built using a framework optimised for the operating system optimised for the chipset they designed – all from the ground up. That’s why it would be unwise to count them out. 

    What the AMD fans will do is post loads of charts and calculations and satisfy themselves that the AMD solution is faster on paper. 

    Apple will just run a Windows machine and an Apple Silicon Mac side-by-side and say, “There ya go.”





    It really gets old. Every damn industry Apple even thinks of touching receives an onslaught of doubt and "Apple doesn't understand X industry" comments and articles.

    I wanted to find the quote by a tech journalist that said something like "Apple will not overtake Nokia and Motorola. These giants have been cemented in the industry. Motorola and Nokia are not gonna sit quietly whispering in their clam shells." I couldn't find it in time to post it.

    We're seeing the same crap today. "Apple cannot compete with AMD and Intel." I used to have doubts too, but over the years I've learned Apple will do just fine, if not completely destroy companies out of existence.

    And yes, like iKnockoff users, the haters will quote specs because reality is far more inconvenient for them.



    KITA said:
    Beats said:
    jonahlee said:
    If they come up with some new port instead of PCI, it will likely mean only Apple cards work with it, and the expansion will be a joke. And while an Apple GPU might be decent, it is not going to compete with AMD and NVIDIA, mainly because of driver support and their smaller market share, unless Apple can magically beat the GPU giants at their own game without making something super hot, without a huge power draw, and while also being so much easier to program for.

    Oh goody this again. Happens every time.

    "Apple is not going to beat out market leaders."

    What evidence do you have that would prove Apple is on a path to beat NVIDIA or AMD?

    Both of which have strong roadmaps for development, access to the latest nodes from TSMC/Samsung, industry wide support and well optimized drivers. 

    It's not impossible and perhaps they will, but so far you haven't provided even the slightest bit of evidence to back up that claim.


    History repeating. You'd think after decades of being disproven people would stop with this "Apple doesn't know what they're doing" crap.

    As far as evidence:

    iPhone
    iPad
    Services
    Apple Watch
    AirPods

    Now what evidence DO YOU have that Apple will fail in a new venture and doesn't know what they're doing?


    avon b7 said:
    Beats said:
    jonahlee said:
    If they come up with some new port instead of PCI, it will likely mean only Apple cards work with it, and the expansion will be a joke. And while an Apple GPU might be decent, it is not going to compete with AMD and NVIDIA, mainly because of driver support and their smaller market share, unless Apple can magically beat the GPU giants at their own game without making something super hot, without a huge power draw, and while also being so much easier to program for.

    Oh goody this again. Happens every time.

    "Apple is not going to beat out market leaders."

    You clearly didn't understand what he was saying and just went on the defensive. 

    His comments are more than valid given current market realities and Apple's own experience and history. 

    We don't know where this will go. Many things are possible and nothing is sure to be a guaranteed success.





    Apple's own experience and history supports MY argument. Oh the irony.

    "current market realities"

    Since when in the history of Apple did they launch a new product with market advantages?
    Blackberry/Nokia/Motorola damn near owned the phone market before Apple blew them away. We heard THE SAME EXACT reasoning as yours from Apple haters.
    Microsoft had a decade of tablets and owned the tablet market before iPad.  We heard THE SAME EXACT reasoning as yours from Apple haters.
    Apple Watch had ZERO market share and the Swiss industry laughed at Apple. A few years later, Apple is #1.
    Services received the same crap reasoning and last I heard, they weren't doing too bad. Remember Deezer?

    I still remember all the tech journalists writing Apple inventions off as "toys" or irrelevant devices because Apple had no experience.

    "Apple's own experience and history. "
    Apple's own experience and history says otherwise.


    I didn't bother mentioning the other guy who replied to me because he's still stuck in 1985. Do you really think Apple does not know what they're doing and did not plan this for years? Possibly a decade? Or should Apple fire their entire staff and hire the brilliant posters of AI?
    jdb8167 watto_cobra
     2Likes 0Dislikes 0Informatives
  • Reply 72 of 86
    Ha ha ha ha ha. I love the constant "oh look, the iPhone and iPad beat all their competition, so anything Apple develops will be so much more powerful than pro machines." Have you tried any pro apps on an iPad? Any art programs, or Photoshop? First off, they crash all the damn time, losing so much work that I have given up even trying to use them. So if a pro machine is going to run pro software like that, forget it!

    And Apple has a long history of screwing up in the pro market. The years between Mac Pros are just a small portion of it; their constant dropping of pro software has affected my industry often, and they have certainly made some lemon machines too. And look at Final Cut Pro X. Yes, in some ways it is very powerful, and in others it is completely broken, with what looks like no chance of it fixing its many faults; you should see the list of bugs we hit on my last big job using it.

    And all the talk of "oh, they can make something powerful enough and it will be enough." If they make their own graphics chip on an SoC and it is orders of magnitude less powerful than what AMD and NVIDIA are doing for the same price, they are going to have huge problems in the pro space. 

    And yes, Apple beat NVIDIA in the mobile space with its graphics, destroyed them completely, but the mobile space is very much a different beast. And yes, experience does count for something. If experience didn't count, Apple's car would have been out ages ago.

    And I also love how it always seems to come down to "Apple haters." Even as an Apple fan, you are allowed to question them. I got my first Apple product 36 years ago and have been with them ever since, but it doesn't mean that I won't question their decisions.
    KITA muthuk_vanalingam
     2Likes 0Dislikes 0Informatives
  • Reply 73 of 86
    melgross Posts: 33,712 member
    wizard69 said:
    melgross said:
    JinTech said:
    Or you could read it as that this is what the Apple Silicon processor is directly compatible with. It doesn't say they will not support third-party. I could see Apple using their own GPU for primary tasks but third-party for more beefy tasks. Do we really think that Apple could compete with a GPU like the AMD Radeon Pro Vega II for professionals?
    FTA: "There is no indication in support documentation that Apple will discontinue support for AMD GPUs for Intel Macs in future versions of macOS, but the statement above may also suggest that there may yet be an avenue for third-party PCI-E GPU support going forward."
    You have to admit, though, that the chart is provocative. It could have shown an Apple GPU or a third-party GPU, but it didn’t.
    It is, yes.
    Maybe!   What I've been noticing is that there are a lot of people reading into Apple's documentation things that Apple has never said.   The common one I've seen across the net is people quoting a statement by Apple that there will be x86 Macs for a long time to come.   The problem is the sentence clearly says that Apple intends to support macOS on x86 for a long time to come.

    In this case we have people seemingly taking a statement that says nothing about discrete GPUs as a statement that Apple has no intention to use AMD GPUs in the future.   It doesn't matter what Apple is doing in this case, because it is a stretch to imply that AMD is out of the picture.   In fact, I just spent some time listening to a WWDC blurb on ray tracing support in Metal; this implies to me that they have plans that will likely require external hardware to achieve max performance.   I say that because I don't see Apple having enough die space to be able to offer high-performance ray tracing support.   Now Apple could offer an external GPU or even an MCM, but even the MCM could be supporting a third-party chip.   There are all sorts of options here.

    I still think that if Apple wants to offer a high-performance Mac Pro, a third-party GPU (or external compute chip) is a must for the next few years.   I just see them needing a couple of years to offer up an ARM-based chip that is similar in performance to Fujitsu's chip.   Even then, an accelerator chip can still make sense no matter how fast your CPU is.   So instead of dropping AMD, I'd rather see them increasing their partnership with AMD.   In this case that might mean new technology in Mac compute acceleration.   Think a fabric connection to an external CDNA chip.
    Usually, when a company has some sort of roadmap, which, in a sense, is what Apple showed there, they show options, and when and where those options would be available. So for Apple to show AMD GPUs for Intel Macs, but only Apple GPUs for ARM Macs, is indeed provocative. Yes, to be sure, it doesn’t rule AMD out completely, in the future at least. But it doesn’t show them as an alternative either, and Apple said that their GPU was, and would be, very powerful. And as I mentioned earlier, Apple’s new deal with Imagination is being ignored by every commentator I’ve read anywhere. It shouldn’t be. Imagination had announced some pretty interesting specifications shortly before Apple announced the deal.
    watto_cobra
     1Like 0Dislikes 0Informatives
  • Reply 74 of 86
    corp1 Posts: 110 member
    I won't be surprised if Apple Silicon is the end of eGPUs.

    Then again I'm mainly interested in an eGPU for x86 gaming via Boot Camp, which is dying anyway.

    (Though I would like to see Apple Silicon support hardware-accelerated x86 emulation someday.)


     0Likes 0Dislikes 0Informatives
  • Reply 75 of 86
    DuhSesame Posts: 1,278 member
    KITA said:
    rob53 said:
    The A12Z Bionic is up to 8 GPU cores. The most powerful and expensive GPUs have cores in the thousands. What would it take for Apple to create its own separate 500-core GPU SoC, or maybe only a 100-core GPU with the ability to use several of them in a blade setup? There's nothing stopping Apple, other than patents, from making whatever they want any way they want to. Look at the Mac Pro. It's a fantastic workstation. 
    The Mac Pro is not a fantastic workstation, it's actually extremely overpriced and underpowered. It might only be good in certain applications optimized under macOS, outside of that, nope.

    Repost from another thread:

    After Effects



    While Macs often perform fairly well, in After Effects there is simply no argument that a PC workstation is both faster and significantly less expensive. Compared to the $20k Mac Pro we tested, a $4k PC using an Intel Core i9 9900K and NVIDIA GeForce 2080 Ti ended up being about 5% faster overall, while a $5.5k PC using an AMD Threadripper 3960X is about 18% faster. Even compared to the much better priced iMac Pro, a PC that costs $1K less is going to be about 35% faster.

    What this means is that you can get the same or faster performance from a properly configured PC at a quarter (or less) the cost of a Mac Pro. With an application like After Effects, where you can distribute renders across multiple machines using plugins like BG Render Max or RenderGarden, this isn't even about just getting similar performance at a lower price point. You can decrease your render times by 4-5x by purchasing multiple PCs and using network rendering to split up the work between each system. This not only improves render performance (not live playback), but also gives you a ton of flexibility to have renders running on multiple machines while simultaneously working on other comps on your primary workstation.

    Or, you can simply save that $15k and spend it on a new car, home remodel, or a really, really fancy vacation.

    https://www.pugetsystems.com/labs/articles/After-Effects-performance-PC-Workstation-vs-Mac-Pro-2019-1718/

    Photoshop


    Since Photoshop is largely unable to take advantage of higher CPU core counts, there often isn't much of a difference between most modern mid/high-end CPUs - and that applies for a Mac just as much as it does for a PC workstation. Overall, if Photoshop is your primary concern, you can get about 10% higher performance from one of our $4,200 Puget Systems workstations with either an AMD Ryzen 3900X or Intel Core i9 9900K compared to the $19,599 Mac Pro (2019) we tested.

    Now, is 10% going to be a game-changer for your workflow? Probably not - it is right on the edge of what you might be able to notice in everyday work. The main takeaway here is not necessarily the performance alone, but rather how much you have to pay to get it. Even if you forget the Mac Pro and go with the much more reasonably priced iMac Pro, you are still likely to pay about twice the cost for equivalent performance.

    https://www.pugetsystems.com/labs/articles/Photoshop-performance-PC-Workstation-vs-Mac-Pro-2019-1716/

    Premiere Pro


    Since there are so many reasons why either a Mac or a PC may be right for you, we generally try to focus on the straight performance results and not tell you which you should purchase. But in this case, the Mac Pro is so underwhelming that it is hard to not simply say "Don't buy a Mac Pro for Premiere Pro".

    This isn't like our Photoshop testing where the Mac Pro was only a hair slower than a PC, or our After Effects testing where a PC can easily be 20% faster at a much lower cost. This time, we are talking about a PC being up to 50% faster on average for 1/3 the cost. We understand that there is a lot of benefit to staying in the Apple ecosystem if you also have an iPhone, MacBook, etc., but that is a huge amount of performance and cost savings you will be giving up to get a Mac Pro.

    By skipping the Mac Pro and going with a PC, you could easily save $14,000 which could be used for a host of other things to improve your workflow. Maybe you can finally upgrade your reference monitor to a really nice Eizo or Flanders Scientific model. Or use it as an opportunity to move to a central NAS storage unit from LumaForge. Or just take a couple months off to recharge. And this isn't taking into account the amount of money you might be able to earn due to the higher performance of a PC.

    https://www.pugetsystems.com/labs/articles/Premiere-Pro-performance-PC-Workstation-vs-Mac-Pro-2019-1719/

    Ridiculous.  They aren't even in the same ecosystem.  Cherry-picking one platform's advantages to compare against the other's weaknesses isn't going to change its users' minds.
    KITA said:
    rob53 said:
    The A12Z Bionic has up to 8 GPU cores. The most powerful and expensive GPUs have cores in the thousands. What would it take for Apple to create its own separate 500-core GPU SoC, or maybe only a 100-core GPU with the ability to use several of them in a blade setup? There's nothing stopping Apple, other than patents, from making whatever they want any way they want to. Look at the Mac Pro. It's a fantastic workstation. 
    The Mac Pro is not a fantastic workstation,


    Of course you don’t have to spend $14k on one. I purchased one for my studio for about $7,500 including some aftermarket upgrades. I’ve owned many Macs since the days they were beige, and the new Mac Pro is basically the absolute best Mac ever built. The performance is top notch (it’s not always just about benchmarks) and there are a slew of other reasons you failed to even include in your “analysis”, like noise level (which is basically zero), top-notch components, build quality and materials, overall design (the slip-cover chassis is brilliant), expansion options, and aesthetics.
    "Top notch performance", but it's getting far out performed by parts in systems at lower prices in most cases. developer transition kit disassembleOther workstations are expandable (and actually work with off the shelf components), use high quality parts and are also near silent. Of course, there are different use cases for different machines, if you require software that's only on macOS, alas, you have no other options.
    Oh, that sure sounds easy to you.  Who knew a poster could be smarter than the builders and insiders?  You're pretending the blowers on those graphics cards don't exist, or wouldn't inevitably disrupt the airflow.

    I'm going to ignore this guy from now on.
    edited August 2020
  • Reply 76 of 86
    DuhSesameduhsesame Posts: 1,278member
    KITA said:
    rob53 said:
    The A12Z Bionic has up to 8 GPU cores. The most powerful and expensive GPUs have cores in the thousands. What would it take for Apple to create its own separate 500-core GPU SoC, or maybe only a 100-core GPU with the ability to use several of them in a blade setup? There's nothing stopping Apple, other than patents, from making whatever they want any way they want to. Look at the Mac Pro. It's a fantastic workstation. 
    The Mac Pro is not a fantastic workstation,


    Of course you don’t have to spend $14k on one. I purchased one for my studio for about $7,500 including some aftermarket upgrades. I’ve owned many Macs since the days they were beige, and the new Mac Pro is basically the absolute best Mac ever built. The performance is top notch (it’s not always just about benchmarks) and there are a slew of other reasons you failed to even include in your “analysis”, like noise level (which is basically zero), top-notch components, build quality and materials, overall design (the slip-cover chassis is brilliant), expansion options, and aesthetics.
    Too much for him to include in his analysis.  Judging from his words, he doesn't even know the workflows of organizations and studios.
  • Reply 77 of 86
    DuhSesameduhsesame Posts: 1,278member
    melgross said:
    wizard69 said:
    melgross said:
    JinTech said:
    Or you could read it as that this is what the Apple Silicon processor is directly compatible with. It doesn't say they will not support third-party. I could see Apple using their own GPU for primary tasks but third-party for more beefy tasks. Do we really think that Apple could compete with a GPU like the AMD Radeon Pro Vega II for professionals?
    FTA: "There is no indication in support documentation that Apple will discontinue support for AMD GPUs for Intel Macs in future versions of macOS, but the statement above may also suggest that there may yet be an avenue for third-party PCI-E GPU support going forward."
    You have to admit, though, that the chart is provocative. It could have shown an Apple GPU or a third-party GPU, but it didn’t.
    It is, yes.
    Maybe!   What I've been noticing is that there are a lot of people that read into Apple's documentation things that Apple has never said.   The common one I've seen across the net is people quoting a statement by Apple that there will be x86 Macs for a long time to come.   The problem is the sentence clearly says that Apple intends to support macOS on x86 for a long time to come.

    In this case we have people seemingly taking a statement that says nothing about discrete GPUs as a statement that Apple has no intention to use AMD GPUs in the future.   It doesn't matter what Apple is doing in this case because it is a stretch to imply that AMD is out of the picture.   In fact I just spent some time listening to a WWDC blurb on ray tracing support in Metal, and this implies to me that they have plans that will likely require external hardware to achieve max performance.   I say that because I don't see Apple having enough die space to be able to offer up high-performance ray tracing support.   Now Apple could offer an external GPU or even an MCM, but even the MCM could be supporting a third-party chip.   There are all sorts of options here.

    I still think that if Apple wants to offer a high-performance Mac Pro, a third-party GPU (or external compute chip) is a must for the next few years.   I just see them needing a couple of years to offer up an ARM-based chip that is similar in performance to Fujitsu's chip.   Even then an accelerator chip can still make sense no matter how fast your CPU is.   So instead of dropping AMD, I'd rather see them increasing their partnership with AMD.   In this case that might mean new technology in Mac compute acceleration.   Think a fabric connection to an external CDNA chip.
    Usually, when a company has some sort of roadmap, which, in a sense, was what Apple showed there, they show options, and when and where those options would be available. So for Apple to show AMD GPUs for Intel Macs, but only Apple GPUs for ARM Macs, is indeed provocative. Yes, to be sure, it doesn’t rule AMD out completely, in the future at least. But it doesn’t show them as an alternative either, and Apple said that their GPU was, and would be, very powerful. And as I mentioned earlier, Apple’s new deal with Imagination is being ignored by every commentator I’ve read anywhere. It shouldn’t be. Imagination had announced some pretty interesting specifications shortly before Apple announced the deal.
    I saw one (Adobe) programmer get pissed and buy an NVIDIA laptop over this.  He has been doing professional graphics as a hobby, and seeing rumors that "Apple is killing eGPUs" is what led him to that decision, even though he didn't like Windows that much.

    There's more to it than that, like the concern over closing off the ecosystem, to be sure.  I don't know how to respond, as the "future" is a complete blank.
    edited August 2020
  • Reply 78 of 86
    DuhSesameduhsesame Posts: 1,278member
    melgross said:
    I believe that Apple can compete with high end GPUs. But not on the SoC. Can they compete with lower end GPUs on the SoC? Certainly. Intel is getting to that point now, and so can Apple.

    the other things being ignored is the new deal between Apple and Imagination. What that will lead to, we don’t yet know.
    I have no doubt they could rule mobile and even consumer desktops.  It all comes down to where their workstations are heading; the wildest prediction I've heard is Apple completely making their own cards and eGPUs in 2022.  This guy is a radical predictor but often gets the basics right (he said the same about CPUs as early as 2015, when everyone thought he was mad; it turned out he was right, about five years later).
  • Reply 79 of 86
    DuhSesameduhsesame Posts: 1,278member
    Marvin said:
    wizard69 said:
    What I've been noticing is that there are a lot of people that read into Apple's documentation things that Apple has never said.
    The document here was presented in a video where the presenter said that Intel Macs have Intel, Nvidia or AMD GPUs, Apple Silicon Macs will have Apple GPUs. I'm sure he would have said Apple Silicon Macs will have Apple GPUs and other GPUs if that was going to be the case because it's a video for developers telling them how they should design their software.
    In this case we have people seemingly taking a statement that says nothing about discrete GPUs as a statement that Apple has no intention to use AMD GPUs in the future.   It doesn't matter what Apple is doing in this case because it is a stretch to imply that AMD is out of the picture.   In fact I just spent some time listening to a WWDC blurb on ray tracing support in Metal, and this implies to me that they have plans that will likely require external hardware to achieve max performance.   I say that because I don't see Apple having enough die space to be able to offer up high-performance ray tracing support.   Now Apple could offer an external GPU or even an MCM, but even the MCM could be supporting a third-party chip.   There are all sorts of options here.
    There are options and there are ARM systems with PCIe support and 3rd party GPU support:

    https://www.anandtech.com/show/15733/ampere-emag-system-a-32core-arm64-workstation

    but 3rd party GPUs would only be needed if Apple's GPUs don't perform well enough and they seem to be suggesting they will. It's the same reason they are switching CPU.

    They are also tuning Metal to work best on their GPU hardware, which has a different rendering structure than other GPUs. Apple is sending a clear message here: that they can beat everyone in silicon design, so they have no need to buy 3rd party hardware and resell it when they can do a better job themselves. (1)
    I still think that if Apple wants to offer a high-performance Mac Pro, a third-party GPU (or external compute chip) is a must for the next few years.   I just see them needing a couple of years to offer up an ARM-based chip that is similar in performance to Fujitsu's chip.   Even then an accelerator chip can still make sense no matter how fast your CPU is.   So instead of dropping AMD, I'd rather see them increasing their partnership with AMD.   In this case that might mean new technology in Mac compute acceleration.   Think a fabric connection to an external CDNA chip.
    The Afterburner card is a PCIe card so if they want to keep using that, they need some kind of interface for it. But with a custom GPU, they could build this into a higher-end package in the iMac Pro without needing a card or the iMac Pro can have an internal slot for it.

    I suspect they will leave the Mac Pro on Intel hardware. They should be able to match its performance in the iMac form factor and over time people will wonder why pay the Intel premium of thousands extra per CPU when a standard iMac can do the same job at a much better price.

    It's possible that higher-end AMD/Nvidia GPUs will outperform some of Apple's options if Apple sticks to lower power profiles but it doesn't really matter. (3) Once you have a GPU that is over 10TFLOPs, everything beyond that is gravy. Gravy is nice but optional and it doesn't matter as much for real-time processing as for offline processing for which multiple machines can be used.

    For hardware lineup, I'm expecting:
    - MacBook Air and mini: 8-core (4 big, 4 small) CPU, 4 TFLOP GPU (MBP-level performance)
    - MacBook Pro: 16-core (could be 12 big, 4 small), 8 TFLOP GPU (iMac-level performance)
    - iMac: 32-core, 16 TFLOP GPU (Mac Pro-level performance), possibly with optional accelerators like Afterburner

    - Mac Pro continue to use Intel Xeon and 3rd party GPUs, updated every 2-3 years and probably removed from sale in 6-8 years

    If that's what they manage at launch, there's nothing in the roadmap that will need them to use 3rd party hardware any more but we'll only know for sure when they start shipping the first ARM Mac (likely Macbook Air) later in the year.
    1). Interesting.  I bet we'll see a dedicated graphics card.  A guy I knew said we'll see their own eGPUs, and he's pretty good at predicting these things; what do you think?

    2). I was thinking the same, wondering if it's possible.  The guy mentioned above thinks external solutions will slowly replace internal slots, as eGPUs are more flexible.  Though neither of us thinks that'll replace the Mac Pro.

    3). In fact, I believe Apple could create their own niche instead of playing along with the rest of the PC industry.  Sure, they can beat an i9 with ease (a leak suggests the first Apple Silicon chip is on par with an i7-10700K at a third of its power consumption), but at that level of performance, it's not a requirement.

    To give an example, imagine a 14" positioned between ultrabooks and high-performance notebooks, with power consumption lower than both, or a fanless 13" that closely matches or exceeds all other 13" machines.

    4). Agreed, though I think two extra cores is likely.
  • Reply 80 of 86
    I am really interested in seeing what Apple can do with a GPU, as I am not sure that the iPad GPU really scales like an NVIDIA or AMD GPU. It will be amazing if it does, though, and I do find some hope in them releasing an iMac with 16GB of graphics memory, though I hate that they didn't also upgrade the iMac Pro with a newer-generation CPU and a GPU that isn't three years old.