Blender update adds support for Metal GPU rendering on Mac

Posted in macOS, edited March 10
A new version of 3D creation software Blender has introduced Metal GPU rendering for Mac devices equipped with Apple Silicon chips or AMD graphics cards.

Blender 3.1


The addition of a Metal GPU backend, which was contributed by Apple, allows Blender to take advantage of the built-in graphics processing unit in Apple's M1, M1 Pro, M1 Max, and M1 Ultra chipsets. The Metal GPU rendering is also available on Macs with an AMD graphics card.

According to Blender, the new Metal GPU support allows for rendering times that are up to 2 times quicker. The app itself also runs faster on M1 Macs because it now has direct access to the GPU.

In addition to the new Metal GPU backend, the Blender 3.1 update features memory usage improvements, ray tracing upgrades, and better indexing of asset browser libraries for faster loading times, among other changes.

The addition is noteworthy because Blender had not offered GPU rendering on Macs since Apple deprecated OpenCL. Back in October, however, Apple joined the Blender Development Fund, which allowed macOS to become a supported platform again.

Mac users wanting to take advantage of the new Metal rendering need a device running macOS Monterey 12.3 or later.

anushreejit

Comments

  • Reply 1 of 24
    Here are two predictions based on this news:
    1. Blender will render at around 1/10th the speed on the top end Mac Studio compared to a top end GPU.
    2. Apple fans will blame Blender's developers because they don't know how to use Metal correctly.
  • Reply 2 of 24
    auxioauxio Posts: 2,518member
    Here are two predictions based on this news:
    1. Blender will render at around 1/10th the speed on the top end Mac Studio compared to a top end GPU.
    2. Apple fans will blame Blender's developers because they don't know how to use Metal correctly.
    Did you read the article?  Or are you just paid to post negative comments?
    The addition of a Metal GPU backend, which was contributed by Apple
    Apple themselves wrote the Metal backend for Blender.

    Having actually written shaders in both GLSL and MSL, I can say that Metal is an absolute pleasure to work with by comparison (passing data from CPU to GPU is much easier).  The performance in my experience was about the same on equivalent GPU hardware, but then I wasn't writing a full blown 3D design application.  I'll be interested to see a comparison for Blender.
  • Reply 3 of 24
    Here are two predictions based on this news:
    1. Blender will render at around 1/10th the speed on the top end Mac Studio compared to a top end GPU.
    2. Apple fans will blame Blender's developers because they don't know how to use Metal correctly.
    You simply go into preferences and turn it on.
    Even you could do that...
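For anyone who prefers scripting, the same switch can be flipped from Blender's Python console. A minimal sketch, assuming Blender 3.1's Cycles add-on API on a supported macOS version (this must run inside Blender, not a standalone interpreter):

```python
# Run inside Blender's Python console or script editor.
import bpy

# Point Cycles at the Metal backend (Blender 3.1+ on macOS 12.2/12.3+).
prefs = bpy.context.preferences.addons["cycles"].preferences
prefs.compute_device_type = "METAL"
prefs.get_devices()  # refresh the detected device list

# Enable every detected Metal device, then tell the scene to render on GPU.
for device in prefs.devices:
    device.use = True
bpy.context.scene.cycles.device = "GPU"
```

After that, Cycles renders use the Metal device rather than the CPU, same as ticking the box in Preferences > System.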
  • Reply 4 of 24
    MarvinMarvin Posts: 14,836moderator
    The addition of a Metal GPU backend, which was contributed by Apple, allows Blender to take advantage of the built-in graphics processing unit in Apple's M1, M1 Pro, M1 Max, and M1 Ultra chipsets. The Metal GPU rendering is also available on Macs with an AMD graphics card.

    According to Blender, the new Metal GPU support allows for rendering times that are up to 2 times quicker. The app itself also runs faster on M1 Macs because it now has direct access to the GPU.
    Metal also offers more than a 2x speedup on faster models; the 2x figure is for the M1 GPU vs the M1 CPU. There are benchmarks here that render three scenes:

    Open Data Benchmarks

    The Metal implementation hasn't been tuned for performance yet:

    https://developer.blender.org/T92212

    M1 CPU = 86
    M1 GPU = 178
    M1 Max CPU = 196
    M1 Max GPU = 692
    ThreadRipper 3970X (32-core) = 840
    AMD Radeon Pro W6900X = 2124 (Mac Pro Metal)
    3060 laptop = 2303
    3070 laptop = 3010
    3090 & 3080ti = ~5900

    The Nvidia GPUs are using Optix for optimal raytracing:

    https://developer.nvidia.com/rtx/ray-tracing/optix

    The W6900X result is interesting as that's using Metal just like the M1 but is 3x faster than the Max GPU. Apple said the M1 Ultra was 80% faster than the W6900X in their GPU test:



    They will need to do some performance tuning on the software. However, Metal still gives a 3.5x speedup on M1 Max GPU over the CPU.
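The ratios quoted here fall straight out of the scores listed above; a quick sanity check in Python (scores copied from the Open Data list earlier in this post):

```python
# Open Data benchmark scores quoted above (higher score = faster).
scores = {
    "M1 CPU": 86,
    "M1 GPU": 178,
    "M1 Max CPU": 196,
    "M1 Max GPU": 692,
    "W6900X (Metal)": 2124,
}

def speedup(a: str, b: str) -> float:
    """How many times faster device `a` scores than device `b`."""
    return scores[a] / scores[b]

print(f"M1 GPU vs M1 CPU:         {speedup('M1 GPU', 'M1 CPU'):.1f}x")          # ~2.1x
print(f"M1 Max GPU vs M1 Max CPU: {speedup('M1 Max GPU', 'M1 Max CPU'):.1f}x")  # ~3.5x
print(f"W6900X vs M1 Max GPU:     {speedup('W6900X (Metal)', 'M1 Max GPU'):.1f}x")  # ~3.1x
```

So the article's "up to 2x" claim lines up with the base M1, the Max GPU gets closer to 3.5x over its CPU, and the W6900X's roughly 3x lead over the Max GPU is what the tuning work would need to close.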

    Apple may not be able to match the RTX GPUs without hardware raytracing; it's not really expected either. For this kind of workflow, there would be no harm in buying Nvidia PCs for rendering: the 3060 laptops are cheap at around $1,000 each, and there were even crypto farms using these. The Mac can be used as the workstation and send the final renders or animations to the PCs.

    If Apple can get the software tuned so that the Ultra in the Mac Studio matches or exceeds the W6900X, that would be a good result. Future versions of Apple Silicon can include hardware raytracing units if they see this extra 2-3x speedup as worthwhile; I wouldn't expect it before M3. The software needs to be optimal before they can start adding hardware accelerators because they need to know where the bottlenecks are. Different raytracing hardware implementations can have very different speedups, e.g. allowing for motion blur.

    It's good to see Apple directly implementing Metal support in 3rd party software. Hiring a few people to work on it is a small cost for them but it adds a lot for creative users on Macs.
    edited March 10
  • Reply 5 of 24
    StrangeDaysStrangeDays Posts: 12,259member
    But will it blend?
  • Reply 6 of 24
    MplsPMplsP Posts: 3,707member
    But will it blend?
    Yes, but it will only blend metal
  • Reply 7 of 24
    cgWerkscgWerks Posts: 2,843member
    Marvin said:
    The W6900X result is interesting as that's using Metal just like the M1 but is 3x faster than the Max GPU. Apple said the M1 Ultra was 80% faster than the W6900X in their GPU test ...Future versions of Apple Silicon can include hardware raytracing units if they see this extra 2-3x speedup as worthwhile, I wouldn't expect it before M3.
    Yeah, that is kind of odd. That 80% faster... at what? Is that running some benchmark?

    Apple will need to add that kind of hardware if the GPUs don't have it to be competitive (for games as well). I don't know why they don't just add eGPU support and some drivers. Problem solved!

    The main issue I see with this (though inevitable, I guess) is that if each round of M chips adds a bunch of specialized chips, it becomes a bit of a nightmare keeping track of which generation you need to accomplish what, especially since the performance of that thing can be so night & day. I'll be bummed if I buy an M1 Ultra Studio and then they add a bunch of raytracing stuff in the M2 or M3 version. But I guess that's the way this stuff works now (same if you bought a pre-RTX 30xx series and thought you were going to play some raytraced Minecraft).
  • Reply 8 of 24
    mpantonempantone Posts: 1,934member
    cgWerks said:
    Marvin said:
    The W6900X result is interesting as that's using Metal just like the M1 but is 3x faster than the Max GPU. Apple said the M1 Ultra was 80% faster than the W6900X in their GPU test ...Future versions of Apple Silicon can include hardware raytracing units if they see this extra 2-3x speedup as worthwhile, I wouldn't expect it before M3.
    The main issue I see with this (though inevitable, I guess) is that if each round of M chips adds a bunch of specialized chips, it becomes a bit of a nightmare keeping track of which generation you need to accomplish what, especially since the performance of that thing can be so night & day. I'll be bummed if I buy an M1 Ultra Studio and then they add a bunch of raytracing stuff in the M2 or M3 version. But I guess that's the way this stuff works now (same if you bought a pre-RTX 30xx series and thought you were going to play some raytraced Minecraft).
    It's already like that with Macs.

    I have a Mac mini 2018. The integrated graphics offers certain capabilities. If I turn on my Sonnet eGPU with a Sapphire Pulse Radeon RX 580 card, it gains a few more.

    As for raytracing features, at least in Windows PC games, they are only available when the hardware supports them. In Cyberpunk 2077, you'll only see the raytracing options in the game's control panel if your GPU supports it. It won't default to software RT for unsupported cards. Raytracing on my 3080 Ti? Yes, the option is there. Raytracing on my RX 580? Nope, sorry, completely invisible, like it doesn't exist.

    This feature discrepancy isn't specific to Macs or even computers in general. It pretty much pertains to everything: cars, sewing machines, even toilets. You can pick up a cheap Kohler toilet that does nothing but flush, or you can opt for the spendy Toto Washlet with heated seat, automatic lid, warm water irrigation, etc. Trust me, if you wake up in the middle of the night to take a leak, you'll know where the toilet with the heated seat is without looking at a database or Excel spreadsheet.
    edited March 11
  • Reply 9 of 24
    blastdoorblastdoor Posts: 2,842member
    cgWerks said:
    Marvin said:
    The W6900X result is interesting as that's using Metal just like the M1 but is 3x faster than the Max GPU. Apple said the M1 Ultra was 80% faster than the W6900X in their GPU test ...Future versions of Apple Silicon can include hardware raytracing units if they see this extra 2-3x speedup as worthwhile, I wouldn't expect it before M3.
    Yeah, that is kind of odd. That 80% faster... at what? Is that running some benchmark?

    Apple will need to add that kind of hardware if the GPUs don't have it to be competitive (for games as well). I don't know why they don't just add eGPU support and some drivers. Problem solved!

    The main issue I see with this (though inevitable, I guess) is that if each round of M chips adds a bunch of specialized chips, it becomes a bit of a nightmare keeping track of which generation you need to accomplish what, especially since the performance of that thing can be so night & day. I'll be bummed if I buy an M1 Ultra Studio and then they add a bunch of raytracing stuff in the M2 or M3 version. But I guess that's the way this stuff works now (same if you bought a pre-RTX 30xx series and thought you were going to play some raytraced Minecraft).
    I guess the key is for Apple to provide APIs that make best use of whatever hardware is available and for developers to use those APIs. 
  • Reply 10 of 24
    This article says the speed improvements only work on macOS 12.3, but 12.3 isn't out yet. Is that a typo, or do we need to wait for the operating system update to take advantage of the speed enhancements?
  • Reply 11 of 24
    Mike WuertheleMike Wuerthele Posts: 6,553administrator
    This article says the speed improvements only work on macOS 12.3, but 12.3 isn't out yet. Is that a typo, or do we need to wait for the operating system update to take advantage of the speed enhancements?
    That's what the release said. 12.3.

    It should be out Tuesday or Wednesday.
  • Reply 12 of 24
    MarvinMarvin Posts: 14,836moderator
    cgWerks said:
    Marvin said:
    The W6900X result is interesting as that's using Metal just like the M1 but is 3x faster than the Max GPU. Apple said the M1 Ultra was 80% faster than the W6900X in their GPU test ...Future versions of Apple Silicon can include hardware raytracing units if they see this extra 2-3x speedup as worthwhile, I wouldn't expect it before M3.
    Yeah, that is kind of odd. That 80% faster... at what? Is that running some benchmark?
    It's probably a benchmark or workflow test of some kind; they didn't say what they used. There was an Affinity Photo test posted a while ago that showed the M1 Max ahead of the W6900X, and this test would probably show the M1 Ultra 80% higher:

    https://appleinsider.com/articles/21/10/26/apples-m1-max-bests-amd-radeon-pro-w6900x-in-affinity-gpu-benchmark
    cgWerks said:
    Marvin said:
    The W6900X result is interesting as that's using Metal just like the M1 but is 3x faster than the Max GPU. Apple said the M1 Ultra was 80% faster than the W6900X in their GPU test ...Future versions of Apple Silicon can include hardware raytracing units if they see this extra 2-3x speedup as worthwhile, I wouldn't expect it before M3.
    I'll be bummed if I buy a M1 Ultra Studio and then they add a bunch of raytracing stuff in the M2 or M3 version.
    Companies usually like to leave some upgrades for selling new models. Nvidia could easily sell GPUs today with double/quadruple the RT cores but they put in enough to show the benefit and over time will increase the number. The 3080 has 68 RT cores vs 46 in the 2080.

    Apple has a lot of freedom to add whatever they want. They could even leave raytracing as a feature for the Mac Pro, but it might be difficult to limit it to the Mac Pro if they are just using multiples of the Max chips. The W6900X in the current Mac Pro has hardware raytracing. It's a feature that only appeals to a small number of users, so it's not guaranteed Apple will add it on Apple Silicon, but they have been doing work on the software side, so they have likely considered it.
    This article says the speed improvements only work on macOS 12.3, but 12.3 isn't out yet. Is that a typo, or do we need to wait for the operating system update to take advantage of the speed enhancements?
    It's 12.3 for Macs with AMD GPUs, 12.2 for Apple Silicon:

    https://www.blender.org/download/releases/3-1/

    It seems to work on earlier ones but there were bugs in the older drivers so these are the supported OS versions.
    edited March 11
  • Reply 13 of 24
    looplessloopless Posts: 270member
    Apple dropping support for OpenCL is the bad old Apple that wants to do its own thing and screw industry standards. Intel has developed a new standard for GPU computing called oneAPI, which allows developers to write one code base and target all types of GPUs and FPGAs. Apple should swallow their pride and get on board.
  • Reply 14 of 24
    cgWerkscgWerks Posts: 2,843member
    mpantone said:
    It's already like that with Macs.

    I have a Mac mini 2018. The integrated graphics offers certain capabilities. If I turn on my Sonnet eGPU with a Sapphire Pulse Radeon RX 580 card, it gains a few more.

    As for raytracing features, at least in Windows PC games, they are only available when the hardware supports them. In Cyberpunk 2077, you'll only see the raytracing options in the game's control panel if your GPU supports it. It won't default to software RT for unsupported cards. Raytracing on my 3080 Ti? Yes, the option is there. Raytracing on my RX 580? Nope, sorry, completely invisible, like it doesn't exist.

    This feature discrepancy isn't specific to Macs or even computers in general. It pretty much pertains to everything: cars, sewing machines, even toilets. You can pick up a cheap Kohler toilet that does nothing but flush, or you can opt for the spendy Toto Washlet with heated seat, automatic lid, warm water irrigation, etc. Trust me, if you wake up in the middle of the night to take a leak, you'll know where the toilet with the heated seat is without looking at a database or Excel spreadsheet.
    Heh, yeah, technology advancement in general, I guess. I'm just bummed as this seems to be what I really wanted, and now I'm going, Doh! Maybe I need to wait until Apple decides where they are going with GPUs, as I want to use it with CAD/3D (and I'm not sure it is currently up to the task, depending on what you read).

    I think my point was more that for quite a while, a computer was largely a computer, across the board. I have a 2018 mini as well, and if I'm a general user, it is fine as is. I added an eGPU as well. If I need more power, I can buy an even better GPU (w/ raytracing support if I want to play some Minecraft RTX, for example). Now, that isn't the case any longer. We'll basically have to wait and see how these perform in the software they use. It's kind of hard to guess, and you can't fix it with additional purchases.

    blastdoor said:
    I guess the key is for Apple to provide APIs that make best use of whatever hardware is available and for developers to use those APIs. 
    Yeah, the problem is (which this thread made me realize), it is kind of apples and oranges now (pardon the pun). We can see how much 'on paper' GPU power it has, but if it doesn't have the right feature-set to do something or other, it might be a total dog, even WITH code optimization. For example, no matter how optimized they make video encoding on the x86, it is going to be many times slower than for the person who has a machine with the T2.


    Marvin said:
    Apple has a lot of freedom to add whatever they want. They could even leave raytracing as a feature for the Mac Pro but it might be difficult to limit it to the Mac Pro if they are just using multiples of the Max chips. The W6900X in the current Mac Pro has hardware raytracing. It's a feature that only appeals to a small amount of users so it's not guaranteed Apple will add it on Apple Silicon but they have been doing work on the software side so they have likely considered it.
    Yeah, I really hope they add back support for eGPUs and the AMD drivers on Apple Silicon. I'm not sure I trust Apple to pick-up all these edge-cases. But, a lot of Mac users are going to want those edge-cases sometimes. Gamers are a good example. While I might be able to make some game run directly, it would be nice to know I could add a GPU with the right capabilities if needed. Same for CAD/3D, etc.
  • Reply 15 of 24
    fastasleepfastasleep Posts: 6,177member
    loopless said:
    Apple dropping support for OpenCL is the bad old Apple that wants to do its own thing and screw industry standards. Intel has developed a new standard for GPU computing called oneAPI, which allows developers to write one code base and target all types of GPUs and FPGAs. Apple should swallow their pride and get on board.
    Yeah, I'm sure they'll get right on that. 🙄
  • Reply 16 of 24
    cornchipcornchip Posts: 1,916member
    I don’t use Blender that much anymore, but this is awesome news. Man, seems like people have been talking about Blender 3 since I started using it back in '05!
  • Reply 17 of 24
    cgWerkscgWerks Posts: 2,843member
    loopless said:
    Apple dropping support for OpenCL is the bad old Apple that wants to do its own thing and screw industry standards. Intel has developed a new standard for GPU computing called oneAPI, which allows developers to write one code base and target all types of GPUs and FPGAs. Apple should swallow their pride and get on board.
    Yeah, I'm sure they'll get right on that. 🙄
    Yeah, though I understand the sentiment. I like it when Apple comes up with great, new beneficial technologies. But it sucks for many of us when they cut out technologies 99% of the world is still using. OpenGL/CL pretty much is the 3D/CAD world and more (crypto-mining, for example). I hope Apple pulls enough developers to ALSO add Metal support, but I doubt the industry is going to go, 'hey, look how great Apple's Metal is, let's all convert to Metal.' Which basically means Apple users begging companies for Apple support again (something we enjoyed not *having* to do for a couple decades).
  • Reply 18 of 24
    cgWerkscgWerks Posts: 2,843member
    cornchip said:
    I don’t use blender that much anymore, but this is awesome news. Man, seems like people have been talking about blender three since I started using it back in 05! 
    Yeah, I need to install it and start learning. I mostly ignored Blender back when I was doing that kind of work because it didn't have a ton of features, the interface was poor, etc. I simply had access to way superior stuff, and Blender was the 'poor person's' 3D app.

    Now, it seems Blender is nearly as capable as (or more capable than) most 3D packages on the market, even if it has various shortcomings in different ways (which I admit I don't even understand any longer). It is kind of hard to ignore, especially since I can just download it and get going.

    It also makes me concerned about the future of certain packages. For example, my 'stomping ground' Electric Image Animation System is working on a new version (and has been for quite some time now... will it get released this year?!), but hitting the current market with competition like Blender for free is going to be way harder than in the past, when it was competing against a few other multi-thousand-dollar heavyweights. EIAS's claim to fame was always having a beautiful renderer that could handle seemingly impossibly huge scenes and still render fast. Maybe that's still a differentiator? But with all these apps and plugin renderers, I just don't know any more.

    But, I don't think I can ignore Blender any longer.
  • Reply 19 of 24
    fastasleepfastasleep Posts: 6,177member
    cgWerks said:
    loopless said:
    Apple dropping support for OpenCL is the bad old Apple that wants to do its own thing and screw industry standards. Intel has developed a new standard for GPU computing called oneAPI, which allows developers to write one code base and target all types of GPUs and FPGAs. Apple should swallow their pride and get on board.
    Yeah, I'm sure they'll get right on that. 🙄
    Yeah, though I understand the sentiment. I like it when Apple comes up with great, new beneficial technologies. But it sucks for many of us when they cut out technologies 99% of the world is still using. OpenGL/CL pretty much is the 3D/CAD world and more (crypto-mining, for example). I hope Apple pulls enough developers to ALSO add Metal support, but I doubt the industry is going to go, 'hey, look how great Apple's Metal is, let's all convert to Metal.' Which basically means Apple users begging companies for Apple support again (something we enjoyed not *having* to do for a couple decades).
    Not sure what you mean — software companies have to and/or are working to support Metal if they make such software for macOS. Regardless, 99% of the world isn't using OneAPI, nor do I see Apple supporting it for any reason, which is what I was responding to.
  • Reply 20 of 24
    cgWerkscgWerks Posts: 2,843member
    fastasleep said:
    Not sure what you mean — software companies have to and/or are working to support Metal if they make such software for macOS. Regardless, 99% of the world isn't using OneAPI, nor do I see Apple supporting it for any reason, which is what I was responding to.
    I just meant that at least with Intel Macs, it didn't matter if they made it for macOS or not; we could run it anyway. Maybe Windows emulation will go better than I think at this point, but as much as I like this transition, we're now back in the pre-Intel days in terms of software support and begging companies to make a Mac version.

    Sorry, missed the oneAPI aspect though. I don't think that will catch on either, if that's what you meant.
    edited March 27