Apple's Mac Studio launches with new M1 Ultra chip in a compact package

Comments

  • Reply 141 of 155
    melgross Posts: 33,510 member
    UrbaneLegend said:
    @melgross
    Clearly you didn't read my post carefully or the points flew over your head.

    Developers struggle to make effective use of threading in applications and even when they do there's a hug drop off in effectiveness after a few threads. I'd like to know what current workflow can keep 20 CPU cores busy on a consistent basis. I'm not talking about CPU rendering, it's 2022 most freelancers like me are GPU rendering.

    Apple has offloaded much of the CPU's work onto fixed-function units on the SoC, like compression/decompression and machine learning, so there's even less need for large numbers of CPU cores than an equivalent PC would require. Apple is clearly marketing the Studio to creators; their whole presentation contained hyperbole from creatives, and someone running office applications will be best served by the Mac mini or a MacBook. Let's not pretend there's a huge science and engineering market for Apple, there isn't.

    As a DaVinci Resolve user I know just how reliant the app is on the GPU; even with beefy GPUs, a complex effect chain can bring Resolve to its knees while not touching the CPU. Resolve is probably the best-optimised app for the M1, and it will expose the relatively weak GPUs as the main performance limitation. 3D work is even more biased towards GPU performance, with interactive viewport performance and rendering reliant on GPUs. Even in 2022, apps like Cinema 4D are one-thread wonders when interacting. Houdini and Blender are much better with threading, but the law of diminishing returns kicks in very quickly: doubling the CPU cores does not double performance, even when cooking sims.
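
    To put rough numbers on that diminishing-returns point, here's a minimal Amdahl's-law sketch in Python. The 0.85/0.95 parallel fractions are illustrative guesses, not measurements of any particular app:

    # Amdahl's law: speedup = 1 / ((1 - p) + p / n), where p is the
    # parallel fraction of the workload and n is the core count.
    def amdahl_speedup(parallel_fraction: float, cores: int) -> float:
        return 1.0 / ((1.0 - parallel_fraction) + parallel_fraction / cores)

    for p in (0.85, 0.95):
        for n in (10, 20, 40):
            print(f"p={p:.2f}  cores={n:2d}  speedup={amdahl_speedup(p, n):.1f}x")

    # Even at p=0.95, going from 20 to 40 cores only lifts the speedup
    # from ~10.3x to ~13.6x, nowhere near double.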

    I bet there will be post-production facilities waiting for the Mac Pro just to get double the GPU cores and have even more CPU cores doing absolutely nothing. Apple's boast that the Studio is faster than the Intel Mac Pro's 5700 is really no boast at all, not in the real world and not at this price point. The 5700 is equivalent in compute performance to an nVidia 1080Ti, so not exactly cutting edge.

    I've owned several Mac Pros in the past and used them for 3-4 years as I've been able to upgrade the GPU. In 3-4 years the GPUs in this Mac Studio will look silly compared to the state of the art, so the Studio will get old very quickly, and having just 64 GPU cores is quite an Achilles' heel.

    Some real-world benchmarks of the new Blender 3.1 put the relative GPU performance into perspective. Apple did the development, so no one can say this is down to a poor implementation.

    M1 - 198
    Pro - 363
    Max - 702

    3090 - 5566
    1080Ti - 839

    If we're charitable and double the Max score to 1400, that puts the Ultra at around an nVidia 2060. If we double the score again to give an indication of what a Mac Pro might perform like, that puts it below a 2080, or at 3080-laptop performance.
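
    Spelling that arithmetic out as a quick Python sketch (the scores are the Blender 3.1 numbers above; the perfect 2x scaling per doubling is the charitable assumption, not a measurement):

    # Extrapolating the quoted Blender 3.1 scores, charitably assuming
    # perfect 2x scaling from Max -> Ultra -> a hypothetical 2x-Ultra Mac Pro.
    scores = {"M1": 198, "M1 Pro": 363, "M1 Max": 702, "1080Ti": 839, "3090": 5566}

    ultra_estimate = 2 * scores["M1 Max"]    # ~1404
    mac_pro_estimate = 2 * ultra_estimate    # ~2808

    print(f"Ultra estimate:   {ultra_estimate}")
    print(f"Mac Pro estimate: {mac_pro_estimate}")
    print(f"The 3090 is still {scores['3090'] / mac_pro_estimate:.1f}x the Mac Pro estimate")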

    This just isn't up to the level required for 3D artists to jump back on board Macs.
    Your points didn't go over my head, most are just wrong. Video does use as many CPU cores as are available. That's why there are so many high-core-count CPUs in the first place, other than for servers. There are video tasks that go through the encoders/decoders, and the rest of the chip as well.

    As far as engineering goes, you're wrong there as well. NASA, for example, is mostly Mac, and has been for many years. Drug companies use Macs for their R&D programs. There are numerous other examples. As Apple's performance has come back up, we've been seeing more CAD/CAM appear, particularly in architecture. I've been using Bentley CAM for many years, and now use Fusion 360.
  • Reply 142 of 155
    melgross said:

    Your points didn't go over my head, most are just wrong. Video does use as many CPU cores as are available. That's why there are so many high-core-count CPUs in the first place, other than for servers. There are video tasks that go through the encoders/decoders, and the rest of the chip as well.

    OK so you are just clueless.
  • Reply 143 of 155
    cgWerks Posts: 2,952 member
    UrbaneLegend said:
    Depending on your production needs, the Mac Studio is either the bargain of a lifetime or not a particularly great deal. For video and 2D creative work, I think you'd need your head seen to if you didn't buy a Mac Studio; either version will run rings around most PC boxes running Resolve. I can see many post-production studios coming back to the Mac and the lower-end Studio being the hot seller; I predict this model will fly off the shelves.

    Unfortunately for 3D artists like myself, the GPUs are just too weak for rendering and there's no hardware ray tracing, and, what's more disappointing, Apple's own software acceleration structure isn't particularly great. There are still issues with the size of the kernel Metal is able to work with, so it looks like it's going to be behind OptiX/HIP in performance and features for a while.

    Towards the end of the year both nVidia and AMD will be dropping their next-gen GPUs, which are predicted to be on the order of 2x (maybe even more) the performance of their current GPUs, and that probably puts daylight between them and an M1 Mac Pro, let alone the Studio. Leaks coming from the nVidia hack show that the GPUs will have double the VRAM, so 48GB renders the unified memory advantage of the M1 moot. No one is going to try to render a 64GB 3D scene on a GPU as weak as the Studio's.

    As predominantly a 3D artist, I can't see beyond a PC workstation, but I will probably pick up a low-end Studio for video and compositing to replace my iMac. I might wait until the M2 to do that, though.
    Thanks, this is useful for me to ponder. I had hoped it would be in the ballpark of a good gaming/3D PC, but there are some differences between AMD/nVidia features and this, as you point out (like ray-tracing hardware). I've seen some videos where people were having trouble in modelers on an M1 Max MBP, but figured that was more due to software not being optimized.

    I still haven't fully wrapped my head around the differences between CPU & GPU rendering. Back when I was doing it heavily, it was all CPU rendering and the GPU was for visualization while working on the project (the more GPU, the more realistic the scene and the quicker it was to work on). Rendering was divided among the cores, or even over the network to as much hardware as one had to throw at it. I know there are now GPU renderers (final render, not just preview), but are they the norm? In certain industries?

    The problem on the CAD/modeling side is that some of the main modeling kernels aren't written for Apple Silicon, so all the apps that depend on them won't run or will be in Rosetta2. I noticed Vectorworks prominently featured, and have looked at their package a bit (for a job I interviewed for). I'd love to get into that, and it looks quite good.

    But absolutely, for video editing or stuff like that, this is a total no-brainer, and I doubt there is much of anything even close.

    mike1 said:
    I'd bet the new Pro will be the first with M2 chips, probably M2 Max and Ultra.
    Possible, but I doubt they'll make the Mac Pro an early model of the new chip. The M2 will be in the base machines until they scale it up eventually to the Mac Pro, I think. What they'll actually do to give the Mac Pro way more power than the Mac Studio, I can't even really imagine right now. But I don't think it will primarily be due to being a M2. Think of the M2 more like the incremental advances Apple makes on the iPhone each year... then just scale up more and more like we've seen from M1 -> M1 Pro -> M1 Max -> M1 Ultra.

    melgross said:
    Mitty said:
    This seems to be an awesome machine but it's complete overkill for my needs. I was really hoping for a revised Mac Mini. 
    I suspect that something like that might still happen. I know that John stated that there was one product left, the Mac Pro. But they discontinued the 27” iMacs, and it’s hard to believe they won’t have a 27” iMac to sell.
    That, and the mini will most likely get revised; it will just happen when they introduce the M2. Wouldn't that most likely be summer-fall? They can also always add a BTO-type option with a bigger screen and/or an Mx Pro, so long as going to the Pro won't push it past cooling capability (or they can clock it down like they did with the 14" MBP with the Max).

    melgross said:
    ... I think we would be a bit disappointed if the M2 was just about 15% better, per core, than the M1. Two generations, along with the slight improvement in the process technology, could give a 25 to 35% improvement in performance, depending on how Apple decides to balance the performance/efficiency ratio. That could give a CPU number between 2160 and 2330. Which, if true, would run away with the core performance crown.
    That's likely the kind of thing we'll see going forward, though. Which makes me believe they have to do something other than put the M2 into the Mac Pro. There needs to be a Pro-level of the M1, which will then get updated the next year with the same treatment to the M2. Each generation will see some performance improvement, like you say 15%, 20%, 30%, over the similar build from the previous generation. Then they'll keep stuffing specialized things in there, like the machine learning cores, or video processing. Maybe it will be ray-tracing stuff or who knows what in a future generation (which scares me a bit, planning on the Mac Studio M1. My luck, I'll get that and then the M2 or M3 will add a bunch of heavily wanted/needed stuff for CAD/3D people and I'll be crying).

    That's the one downside of all this specialized hardware. When we were dealing with Intel chips and mainstream GPUs, you kind of knew what you were going to get. Then Apple added the M1, and overnight, *massive* improvements in video encoding. Now we're seeing that with these new Macs... jumps that make $30k machines obsolete. Apple seems good at that, so they'll probably keep focusing on special hardware for certain tasks/workflows. If you're unlucky enough to buy the generation right before they add something crucial to your workflow, you'll kind of *have* to sell and upgrade to the new one, as the difference will be that big.

    ... If the Mac Pro is 2x M1 Ultra, 40 CPU cores and only 128 GPU cores looks even worse balanced; knowing the software I use on a daily basis, I wouldn't be able to keep 20 CPU cores busy, let alone 40, because over the years developers have pushed so much work to the GPU. I think the M1 architecture kind of looks anachronistic and old-school. Give me a Mac Pro with 20 cores and 256 GPU cores, or even better 512 GPU cores. I just don't see the content creator being that well served by large numbers of CPU cores in 2022. ...
    Yeah, aside from just running a bunch of programs at once, or if you're into virtual environments. I think we're just getting the CPU cores along for the ride as the GPUs are being scaled up. The GPU is still where Apple needs to place more emphasis. We shouldn't be just roughly matching a high-end PC that has been around for years, with Apple's latest and greatest.

    I'm still wondering if we won't see eGPUs and AMD implemented in some way eventually? At least the eGPU, even if it is Apple Silicon based.

    ... 3D work is even more biased towards GPU performance, with interactive viewport performance and rendering reliant on GPUs. Even in 2022, apps like Cinema 4D are one-thread wonders when interacting. Houdini and Blender are much better with threading, but the law of diminishing returns kicks in very quickly: doubling the CPU cores does not double performance, even when cooking sims. ...
    This just isn't up to the level required for 3D artists to jump back on board Macs.
    Yeah, this is the kind of stuff I need to figure out. I'm hoping more of it comes down to software and not some weakness in Apple's new platform. It would be sad if this is just primarily great for video editors and general use, but leaves the rest of us in PC-land again (or even worse off than in the last round of Intel Macs). If that's the case, then I won't want a Mac Studio, but just an M1 mini, and I'll save my extra $ for a PC or a better GPU for my 2018 Intel mini.
  • Reply 144 of 155
    cgWerks Posts: 2,952 member
    melgross said:
    Your points didn't go over my head, most are just wrong. Video does use as many CPU cores as are available. That's why there are so many high-core-count CPUs in the first place, other than for servers. There are video tasks that go through the encoders/decoders, and the rest of the chip as well.

    As far as engineering goes, you're wrong there as well. NASA, for example, is mostly Mac, and has been for many years. Drug companies use Macs for their R&D programs. There are numerous other examples. As Apple's performance has come back up, we've been seeing more CAD/CAM appear, particularly in architecture. I've been using Bentley CAM for many years, and now use Fusion 360.
    I think you guys are talking about different kinds of rendering.

    Anyway, yeah, the Vectorworks stuff looks impressive. If they can do it, then others should be able to as well. While they're pretty tight-lipped, from the bit of interaction I've had with the devs of the upcoming Electric Image update, they seem impressed with the new Macs.

    The problem I see is that many of the top CAD apps are pretty archaic, actually, and I just don't know if they'll get rewritten for Apple Silicon (Revit is a perfect example). Unless stuff like that gets running really well under emulation (which is maybe possible), Mac users will now be forced to decide if they can move to a different package (e.g. Vectorworks), or have to buy an additional PC, move to a PC, etc. They weren't forced into that when Apple was on Intel.
  • Reply 145 of 155
    @cgWerks

    Forgive me for not using quotes.

    For quite a large chunk of my career, 3D rendering was always done on the CPU; I used to have a room stacked with cheap PC render slaves as a render farm to get stuff done. Most 3D artists had the same. Somewhere around 2014-2015 there was a buzz around a new renderer called Octane, and it ran on the GPU rather than the CPU. It revolutionised 3D for us freelancers because we could replace a stack of PCs with a PC stuffed with GPUs instead. It was transformative, and many of my fellow C4D users leapt on the bandwagon. GPUs are highly parallel devices which are good at doing lots of simple tasks together, so they were a great fit for 3D rendering and proved an order of magnitude better than CPU rendering. The only downside: all the models and textures had to fit in VRAM, so there were scene-complexity limitations. But these limitations didn't really affect most freelancers, as our scenes are typically much less complex than studio productions. High-end productions are typically still rendered on the CPU on a farm because of RAM limitations; Hollywood scenes are hundreds of GB in size and way beyond what can be rendered on a GPU.

    I guess there are two types of GPU rendering to be aware of, too: there's the real-time viewport/game-engine style of rendering, which is becoming more and more popular in Archviz, and then there's full photorealistic final-frame rendering from Octane, Redshift, Cycles, etc. that calculates global illumination, accurate subsurface scattering, physically accurate materials and so on. Both of these types of GPU rendering have their place and are set on a collision course a few years into the future as hardware gets more powerful.

    The M1-series GPUs don't have ray-tracing hardware, which is a bit of a disappointment TBH, as it would improve performance dramatically. With architectural renderers like Twinmotion, even without the ray-tracing features of the PC version, you can still produce architectural renderings of exceptional quality, especially if you're going for a non-photoreal look, which is popular in the Archviz game. This could be an area where the M1's large unified memory is a USP, as it can hold scenes of much greater complexity than a typical consumer-level PC GPU. You have to buy a Quadro to get 48GB of VRAM and connect two together for 96GB, which is a very expensive alternative. I think the M1/2/3 Macs will find a place in 3D, but not for photorealistic rendering with Octane, Redshift et al., at least not for a while.
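
    To make the memory argument concrete, here's a back-of-the-envelope Python sketch. Every scene figure in it is invented for illustration, and it ignores BVHs, framebuffers and the unified memory macOS reserves for itself:

    # Rough check: does a scene's geometry + textures fit in GPU-accessible memory?
    def scene_footprint_gb(tris_millions: float, textures_gb: float,
                           bytes_per_triangle: int = 100) -> float:
        return tris_millions * 1e6 * bytes_per_triangle / 1e9 + textures_gb

    scene = scene_footprint_gb(tris_millions=120, textures_gb=30)  # ~42 GB

    for name, gb in [("24GB 3090", 24), ("48GB Quadro-class card", 48),
                     ("M1 Ultra with 64GB unified memory", 64)]:
        verdict = "fits" if scene <= gb else "does not fit"
        print(f"{name}: a {scene:.0f} GB scene {verdict}")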

    Another area where the M series should excel is the real-time performance-visuals sector. Macs were the hardware of choice at one stage due to the popularity of Quartz Composer; it was the core tech of a lot of software. Apple allowed Quartz to die and the market moved to PC, but now I think Apple Silicon Macs could see this area grow again. That large unified memory and the hardware codecs could revolutionise performance visuals, or at least allow performers to use a tiny Studio box instead of a massive PC rig. I can't wait to see what the software developers do with Apple Silicon Macs in this area.

    Apple is behind on raw GPU performance and GPU features like ray-tracing functions. That may come down the line, but right now, if you need brute strength, nVidia GPUs are way ahead. The excellent design of the M1 has allowed these flyweight GPUs to hit like a middleweight in certain situations, and there's no reason to rule out Apple punching even further above its weight with future hardware revisions. But definitely more GPU cores over CPU cores are needed for that; there are very few users who need more than 10 CPU cores. 20 and 40 CPU cores is a waste of silicon budget for me.

    What Apple has done exceptionally well is to make the M series the absolute king of video editing. The codecs on the SoC mean that the CPU doesn't get involved in decompressing the footage; the CPUs can concentrate on housekeeping while the GPUs compute the colour grade, filters and effects, and because they all share the same memory it maximises efficiency, though there is a limit with the current GPUs. Puget Systems run an excellent Resolve benchmark, and I bet the Mac Studio and future Mac Pro will be heading those benchmarks in a very short period of time. Only the very highest-spec PC, consuming much more power, will come close. If all I did was video work, a Mac Studio would be ideal and I'd be back on Mac full-time.

    Right now it pays not to be blinkered and to use the best hardware for the job: for most 3D work use a PC (Linux if possible); otherwise get a Mac for everything else.
  • Reply 146 of 155
    melgross Posts: 33,510 member
    melgross said:

    Your points didn't go over my head, most are just wrong. Video does use as many CPU cores as are available. That's why there are so many high-core-count CPUs in the first place, other than for servers. There are video tasks that go through the encoders/decoders, and the rest of the chip as well.

    OK so you are just clueless.
    No, you’re just ignorant.
  • Reply 147 of 155
    melgross Posts: 33,510 member
    cgWerks said:
    melgross said:
    Your points didn't go over my head, most are just wrong. ...
    I think you guys are talking about different kinds of rendering. ...

    The problem I see is that many of the top CAD apps are pretty archaic, actually, and I just don't know if they'll get rewritten for Apple Silicon (Revit is a perfect example). Unless stuff like that gets running really well under emulation (which is maybe possible), Mac users will now be forced to decide if they can move to a different package (e.g. Vectorworks), or have to buy an additional PC, move to a PC, etc. They weren't forced into that when Apple was on Intel.
    Fusion 360 has been on the iPad now for several years. It began simply, but it is now a very sophisticated app. It's being, or already has been, ported to the M-series Macs. CAD developers have been stating that they are very interested in the new machines. Go to the site Architosh.
  • Reply 148 of 155
    cgWerks Posts: 2,952 member
    UrbaneLegend said:
    @cgWerks ...

    Right now it pays not to be blinkered and to use the best hardware for the job: for most 3D work use a PC (Linux if possible); otherwise get a Mac for everything else.
    Thanks for the in-depth background and information!

    Yeah, when I did a lot of CAD/3D work in the past (I spent over half a decade heavily into it, then more as a hobby since), I was using Electric Image Animation System (dating myself there! I used to hang in forums with Alex Lindsay, people from ILM, people working on the Matrix, etc.) and a Spatial ACIS-based 3D CAD app (Vellum Solids -> Ashlar Cobalt -> Concepts Unlimited -> ViaCAD now. Same product.) I was doing mostly architectural design (in a mechanical sense, though!) and visualization.

    My scenes were HUGE and complex. I modeled every detail, so it wasn't like game design or even a lot of 'illustrative' architectural stuff. I was dealing with many millions of polygons (once tessellated for rendering) even back in the late 90s. EIAS was one of a few apps that could even handle that very well.

    I more recently learned Revit to get back into the industry, but who knows what I'll do on my own, as I want to get more into architectural visualization. BTW, EIAS has an update in the works, so I'm quite interested to see what they do. I'm probably going to start playing with Blender, as it has gotten quite advanced. I'm really interested in the huge number of renderers out there these days. Making a choice must be tough!

    But, from your post, it looks like how well the new Macs might do will depend a lot on which type of 3D you do, not just that you do 3D. I wouldn't have considered a lot of that, so thanks much! Great points about how all that shared RAM might be used. I was looking, and even the 3080/3090 are in that ballpark of 800 GB/s (a bit below and above, respectively). If the speed is the same, then it shouldn't matter much whether it is dedicated or shared RAM, I'd think. It just provides greater flexibility.

    Yeah, for video production, it looks like Apple will own that. And, yeah, at this point I just run one machine, which will be a Mac. But, if I start making money on my own with 3D work, then I'll pick the most money-making machine for that work (and probably use a Mac for everything else).
  • Reply 149 of 155
    cgWerks Posts: 2,952 member
    melgross said:
    Fusion 360 has been on the iPad now for several years. It began simply, but it is now a very sophisticated app. It's being, or already has been, ported to the M-series Macs. CAD developers have been stating that they are very interested in the new machines. Go to the site Architosh.
    Oh, that's interesting re: Fusion 360. I used to follow Architosh closely. I need to start doing that again.
    I'm really hoping Spatial gets ACIS running native, as that is the engine of the CAD app I love. I think I saw a post that it runs on the M1 under Rosetta2, though, so at least it isn't impossible to run until they do.

    Yeah, I'm hoping it is just software. As you point out, some seem quite impressed. For others, workflows seem unworkable (I saw a more organic modeling attempt in a YouTube video on an M1 Max MBP with 32 GPU cores that was not usable. Can't remember the app now.)
  • Reply 150 of 155
    cgWerks said:

    Thanks for the in-depth background and information!

    Yeah, when I did a lot of CAD/3D work in the past ... I was using Electric Image Animation System (dating myself there!) ... I'm probably going to start playing with Blender, as it has gotten quite advanced. ...

    But, from your post, it looks like how well the new Macs might do will depend a lot on which type of 3D you do, not just that you do 3D. ...
    Well, I started out on Imagine 3D on the Amiga, then LightWave, then C4D. I now use Houdini and a bit of Blender. So who's dating who?

    Apple has a huge team of in-house developers who have assisted greatly in porting Redshift, Octane and now Blender's Cycles to Metal. I saw yesterday that Apple is committing huge amounts of development to Blender: they are writing a new Metal backend so Blender runs right on top of Metal for the viewport and for EEVEE, Blender's real-time renderer (Cycles is Blender's photorealistic renderer). I also saw one of the Apple developers say they expect to get more performance out of Cycles in the coming months as they begin to lean more on the M1 architecture. So even for Apple it's an iterative process to get the best out of the new architecture. BTW, I highly recommend Blender; it has blossomed into an incredible app and Apple is playing a big part in its development. Blender is going places, and it pays to know it as so many more studios are using it now.

    It's still very early in the development cycle for M1-compatible apps and developers are still finding their feet, but it won't take long before everything is on M1 and highly optimised.

    As you're interested in Archviz, I would check out Twinmotion and Enscape, and also Unity and Unreal Engine, as I know for a fact many of the biggest architecture studios in the world are moving towards real-time rendering solutions: they fit so well for highly stylised, clean renders, and you can also iterate so much quicker. The Mac Studio can access a huge amount of VRAM, and if you make use of that in your Archviz work it could be quite the USP. To get an nVidia GPU with similar memory you'd have to spend more than the price of a Mac Studio just on the GPU! There are definitely areas of opportunity for Apple Silicon Macs in 3D.

    The video editing market is now completely owned by Apple; they own it so much that a Mac mini performs better than a lot of high-end PCs. That's not hyperbole, that's fact!


  • Reply 151 of 155
    melgross Posts: 33,510 member
    cgWerks said:
    melgross said: ...
    Oh, that's interesting re: Fusion 360. I used to follow Architosh closely. I need to start doing that again. ...
    Yeah, I'm hoping it is just software. As you point out, some seem quite impressed. ...
    I'm pretty sure the 16” MacBook Pro perked up a number of ears in the industry. But the new Studio will certainly arouse more than just ears! It's power that CAD companies have complained about; now they have it. The Mac is actually very popular in Europe for CAD and particularly architectural work. 
  • Reply 152 of 155
    melgross said:

    The Mac is actually very popular in Europe for CAD and particularly architectural work. 
    Says who? Define 'popular', because that couldn't be further from my 25+ years of experience.

    I've never seen a Mac running CAD, and the one person I know doing Archviz on a Mac was using C4D and was seen as an oddball. Most Archviz at the architecture-studio level is done with Max + V-Ray. I'm not saying no one uses a Mac for CAD or architecture work, but Macs are far from popular in these markets.
  • Reply 153 of 155
    cgWerks Posts: 2,952 member
    UrbaneLegend said:

    Well, I started out on Imagine 3D on the Amiga, then LightWave, then C4D. I now use Houdini and a bit of Blender. So who's dating who?

    Apple has a huge team of in-house developers who have assisted greatly in porting Redshift, Octane and now Blender's Cycles to Metal. ... BTW, I highly recommend Blender; it has blossomed into an incredible app and Apple is playing a big part in its development. Blender is going places, and it pays to know it as so many more studios are using it now.

    ... As you're interested in Archviz, I would check out Twinmotion and Enscape, and also Unity and Unreal Engine, as I know for a fact many of the biggest architecture studios in the world are moving towards real-time rendering solutions... The Mac Studio can access a huge amount of VRAM, and if you make use of that in your Archviz work it could be quite the USP. To get an nVidia GPU with similar memory you'd have to spend more than the price of a Mac Studio just on the GPU! There are definitely areas of opportunity for Apple Silicon Macs in 3D. ...
    Heh, a bit more history, if interested ...
    ~~~
    Yeah, my first computer was an Atari 600XL with a cassette 'drive' (go out and play basketball while your app/game was loading up!). I learned years earlier on TRS-80s and was going to get the CoCo, but a cassette of apps a friend had given me got stolen, and a local store had a 'going out of business' sale on the Atari stuff. I don't think I had any CAD on there, but some art apps. I upgraded to a 1040ST because it did have some DTP/CAD/illustration apps, and that's where I also got my first taste of crude 3D stuff and was hooked. (I just didn't get to professionally use any of it for years.)

    When I did, the ID firm I was working with was sub-contracting with an architectural firm using AutoCAD/3DS Max. It quickly became apparent that how we were designing/building our industrial housing project wasn't going to work with that level of software tech. I went on the hunt for something that could match the workflow, and found Tim Olsen *just* starting to add 3D solids to Ashlar's CAD apps (another ID firm - Fitch Richardson Smith, a client of mine from IT consulting - had introduced me to Ashlar Vellum, and I'd been impressed by the UI). I called Ashlar and explained our needs. The sales rep (bless him!) said that the current product couldn't do what we needed (it was just 'simulated' 3D... line-drawing in 3D space, but everything had to be manually drafted, etc.), but did I have a few minutes to talk to one of the devs? I sure did, and a bit later I got a call from Tim Olsen, who graciously spent an hour or more on the phone with me trying to understand our needs. It was a fit, so we bought the product, and I started using it even before it was officially released to the beta team a couple of weeks later! ViaCAD/SharkCAD is the current product line, and Tim is still heading it.

    One of the architects at the above firm was quite interested in what I was doing on the CAD front, and was showing me 3DS Max. He was frustrated by some aspects of it (early days), and was wondering about EIAS. He had one of their slick marketing kits in his desk and a demo tape we watched. I was pretty blown away, and we talked the owner of the ID firm into investing in it (if I recall, it was something like $6500). I ended up just diving into those two apps for several years and doing some revolutionary work at the time. We had potential investors coming in from all over, and they couldn't believe what we were doing in terms of CAD/3D. Unfortunately, as things sometimes go in the real world of venture capital (owners with dreams/egos, etc.), I couldn't come to a longer-term employment contract and the project eventually went under (a few years after I left). Really sad, as it was truly a game-changer in building technology; I've since seen only partially similar concepts in the building industry. But my dream career kind of vaporized and I went back to IT. I'm now attempting to make the jump, once again, back into something I love... but it hasn't been an easy road!
    ~~~

    I'm really glad to hear Apple is getting involved. I wonder if it is just that line of technology, or a bunch of other companies? I know Vectorworks is doing quite well, and being prominently featured. I'd so love to see Apple really take that industry by storm... not just because I'm an Apple fan, but because so much of the software/tech is just awful. I learned Revit to get back into the area (as I was told that was the package to learn), and as powerful as it is... it was a painful experience. Horrid UI in just about every regard. But, that's the current industry standard (at least among firms who have moved on from AutoCAD).

    Yes, thanks for the Blender -> Unreal Engine list. I have those apps/renderers marked out to look into. So much to learn! Absolutely, learning Blender won't hurt even if I ultimately end up using another app for one reason or another. The big question these days seems to be whether one can stay in-app, using default/plugin renderers, or whether to use a dedicated 3D/rendering pipeline apart from the CAD app. Much tougher decision than it used to be, anyway.

    Thanks for the take on the VRAM and Apple Silicon. Until you pointed it out, I hadn't appreciated that nuance about how particular 3D workflows would benefit from it vs AMD/nVidia cards. That seems to be a key point readers of this thread interested in 3D/CAD should pick up on!
  • Reply 154 of 155
    That's great, it's always nice to read someone else's journey.

    The CAD/BIM world is one that has eluded me over the years, and I know very little about it other than that the market is dominated by Autodesk products in the UK. I've done my fair share of animations for architectural projects (not Archviz per se, but motion graphics for investors or planning committee approvals etc.), but I have zero experience in the nuts and bolts of doing the CAD. I wish you all the best in getting back into the 3D world, and if there are any questions you think I can answer regarding rendering etc., you're welcome to PM me here.


  • Reply 155 of 155
    cgWerks Posts: 2,952 member
    UrbaneLegend said:
    That's great, it's always nice to read someone else's journey. ... The CAD/BIM world is one that has eluded me over the years ... I wish you all the best in getting back into the 3D world ...
    Heh, it's evading me at the moment as well, but hopefully not for long. I'm just hoping enough smaller firms pick it up (Revit, Vectorworks, etc.) that I can land a job at one of those, and not end up in some mega-corp documenting electrical outlets in skyscrapers all day.

    Thanks!