aderutter

About

Username
aderutter
Joined
Visits
100
Last Active
Roles
member
Points
1,344
Badges
1
Posts
640
  • First Mac Studio M3 Ultra benchmarks significantly outpace the M2 Ultra

    For my type of workload, including 3D hardware ray tracing, the jump from the M2 range to the M3 range was bigger than the jump from the M3 to the M4 - so it feels frustrating that the new Ultra is M3-based and thereby not as good as we hoped it would be. But I think the Ultra will be better than people expect, or rather an M4 Ultra would not have been that much better than the M3 Ultra.

    For my 3D work the GPU isn’t really important, as I primarily use ZBrush nowadays and that is entirely CPU based. Even the Redshift renderer that ZBrush comes with is CPU based (though I could subscribe to the GPU version if I felt the need). More RAM and a better CPU matter more to me than more GPU for my 3D work, as I spend 99% of my time not rendering.

    We’ll have to wait and see what the review tests show to get a better idea of real-world performance, but anyone doing significant video work, especially those making use of the specific hardware encoders/decoders, could see a big benefit from the Ultra - though unless they are doing massive projects in 8K they might not notice much beyond shorter rendering times.

    My other half is a graphic designer and she recently moved from a 2019 Intel iMac to a 2024 M4 Pro Mac Mini with a Studio Display (both setups cost about the same), and she doesn’t feel it is that much better, apart from the heat/fans and of course being able to run the latest versions of certain programs.

    The horse train analogy reminds me of something I read about ZBrush and cores some time ago: to work out when more cores is better, you multiply the core frequency by the number of cores, and the bigger number wins…

    M3 Ultra = (4.05 x 20) + (2.75 x 8) = 103
    M4 Max = (4.5 x 10) + (2.95 x 4) = 56.8

    an M4 Ultra would have been: (4.5 x 20) + (2.95 x 8) = 113.6 

    … I just searched for the source and found this: “This means that a 6 core CPU at 2.5 GHz would have a score of 15 while a 4 core CPU at 3 GHz would only have a score of 12. The former edges out the latter even though each core is slower.” from https://www.reddit.com/r/ZBrush
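    For anyone who wants to play with that heuristic, here’s a minimal sketch in Python (the clock speeds and core counts are the illustrative figures quoted above, not official Apple specs, and core_score is just a made-up name for the calculation):

        # Rough sketch of the "cores x frequency" heuristic described above.
        # Clock/core figures are the illustrative numbers from this post, not official specs.
        def core_score(clusters):
            """Sum of clock speed (GHz) x core count over each core cluster."""
            return sum(ghz * cores for ghz, cores in clusters)

        m3_ultra = core_score([(4.05, 20), (2.75, 8)])   # 103.0
        m4_max   = core_score([(4.5, 10), (2.95, 4)])    # 56.8
        m4_ultra = core_score([(4.5, 20), (2.95, 8)])    # hypothetical: 113.6

        print(m3_ultra, m4_max, m4_ultra)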


  • Giant foldable iPad with MacBook-like design rumored to arrive in 2028

    Wacom killer.
  • Generation gaps: How much faster Apple Silicon gets with each release

    If you look at Adobe apps, for example, they benefit more from CPU as long as the GPU is above a certain level. Once a user has a mid-range GPU they don’t need more GPU as much as they need RAM & CPU (and the real-life RAM requirements have decreased with Apple Silicon).

    Another example is ZBrush, which is purely CPU based. Even in other 3D applications the CPU is more important most of the time, as people working in 3D spend more time not rendering than rendering, and the machine can render while you put the kettle on.

    It’s gamers and some 3D renderers that use more GPU - but CPU 3D rendering is more accurate, so CPU rendering (obviously with render farms) is the default in Hollywood, while we mortals have to use what’s available on our budgets - typically a desktop GPU rather than a cloud render. When thinking only of rendering for games or lower-end 3D work, the usual options are GPU (cheap and fast on PC) or CPU (slower, more accurate, and generally slightly better on the Mac).

    When/if Apple release an M4 Ultra with twice the GPU performance of the M4 Max, it should be equivalent to an Nvidia 4090 and set the cat amongst the pigeons. 2025 could be the start of Apple desktop disruption.

  • EU hits back at Apple withholding Apple Intelligence from the region

    To anyone sane it’s clear that the “gatekeeper” term is an abhorrent socialist fiction, akin to something that belongs in the annals of communism. Preventing companies from making products as they see fit, rather than letting the people/market sort the wheat from the chaff, is itself the epitome of anti-choice/anti-commercialism.

    Microsoft could try to build a smarter phone than the iPhone or Pixel if they wanted to (they seem to be learning a little from copying Mac hardware, so maybe that could rub off in the mobile space too).

    Apple could try to make a new social media network if they wanted to (it’s probably something they still think about given iMessage and the number of connected users they have).

    Meta could try to build a new online shopping experience to replace Amazon if they wanted to (it’s probably in their long term plans).

    Of course, a new upstart could invent a new paradigm replacing any or all of the above.

    If it happened it might be a Chinese startup, and you just know how the EU would react to that, especially if EU companies couldn’t compete on price - one only has to look at the EU’s threatened actions against Chinese electric vehicle companies.

    Vestager clearly studied the fascist Mussolini but veils herself politically, trying to avoid people seeing her intended one-nation EU. No wonder the right is on the rise in Europe.

    Given that the EU thinks it’s okay to write vague laws and create traps for large companies to fall into so it can fine them, it’s no wonder a sensible company like Apple has to think long and hard before launching new products or services there. I don’t think we can avoid a two-tier tech world developing, with the EU becoming second-class citizens by their leaders’ design. It’s almost as if history repeats itself.
  • An Apple Vision Pro successor may need to be tethered to an iPhone or Mac

    Many of us expected the AVP or AV to be tethered to an iPhone, iPad or Mac.

    It makes sense to use the computer we already have for most of the processing, remove the Mx chip from the headset and put the rechargeable battery in the headset instead - a cable would still be needed for the data connection / video bandwidth (think how Sidecar between a Mac and an iPad works far better over a USB-C cable).

    I guess Apple wanted to get something out there earlier than the end of 2025 to show what they can do.

