SkylightActive

About

Username
SkylightActive
Joined
Visits
5
Last Active
Roles
member
Points
30
Badges
0
Posts
10
  • Apple Silicon Mac mini dev kit looks like a desktop iPad Pro

    The GPU side of things
    Companies usually license each other's IP. Apple licensed Imagination's GPU designs until 2017, when they terminated the agreement. Apple believed they could design better GPUs themselves, without needing PowerVR. If you think about it, it's pretty bold: a company that doesn't do GPUs takes GPU chip design entirely in-house. The fruits of that decision are evident in their year-over-year GPU performance gains.

    Funny enough, they recently signed a new license agreement with ImgTec. That new agreement is reported to cover ray tracing IP "optimized for mobile". ImgTec's approach to ray tracing is to offer the capability while using as little power as possible, which aligns with Apple's move to ARM for the Mac: performance per watt.

    Nvidia and AMD both have roadmaps for beefy GPUs with ray tracing: Ampere and RDNA 2. Both are paired with tech like DLSS, which uses AI to make ray calculations more efficient without having to keep adding hardware to scale them. Max power for max performance. Apple has to either compete with these or simply license them and add them onto their chip, but looking at how things are going, I don't think that's going to happen. Apple doesn't want power-hungry chips; it's going to be a different approach. x86-based CPUs and GPUs are power-driven, while Apple's ARM aims for the least power with more performance. I foresee Apple taking the GPU fully in-house. They are clearly focused on machine learning, and I think they have something like DLSS that we don't know about (we kinda got a hint in a developer video at WWDC19 when they talked about ray tracing on the Metal API).

    If ray tracing and brute-force GPU power are the name of the game in 2020, then Apple has a lot of work to do to match or exceed what's about to come from AMD and Nvidia. I foresee Apple taking the same path as the Xbox Series X CPU/GPU combo for the bigger MacBook Pro. I don't know much about chip fabrication or engineering limits, but placing, say, 40-50 GPU compute units on the same die isn't too far-fetched. If Microsoft and Sony are shipping custom APUs with setups like that, I don't see why Apple won't; it's just that it will be their own architecture instead of, say, AMD's. Forget Nvidia. That relationship is long gone.

    Either way, Apple knows the 16" MacBook Pro has to perform. Giving customers a subpar GPU is a death sentence I don't think they are willing to risk. The pattern here is that Apple realizes they can't keep pissing prosumers off: the 15" MacBook Pro's thermal throttling, the keyboard, the screen, etc. I think they know what's at stake; the birth of the 16" MacBook Pro is proof of that. When you ask customers to join you in a complete brain transplant, you need something to back it up, particularly for the pros. They are hard to convince, so Apple has to deliver. This mysterious chip is so tightly guarded that we're all scrambling for any juicy detail possible. I, for one, believe in Srouji and am positive about what he will bring to the table.

    What could happen is something that may be too out there (my opinion): Apple could enlarge the SoC die and house all the guts (from their WWDC 2020 slide) with a massive CPU, GPU, and memory. Doing so would free up space on the logic board for other things like... "COOLING". If ARM is so good at thermal efficiency, then fitting everything inside the SoC while retaining the fans and cooling could let the Mac laptop really go beyond: a laptop with the power of a desktop, a sales pitch they could really gloat about. We saw some evidence of this in the 50-watt envelope of the AMD Radeon Pro 5600M in the 16" MacBook Pro: max out the compute units, but keep everything cool. And it performs excellently from what I can see. The fact that Apple asked AMD to custom-build that chip gives a glimpse of what's to come, I believe. Nvidia and AMD have to build separate GPU dies simply because they don't make the CPUs themselves (apart from AMD's APUs), or because Intel blocked them.

    Apple stands in a different position, able to bridge what AMD and Nvidia couldn't. We've always believed that a dedicated GPU will beat an integrated one, but what if we are witnessing an industry transition, where the many components spread across the motherboard get replaced by a single large chip? You remove latency. The first person who thought of the integrated circuit probably had the same idea, and people probably thought he was nuts. In the grand scheme of things, I believe this is going to change how motherboards are designed and built. SoCs have existed for some time, but pushing them to the forefront of the mainstream market is something rather exciting. These are all speculations and personal predictions, obviously, but worth thinking about.