melgross
About
- Username: melgross
- Joined
- Visits: 127
- Last Active
- Roles: member
- Points: 10,981
- Badges: 2
- Posts: 33,729
Reactions
- AMD to unveil Radeon RX 6000 GPU family on Oct. 28
tht said: melgross said: The latest Nvidia GPUs have over 25 billion transistors, and an energy budget to support that. There is simply no way Apple can even begin to approach that. It takes Nvidia’s highest GPU boards to enable ray tracing in a useful way. Remember, we’re talking not about ray tracing in static imagery, but in dynamically rendered images at a rate of at least 60 fps. That’s real-time tracing. It’s in addition to every other graphics demand, such as real-time physics, dynamic texture generation, hidden-surface elimination, etc. These all additionally require very large amounts of graphics RAM.
Transistor budgets of over 25 billion are not that daunting for Apple Silicon. It's the power budgets that are. They just aren't going to run any of their SoCs at over 150 Watts, and I'd bet they want 100 W or lower. That will be self-limiting, so performance probably won't match these 300 W dGPUs or 200 W CPUs on average.
I expect them to be at parity with, or possibly more performant than, dGPUs in each Watt class, though. The fab advantage will give them about a 30% perf/W advantage. The density advantage is going to be about 100%. They have so many transistors to play with on TSMC N5 that you wonder if they are willing to use it all.
melgross said: But none of us know what Apple’s plans really are. We can speculate, but that’s about it. And Apple has told us the basics during the June conference. In fact, they told us more than I expected. I listened very closely. I looked at their charts. We don’t see a separate GPU. At least not for the foreseeable future. I suspect that we’ll see an SoC that competes with Intel very well in the beginning, likely exceeding the performance of Apple’s comparable previous machines. I don’t expect to see overwhelming GPU performance. I would expect to see performance at the level of a lower-end separate GPU.
Anything that exceeds that would thrill me.
I really don’t care how efficient ARM instructions are, or how good Apple’s engineering is. There are physical laws that everyone has to follow, and it’s not likely that Apple is special here. Not only is Apple not going to run over 150 watts, they likely don’t want to run much over 20 watts. There’s only so much they can do.
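A minimal back-of-the-envelope sketch of the power-budget argument above (not from either poster; the 300 W and 100 W classes and the 30% perf/W figure are taken from the thread, and treating performance as simply power times perf/W is an assumption):

    # Rough, illustrative Python sketch; every constant is an assumption
    # pulled from the thread, not a measured figure.
    dgpu_power_w = 300          # high-end dGPU board power class
    apple_power_w = 100         # assumed ceiling for an Apple SoC GPU
    perf_per_watt_gain = 1.30   # assumed fab/process perf-per-Watt advantage

    # If the only differences were power budget and perf/W, the lower-power
    # part would land at roughly 100 * 1.3 / 300, i.e. about 43% of the big board.
    relative_perf = (apple_power_w * perf_per_watt_gain) / dgpu_power_w
    print(f"~{relative_perf:.0%} of the 300 W dGPU, all else being equal")

Which lines up with expecting parity per Watt class rather than parity with the largest boards.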
- AMD to unveil Radeon RX 6000 GPU family on Oct. 28
tht said: melgross said: tht said: melgross said: tht said: I'm betting that the A14 GPUs will have hardware support for raytracing. So, the iPhone will be the first Apple device to have hardware raytracing support, not Macs with AMD GPUs.
Anyways, anyone want to clarify AMD GPU codenames? Both the chip and graphics API support? Getting confusing out there. Navi, Navi 14, Navi 12, RDNA, RDNA2, so on and so forth.
How Apple could duplicate that in an integrated-graphics SoC is something that I doubt right now, even assuming it’s something they’re looking at. Ray tracing is one of the most difficult things to do in real time. The number of calculations is immense. Apple also uses a small amount of shared RAM, which isn’t considered the best way to power graphics hardware, so we’ll see.
As far as iGPUs versus dGPUs go, the common distinctions between the two are gradually becoming less and less distinct. If you look at the Xbox Series X SoC, it's basically a 12 TFLOP GPU with a vestigial CPU (an 8-core Zen 2) in terms of the amount of area the GPU and CPU occupy. And this is fabbed on TSMC 7nm. TSMC 5nm yields 70% more transistors. So, I can see a TSMC 5nm SoC with raytracing hardware in it. Intel, with Tiger Lake, is getting closer to this, with its iGPU taking about 40% of the chip area. This will drive dGPUs to higher-performance niches, and commensurately higher Watt tiers. It's going to be interesting to see how the dGPU market shakes out. Maybe server GPUs will make up the vast majority of dGPU sales in a few years as iGPUs start to take even more of the mid range of the gaming GPU market.
It's still a very big question on how much power Apple is willing to burn on the GPU and how they get more perf/Watt out of the GPU than competitors. The obvious way is to run the iGPU at low clocks, have a lot of GPU cores, and have a high bandwidth memory subsystem to feed it. This will require a lot of chip area and transistors, which is something Apple may have the luxury for, as they don't have to have 50% margin on their chips, and they are on the densest fab.
While Apple’s built-in GPU will compete well against Intel’s new integrated graphics, which is twice as powerful as the current generation (to be replaced next month), and possibly against low-end GPU boards, and possibly even some lower-mid-range boards, none of that can do ray tracing or other really high-level computations. If Apple manages to come out with a separate GPU, then things could be somewhat different. But Apple has stated that one reason their GPU will be competitive is because of their “unique” admixture of these other SoC elements, and the software tying it all together. I’ve stated that possibly, if Apple can figure out how to put those elements on a separate GPU, and give that GPU sufficient bandwidth with proper graphics memory, and a lot of it, with a decent power budget, they could really have something.
But none of us know what Apple’s plans really are. We can speculate, but that’s about it. And Apple has told us the basics during the June conference. In fact, they told us more than I expected. I listened very closely. I looked at their charts. We don’t see a separate GPU. At least not for the foreseeable future. I suspect that we’ll see an SoC that competes with Intel very well in the beginning, likely exceeding the performance of Apple’s comparable previous machines. I don’t expect to see overwhelming GPU performance. I would expect to see performance at the level of a lower-end separate GPU.
Anything that exceeds that would thrill me.
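A similarly hedged sketch of the density point above: the 12 TFLOP figure and the 70% density gain come from the post, while assuming throughput scales linearly with transistor count at the same die area and clocks is purely illustrative.

    # Illustrative only: what "70% more transistors" could mean for a GPU of
    # the same die area, assuming performance scales with transistor count.
    xbox_gpu_tflops_7nm = 12.0   # from the post: ~12 TFLOP GPU on TSMC 7nm
    density_gain_5nm = 1.70      # from the post: 5nm yields 70% more transistors

    same_area_tflops_5nm = xbox_gpu_tflops_7nm * density_gain_5nm
    print(f"~{same_area_tflops_5nm:.0f} TFLOPs in the same die area")  # ~20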
- AMD to unveil Radeon RX 6000 GPU family on Oct. 28
tht said: I'm betting that the A14 GPUs will have hardware support for raytracing. So, the iPhone will be the first Apple device to have hardware raytracing support, not Macs with AMD GPUs.
Anyways, anyone want to clarify AMD GPU codenames? Both the chip and graphics API support? Getting confusing out there. Navi, Navi 14, Navi 12, RDNA, RDNA2, so on and so forth.
How Apple could duplicate that in an integrated-graphics SoC is something that I doubt right now, even assuming it’s something they’re looking at. Ray tracing is one of the most difficult things to do in real time. The number of calculations is immense. Apple also uses a small amount of shared RAM, which isn’t considered the best way to power graphics hardware, so we’ll see.
- Netgear Orbi Pro WiFi 6 Tri-band Mesh System brings reliable WiFi to your small business
tenthousandthings said: Does this usually mean the home versions will get an update soon?
Our AirPort Extreme home network is due for an update; the lack of mesh capabilities has caused a few problems lately. Any recommendations? eero vs. Velop vs. Orbi?
It’s why I haven’t yet moved to anything else. Why do I have so many? My house was built in 1925. The old methods resulted in interior brick walls, with wood lath and steel expanded-metal mesh over all walls and ceilings. Over that is 3/4” mortar and 1/4” plaster, with many coats of paint, among which the early layers are lead based. Not dangerous, by the way, unless that gets exposed and peels. In other words, much of my home is a Faraday cage. Apple’s routers work very well, giving me from 375 to 550 Mb/s everywhere, though lower in the basement. That’s better than the performance of most mesh routers currently being sold. If it weren’t for WiFi 6, I might be willing to wait another couple of years. I’ve been looking at this new Netgear model and have been thinking about trying it. But I want to learn a bit more about how strong the signal really is.
- Apple isn't getting $454M back from VirnetX because it waited too long to ask
mordac_the_preventer said: If this stands, it means that in future, anyone found to infringe will refuse to pay up until all possible avenues for appeal are exhausted.
So any legitimate patent holders will have to wait essentially forever for any royalties/damages.