tht

About
- Username: tht
- Joined
- Visits: 196
- Last Active
- Roles: member
- Points: 8,042
- Badges: 1
- Posts: 6,033

Reactions
New Mac Pro may not support PCI-E GPUs
mjtomlin said:
1. If these new Mac Pros do not have support for all the PCIe cards that the current Intel Mac Pro supports, this system will fail. Most users interested in this system will be those looking to upgrade from their current Mac Pro and will want to bring their extremely expensive MPX modules (GPU cards) with them. The advantage of PCIe slots isn't just expandability, it's also portability - being able to move those cards to another system.
2. RAM is not on the SoC, it's on-package and can easily be moved to the motherboard. There's no reason Apple cannot do this - yes, there will be a performance hit.
1. Now, Apple isn't holding up their end of the bargain by not offering Apple Silicon GPUs competitive with AMD and Nvidia graphics cards, but they surely know that. Sounds like it hasn't gotten bad enough for them to reverse the decision. The decision surely has huge ramifications for macOS design too.
For other PCIe cards like I/O cards, storage, audio, and whatnot, I bet they will be supported if the Mac Pro has PCIe slots, which I think it will.
2. You literally stated a reason for Apple not to do it: the GPU is going to take a performance hit with less memory bandwidth. Before Apple started showing their architecture, a lot of people were contemplating how they were going to have system memory feed the GPU. 4 to 8 stacks of HBM2e? Lots of GDDR memory? Their own memory stacking solution (memory cubes et al)? 8 to 12 DDR5 channels? Turns out they decided on a gazillion channels of commodity LPDDR. Perhaps their GPU scaling issue is really a latency issue with LPDDR, and they really need a high-clock memory solution (GDDR) to fix it? I don't know.
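For a sense of scale, here's a rough back-of-envelope sketch of the peak bandwidth those options imply. The bus widths, data rates, and channel/stack counts below are illustrative assumptions chosen to match the ranges mentioned above, not Apple's actual configurations.

```python
# Rough peak-bandwidth arithmetic for the memory options mentioned above.
# All bus widths, data rates, and channel/stack counts are illustrative
# assumptions, not Apple's actual configurations.

def peak_bandwidth_gbs(bus_width_bits: int, data_rate_gtps: float) -> float:
    """Peak bandwidth in GB/s = (bus width in bytes) * (transfers per second in GT/s)."""
    return (bus_width_bits / 8) * data_rate_gtps

options = {
    # name: (bus width per channel/stack in bits, data rate in GT/s, count)
    "LPDDR5, 32 x 16-bit channels": (16,   6.4, 32),
    "DDR5, 12 x 64-bit channels":   (64,   5.6, 12),
    "GDDR6, 16 x 32-bit chips":     (32,  16.0, 16),
    "HBM2e, 6 x 1024-bit stacks":   (1024, 3.2,  6),
}

for name, (width, rate, count) in options.items():
    total = peak_bandwidth_gbs(width, rate) * count
    print(f"{name:30s} ~{total:5.0f} GB/s")
```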
New Mac Pro may not support PCI-E GPUs
cgWerks said:
tht said:
… The big issue is, how do they get a class-competitive GPU? Will be interesting to see what they do. The current GPU performance in the M2 Max is intriguing and if they can get 170k GB5 Metal scores with an M2 Ultra, that's probably enough. But it's probably going to be something like 130k GB5 Metal. Perhaps they will crank the clocks up to ensure it is more performant than the Radeon Pro 6900X …
I suppose if they keep scaling everything up, they'll kind of get there for the most part. But remember, the previous Mac Pro could have 4x or more of those fast GPUs. Most people don't need that, so maybe they have no intention of going back there again. But I hope they have some plan to be realistically competitive with more common mid-to-high-end PCs with single GPUs. If they can't even pull that off, they may as well just throw in the towel and abandon GPU-dependent professional markets.
The GPU team inside Apple is not doing a good job with their predictions of performance. They have done a great job at the smartphone, tablet, and even laptop level, but getting the GPU architecture to scale to desktops and workstations has been a failure. Apple was convinced that the Ultra and Extreme models would provide competitive GPU performance. This type of decision isn't based on some GPU lead blustering that the architecture would work; it should have been based on modeled chip simulations showing that it would work and what potential it would have. Only after that would a multi-billion-dollar decision be made. So, something is up in the GPU architecture team inside Apple imo. Hopefully they will recover and fix the scaling by the time the M3 architecture ships. The M2 versions have improved GPU core scaling efficiency, but not quite enough to make a 144 GPU core model worthwhile, if the rumors of the Extreme model being canceled are true (I really hope not).
If the GPU scaling for the M1 Ultra were, say, 75% efficient, it would have scored about 125k in GB5 Metal - about the performance of a Radeon Pro 6800. An Extreme version with 128 GPU cores at 60% efficiency would be 200k in GB5 Metal. That's Nvidia 3080 territory, maybe even 3090. Both would have been suitable for a Mac Pro, but alas, no. The devil is in the details. The Apple Silicon GPU team fucked up imo.
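A quick sanity check of those figures, assuming a simple model where the multi-die score is the single-die score scaled by core count and a scaling-efficiency factor. The ~83k single-die (32-core) GB5 Metal baseline is not a measured number; it is back-calculated from the 200k / 128-core / 60% figure above.

```python
# Back-of-envelope check of the scaling figures above.
# Model: score ~= base_score * (cores / base_cores) * scaling_efficiency.
# The ~83k single-die baseline is an assumption inferred from the numbers
# quoted above (200k / 4 / 0.60 ~= 83k), not a measured result.

def scaled_score(base_score: float, base_cores: int, cores: int, efficiency: float) -> float:
    return base_score * (cores / base_cores) * efficiency

BASE_SCORE = 83_300   # hypothetical single-die (32 GPU core) GB5 Metal score
BASE_CORES = 32

ultra   = scaled_score(BASE_SCORE, BASE_CORES, 64,  0.75)   # 2 dies at 75% efficiency
extreme = scaled_score(BASE_SCORE, BASE_CORES, 128, 0.60)   # 4 dies at 60% efficiency

print(f"Ultra   (64 cores,  75% scaling): ~{ultra / 1000:.0f}k GB5 Metal")    # ~125k
print(f"Extreme (128 cores, 60% scaling): ~{extreme / 1000:.0f}k GB5 Metal")  # ~200k
```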
Intel just took the worst beating in earnings in over a decade
ITGUYINSD said:
The article doesn't really explain WHY Intel sales were so bad, especially during the holiday season. Is new gear too expensive for consumers? Are people keeping their computers longer? Or are consumers moving towards more mobile devices?
It doesn't help Intel that major sellers like Dell have raised their prices, making one think twice about buying or upgrading.
MS reported a 40% decline in Windows revenue. PC sales declined by something like 35 to 40%. Crypto boomed during the pandemic; crypto is now busted, so all those server GPU sales are declining. Services (Google, Amazon, MS, etc.) boomed during the pandemic - work-from-home services and the like - and now that has busted too.
Apple will be lucky to hold the line at 20% declines imo. They said 3 months ago that Macs would decline. iPhone sales have slowed due to COVID lockdowns at various plants.
Apple still on track for iPad Pro revamp with OLED display in 2024
charlesn said:
A revamp with OLED? Wait, what? All of the display coverage lately has been about the eventual move, when prices come down, from OLED to mini-LED in computer applications because of several advantages. But the iPad 12.9 inch already has a mini-LED display... so we're gonna go backwards and replace it with OLED? This makes no sense.
LCD = monolithic backlight (iPA, iPad, MBA, MBP13, iMac24, ASD27)
miniLED = discretized backlight (MBP14/16, iPP12.9, Pro Display XDR)
OLED = emissive subpixels: the subpixels themselves are the lights, and they are organic (iPhones, Watch)
microLED = emissive subpixels as well, but inorganic
The miniLED backlights in the iPP12.9 and MBP14/16 have local dimming zones that are about 0.25" x 0.25" - about 2,500 of them. The 32" XDR has about 650 backlight zones, so each one is much larger. It's a large display, though, and expensive as a result. These miniLED displays trade some blooming in dark rooms for very high brightness - 1600 nits for HDR content. And they are presumably much longer lived.
OLEDs trade the other way: no blooming, but limited brightness. iPhones have 1200 nits of brightness for HDR content, which is pretty darn good, but they are more fragile than miniLED or LCD displays. Using them for 10 years may be a problem due to the organic compounds deteriorating. You can use a computer monitor for 10 years; a phone, not so much.
Apple has been trying to get Samsung and LG to make dual-layer OLED displays for Macs and iPads, and probably even external displays too. Those are more robust, can be driven to higher brightness, and will have longer lifetimes - so probably good enough for iPads and MacBooks. These are probably what is driving the iPad and Mac OLED rumors. There's a price per OLED display that everyone is trying to meet; they just haven't gotten there yet. Probably in the next couple of years.
microLED will be more robust because it uses inorganic compounds for the emissive subpixels, but it's going to be 4 to 5 years before they can make it affordable for 5 to 30 inch displays. An Apple Watch display in 2025? Maybe. 5 to 30 inch displays at 220 to 350 PPI? Probably another 5 years at least, and that leaves time for a 3 to 5 year cycle of using OLEDs or miniLEDs.
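The zone-size comparison above can be sanity-checked with a little geometry. The diagonals and aspect ratios below are approximations, and the zone counts are the rough figures from the post, so the output is only a ballpark estimate.

```python
# Approximate local-dimming zone pitch from panel size and zone count.
# Diagonals, aspect ratios, and zone counts are the rough figures discussed
# above, so this is only a ballpark sanity check.
import math

def panel_dimensions(diagonal_in: float, aspect_w: float, aspect_h: float):
    """Width and height (inches) of a panel given its diagonal and aspect ratio."""
    scale = diagonal_in / math.hypot(aspect_w, aspect_h)
    return aspect_w * scale, aspect_h * scale

panels = {
    # name: (diagonal in inches, aspect width, aspect height, approx dimming zones)
    "iPP 12.9 (miniLED)":        (12.9, 4, 3, 2500),
    "Pro Display XDR (miniLED)": (32.0, 16, 9, 650),
}

for name, (diag, aw, ah, zones) in panels.items():
    w, h = panel_dimensions(diag, aw, ah)
    pitch = math.sqrt((w * h) / zones)   # side length of a square zone
    print(f"{name:27s} ~{w:.1f} x {h:.1f} in, {zones} zones -> ~{pitch:.2f} in per zone")
```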
M2 Pro & Max GPUs are fast -- but not faster than M1 Ultra
mjtomlin said:
retrogusto said:
I'll be interested to see if anyone does benchmarks comparing the top two processors available in the new Mac Mini. I'm wondering if the fastest one will have any issues with throttling due to overheating in that small enclosure.
4 TB ports at 15 W and 2 USB-A ports at 10 W. That's 80 W, leaving 105 W for everything else: 40 W for the CPU cluster and 60 W for the GPU cluster, give or take. I do wonder if it can actually deliver 15 W out of every TB port and 10 W out of the USB-A ports simultaneously.
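Written out, the arithmetic looks like this. The 185 W total is the figure implied by the 80 W + 105 W split above (assumed here to be the machine's rated power supply), and the per-port numbers are maximums rather than typical simultaneous loads.

```python
# Port power budget implied by the numbers above. The 185 W total and the
# 40 W / 60 W CPU/GPU cluster split are assumptions from the post, not
# measured figures.

PSU_WATTS = 185                 # assumed total power budget

TB_PORTS, TB_W = 4, 15          # Thunderbolt ports x max W each
USBA_PORTS, USBA_W = 2, 10      # USB-A ports x max W each

port_budget = TB_PORTS * TB_W + USBA_PORTS * USBA_W   # 80 W
remainder = PSU_WATTS - port_budget                    # 105 W for SoC, SSD, fan, etc.

print(f"Port power budget: {port_budget} W")
print(f"Left for everything else: {remainder} W")
print(f"  e.g. ~40 W CPU cluster + ~60 W GPU cluster leaves {remainder - 40 - 60} W of headroom")
```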