elijahg said: I'm interested to see if Apple puts a real discrete GPU in the 27" iMac. If so, it'll need a CPU with many more PCIe lanes than the M1 has. The 5K display needs some serious grunt when rendering full-resolution 3D; my 2020 iMac with a fairly pokey Vega Pro 48 struggles at times. A tiny low-power integrated GPU has no chance.
So either they include a powerful discrete GPU or they create a powerful integrated GPU.
From a thermal design standpoint, having a discrete GPU is probably more likely. Perhaps the more interesting question is how they will handle the Neural Engine. Will they put that circuitry on its own chip?
One unknown is whether or not Apple would use a Neural Engine-only chip in their own server hardware; I wouldn't expect them to productize this. It would be used for their own cloud operations (Siri, image processing, video processing, whatever).
I would not be surprised if somewhere in a lab in Cupertino, there are prototype task-specific ASICs: 3D rasterization, AI, and signal processing (especially video encoding).
danox said: mcdave said: Microsoft just aren't good at products; they've always lacked the cohesive design capability good products require. They're the IT equivalent of Meccano (we can't make it do something useful, but maybe you can).
However, it is my belief that Apple can move a substantial portion of its data center operations away from x64 hardware to its own custom silicon, which is vastly superior in performance per watt. This may include custom in-house SoCs that are largely CPU and machine learning cores (no graphics cores) and are optimized for specific tasks rather than being general-purpose CPUs like those from Intel or AMD.
Not marketing this special silicon gives Apple a competitive advantage in terms of operational efficiency.
If Apple could disrupt Microsoft in the server market, it would be by reducing their reliance on Microsoft server software (which I do not believe they have widely implemented) or Microsoft cloud services (also not believed to be widely used by Apple).
mobird said: What is to be accomplished, from the attacker's standpoint, by porting numbers out to another carrier?
It is currently unclear whether this is a ransomware attack and whether the threat actors have demanded payment from Mint Mobile, but this is one of the typical M.O.s of this type of cybercriminal.
blastdoor said: I really appreciate the sincerity of Bill's emails. It's nice to see the genuine appreciation of a competitor's capabilities.
Bill Gates has plenty of character flaws and weaknesses, but he was undeniably a ruthless businessman and a pugnacious adversary.
By contrast, his hatchet man Steve Ballmer didn't take Apple seriously, squandered Microsoft's entire mobile presence and let Apple blow past them in terms of market valuation.
Ballmer was the right guy for a short period of time but couldn't successfully step into a larger leadership role; B-schools routinely use Ballmer-era Microsoft as a case study in failed leadership. Today Microsoft has re-emerged as a powerhouse, but only through its enterprise/cloud computing efforts. The current Microsoft CEO can take most of the credit for that, certainly not Ballmer.
It wasn't predictable. That's why Microsoft was caught flat-footed. Remember that Apple's market valuation was decidedly small compared to Microsoft's at the time, a real David vs. Goliath situation.
Time and time again, Jobs surprised. The iPod itself was widely doubted when it debuted in 2001; just do an Internet search for "cmdrtaco ipod" to see what a popular technologist thought of it.
Apple stunned again with the iPhone in 2007, and many predicted failure due to the lack of a physical keyboard. Remember that the RIM BlackBerry was the smartphone gold standard at the time, and Windows Mobile phones were still a significant player.
Apple again caught the industry off guard when it released its own silicon in the form of the A-series SoCs and a few years later left the entire semiconductor industry speechless when the A-series jumped to 64-bit architecture, years before it was expected to show up in a mainstream product.
Apple crushed it again with the iPad and then killed it with Apple Watch, the latter despite long-standing rumors that Apple had been testing "wearables" on its corporate campus for years. Remember that each time, Apple was not first to market with MP3 players, online media stores, smartphones, tablets, smartwatches, or custom ARM silicon.
About the only thing predictable from Apple in the past five years was the Apple Silicon Mac. Savvy industry watchers presumed that Apple had been running macOS on prototype ARM-powered Macs in its labs for years, maybe as early as that first 64-bit A7 SoC. There were hints all along: the deprecation of OpenGL, the end of support for 32-bit apps, and the inclusion of specialized chips like the T2 Security Chip, which was clearly an interim solution to be paired with Intel CPUs until Apple could ship its own SoC with that functionality built in.
One can see where this is headed for the Macs. macOS Monterey is leveraging the Neural Engine in the M1 SoC, the machine learning silicon. My guess is that the M2 and future designs will vastly improve on the Neural Engine's capabilities which will take on tasks that it is better suited for than the CPU cores: image recognition, text recognition, voice recognition, signal processing (both audio and video). As mentioned in the WWDC keynote, more machine learning tasks will be handled on device rather than being sent to Apple's servers.
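To make the on-device angle concrete, here is a minimal sketch of how a developer might run the kind of text recognition described above through Apple's Vision framework (Apple platforms only, macOS 10.15+). Note that apps never target the Neural Engine directly; the OS decides whether the underlying model runs on the ANE, GPU, or CPU cores. The image path in the usage comment is hypothetical.

```swift
import Foundation
import Vision

// Hedged sketch: on-device text recognition via Vision.
// The system schedules the work; on Apple silicon it may land
// on the Neural Engine, but that is transparent to this code.
func recognizeText(at url: URL) throws -> [String] {
    let request = VNRecognizeTextRequest()
    request.recognitionLevel = .accurate   // favor accuracy over speed

    // Run the request against a local image file; nothing is sent
    // to a server.
    let handler = VNImageRequestHandler(url: url, options: [:])
    try handler.perform([request])

    let observations = request.results as? [VNRecognizedTextObservation] ?? []
    // Keep the single best transcription candidate per detected region.
    return observations.compactMap { $0.topCandidates(1).first?.string }
}

// Hypothetical usage:
// let lines = try recognizeText(at: URL(fileURLWithPath: "screenshot.png"))
```

The same pattern applies to the other tasks mentioned (image and barcode recognition use other `VNRequest` subclasses), which is presumably why Apple keeps growing the Neural Engine generation over generation.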
Nvidia's GeForce RTX GPUs already do audio and video processing with their Tensor cores (machine learning hardware); if you have a GeForce 20- or 30-series graphics card, you can use the Nvidia Broadcast software to clean up your audio and video streams.