Dan_Dilger

About

Username
Dan_Dilger
Joined
Visits
54
Last Active
Roles
member
Points
3,480
Badges
2
Posts
1,583
  • What the Apple Silicon M1 means for the future of Apple's Macs

    JWSC said:
    It will be interesting to see what kind of bill of material reduction Apple is seeing with so many circuit card components being replaced by one single chip.
    It sure would be interesting to see this, but nobody outside Apple's innermost circle has any real numbers. The BOM that will inevitably get "estimated" will be total conjecture. Previous iPhone BOMs threw out figures like "$15 - ARM chip" that were total hogwash.

    While Apple is certainly saving money over using multiple chips, or paying Intel and AMD a premium for their IP, I'm not sure that Apple is immediately saving huge sums building some slice of its own processors in this first batch.

    Apple has supported custom SoC development for iPads by selling massive volumes of them, while sharing a lot of the expense with even higher volumes of iPhones--including sharing most of the work to write/optimize iOS. For the M1, Apple had to do a LOT of very custom work unique to the Mac and macOS, which it will only initially ship across its entry notebooks and mini, perhaps half (?) of its total Mac shipments. Even if they might sell at a higher margin, the company sells a lot fewer Macs than iOS devices. That suggests it will take longer to amortize the initial costs of developing M1 and all the work in Big Sur and elsewhere.
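    To make the volume point concrete, here's a minimal sketch of the amortization math in Swift. Every figure below is a hypothetical placeholder chosen purely for illustration, not an actual Apple number; the shape of the arithmetic is the point: the same fixed design cost nearly vanishes per unit at iPhone volumes but stays significant at Mac volumes.

    // Illustrative amortization of a one-time silicon design (NRE) cost.
    // All numbers are hypothetical placeholders, not actual Apple figures.
    let nreCost = 1_000_000_000.0          // assumed one-time design cost, USD
    let phoneScaleUnits = 200_000_000.0    // assumed annual iPhone-class volume
    let macScaleUnits = 10_000_000.0       // assumed annual M1-Mac-class volume

    // Per-unit share of the fixed cost at each volume.
    let perUnitPhone = nreCost / phoneScaleUnits   // $5 per device
    let perUnitMac = nreCost / macScaleUnits       // $100 per device

    print("Per-unit NRE at iPhone scale: $\(perUnitPhone)")
    print("Per-unit NRE at Mac scale:    $\(perUnitMac)")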

    However, that also means that as it expands to deliver an "M1X" for higher-end MBPs and iMacs, it will have done much of the foundational work already. Each new M generation should add to the cost savings and perhaps erase more of the expensive third-party components in its Mac lineup (higher-end Intel CPUs and AMD's discrete GPUs).

    So while M silicon should help drive down Apple's costs over time, I don't think it is making vastly higher margins on these first models. And it can certainly afford that, of course. It's also worth pointing out that very few other companies could assume the risk of taking on such a massive project with the hope of it paying off, and at the risk of Intel or AMD pulling ahead and erasing the value of that work. Just assembling the silicon talent to start on M1 effectively took 12 years of making over $1 trillion on iOS shipments.

    That's why the reports announcing that "Microsoft also has a custom ARM chip!" were so grossly misleading.
  • How Apple Silicon Macs can supercharge computing in the 2020s

    The article gets many things wrong.  The transition from 680x0 to PowerPC was a very good move.  The article implies that the switch to PowerPC was a bad move because no one else in the industry used PowerPC chips.  That had nothing to do with it.  The 680x0 architecture was going nowhere.  The 68060 to replace the 68040 was not much faster and required a lot of re-writes to take advantage of it.  The PowerPC, on the other hand, offered huge performance boosts that were much faster than Intel at the time.  Remember the famous snail ad?  That gave Apple a big boost, especially with the G3, G4, and G5.

    The shift to Intel had nothing to do with compatibility.  The PowerPC reached its limit.  The G5 ran too hot for any type of portable use, and it would actually run slower than a G4, if they managed to shoehorn it into a PowerBook chassis.  IBM also could not produce any faster G5 chips for the desktop.  Intel, on the other hand, had the performance per watt and that is what Apple was looking for.  The Core Duo chips were far superior, and made the MacBook Pro run 5x faster than the G4.  Remember, Apple could not make the products they wanted to make with the PowerPC roadmap.  Also, when Apple acquired NeXT, OpenStep was already x86 native.  All versions of OS X were 100% x86 native behind closed doors.  Apple knew the PowerPC was reaching its limit and had been planning for Intel years before the switch took place.  Boot Camp and running Windows natively was just an added bonus.

    Running Windows on the Mac, especially as a VM for developing software, is still quite popular, so that might be a minor loss. But Apple shifting to their own processors allows them to release new hardware on their schedule, and not be dependent on Intel. Apple has done very well with the Intel Macs, and it is amazing that the Intel Macs have outlasted both PowerPC and 680x0 Macs in longevity, at 14 years.
    Of course the transition to PPC was necessary, well intentioned, and the best option at the time. Apple had few other viable alternatives. As I wrote earlier, Apple had attempted to develop its own silicon in the 80s and failed. It also worked with Acorn to develop a mobile chip (for Newton) that wouldn't have been powerful enough for the Mac. PPC was the right decision.

    The comment that "the difficulty of that transition and its unexpected result might suggest that in hindsight, it was ultimately a mistake to have attempted such a complex and risky task" isn't saying it was a mistake; it's an acknowledgment that in hindsight it could appear to have been one, because it didn't work out as expected. It then notes that despite this, all the work that delivered the PowerPC transition was later applied to the Intel transition. And of course, it also informed the work to develop iOS on ARM.

    As others have noted, NeXT's work in parallel to move its NeXTSTEP OS and tools to PPC also contributed to the knowledge and experience that made its way to Apple as it completed its PPC transition and then moved to Intel (leveraging NeXT's work in x86; Apple's own "StarTrek" x86 work hadn't resulted in a shippable product). 

    By 2005, PowerPC hadn't really "reached a limit." G5 was fast and achieved a clean, shipping 64-bit architecture well in advance of Intel. The problem wasn't architectural. IBM could have kept going, and other PPC partners could have done the work to develop a more mobile-friendly, power-efficient chip for notebooks. The real issue was that there simply weren't the economies of scale or shipping volumes to justify either of those efforts.

    Intel meanwhile had acquired its new Core x86 architecture from research and development centered in Israel. Its own Pentium 4 had indeed reached a dead end in performance per watt. When Apple saw what Intel had with Core, it realized it could get a CPU for its MacBooks and eventually gain chips to replace the G5, even if it required a digression back into 32-bit CPUs for a period of time. It had few other options.

    Intel also wanted to get its x86 chips into iPad, but by that point Apple realized it could deliver its own customized ARM SoC and have one unified processor architecture for all of its iOS devices, and eventually build that into silicon it could use to power Macs. When did it figure that out? Probably not in 2010. But the radical ambition that drove A-series chips each year was justified by both iPad and iPhone sales, and eventually presented itself as a viable alternative to Intel. When I wrote this last year, commenters all complained that I was nuts and way too optimistic. Yet here we are.
  • How Apple Silicon Macs can supercharge computing in the 2020s

    avon b7 said:

    HarmonyOS is real. It shipped last year. Your claim was incorrect. 

    A 14-year-old boy can "ship" a Linux distro. What Huawei claimed was that it had a replacement for Android that was better than Android; and that it would prefer to use it, but couldn't for some reason; and that at any moment it could flip a switch and send out phones with its own secretly finished OS on them, and they would sell. These were all blatant lies. In a few minutes I could compile an article full of outrageous lies Huawei has prattled off that the tech media has lapped up off the floor like dogs returning to vomit. Nobody ever calls them out on their lies. They're a bullshit company run by frauds financed by the PRC.

    The platform is actually extremely sophisticated (hence the 13,000+ APIs, 1,000+ software modules, and the planned ability to scale across an all-scenario setting).

    It is wholly ironic that you bring up routers and TVs as this was given as an example of how HarmonyOS could work just a few weeks ago! This isn't pie in the sky! The head of Huawei's software operations gave the following example on stage. 

    It was of a home router coming under attack, detecting the attack and using the resources of the TV (its NPU specifically) to bolster its resources and thwart the attack. All in real time and using AI. And over an ultrafast wireless connection. 

    What an insane thing to say. A router protected from attack by the PRC! Who do you think attacks routers? The PRC. 

    [Irrelevant chatter]


  • How Apple Silicon Macs can supercharge computing in the 2020s


    cloudguy said:
    Good grief. The Intel Core i3 chips that Apple puts in the entry-level MacBook Air cost less to buy from Intel than it will cost TSMC to make the A14 chip. And the chip cost is only a fraction of the cost of the device. For example, Qualcomm charged Google only $50 for the Snapdragon 765G that is in the $700 Pixel 5. I know that there are rumors that Apple will sell the ARM laptops starting at $799, but only because they want to sell more of them. The tradeoff is that Apple will have lower margins in return for that increased market share. That will make ARM-based Macs the equivalent of the iPhone SE 2020, for example.
    This is completely false. Every word is dripping with ignorance.
  • How Apple Silicon Macs can supercharge computing in the 2020s

    There are two really big announcements that Apple will have to make in the next year that will shape the future of both Apple and the entire computer industry:
    1. A desktop scale CPU. Thus far all of the CPUs Apple has made have been targeted at mobile devices. They are limited by the power they use and the heat they generate. A desktop scale CPU can draw 100 watts or more (280 for a Threadripper). How will these CPUs compare with ones from Intel and AMD given that Apple's current mobile processors compete well with Intel's laptop processors?
    2. A discrete GPU. Apple's current GPUs are built into the processor. They are great for playing games on mobile devices but they are at least ten times slower than current discrete GPUs from AMD and NVIDIA. For ray tracing they are about a hundred times slower than the hardware-based ray tracing in the current crop of GPUs. Will Apple integrate AMD GPUs into the Apple Silicon iMac or will they announce their own discrete GPU?
    Apple has already detailed why putting CPU and GPU cores on the same SoC using a shared memory architecture is an advantage, not a problem. 

    The fact that historical "integrated GPUs" from CPU vendors like Intel have been lesser performers is not an intrinsic consequence of the GPU sharing a die with the CPU rather than sitting on its own chip.

    Any time you have separate chips linked by an interconnect, you're going to have more of a bottleneck between them than if you put both on the same die with direct access to the same shared data.
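    As a minimal sketch of what that shared-memory model looks like in practice (assuming Metal on an Apple Silicon Mac; this is an illustration, not a benchmark), a buffer allocated with .storageModeShared is a single allocation visible to both CPU and GPU, with no staging copy across an interconnect:

    import Metal

    // Sketch: on Apple Silicon, a .storageModeShared buffer lives in the
    // single unified memory pool, so CPU writes are visible to the GPU
    // without a PCIe transfer or an explicit blit to dedicated VRAM.
    guard let device = MTLCreateSystemDefaultDevice() else {
        fatalError("No Metal device available")
    }

    var input: [Float] = [1, 2, 3, 4]
    let buffer = device.makeBuffer(bytes: &input,
                                   length: input.count * MemoryLayout<Float>.stride,
                                   options: .storageModeShared)!

    // The CPU can mutate the very allocation the GPU will read. On a
    // discrete-GPU system, the equivalent update would typically require
    // copying the buffer across the PCIe bus before GPU work could see it.
    let values = buffer.contents().bindMemory(to: Float.self, capacity: input.count)
    values[0] = 42  // visible to subsequent GPU command buffers

    That hop across the bus is exactly the bottleneck described above, and avoiding it is the tradeoff Apple chose when it put CPU and GPU on one die.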