Apple planning to ditch Intel chips in Macs for its own custom silicon in 2020


Comments

  • Reply 41 of 176
tht Posts: 5,445 member
    My Python 3.6 plotting script execution time:

    2015 rMBP15 w/Intel Core i7-4980HQ (2.8/4.0 GHz): 91.1 sec

    2017 iPad Pro 10.5 w/Apple A10X (2.3 GHz): 91.5 sec

This is a 45 W versus a roughly 10 W power envelope. I used Pythonista on the iPad and Terminal on macOS. I don’t know if the Core i7-4980HQ actually turboed to 4 GHz. Who knows? That’s why you do a lot of testing.
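
A minimal sketch of this kind of timing harness (the plotting workload below is an invented stand-in, not the actual script; numpy and matplotlib ship with Pythonista, so something like this runs on both platforms):

```python
# Minimal sketch of a cross-platform plotting benchmark. The workload is
# hypothetical -- the original script isn't shown -- but it is CPU-bound
# and runs unchanged under Pythonista on iOS and Terminal on macOS.
import time
import numpy as np
import matplotlib
matplotlib.use("Agg")  # render off-screen so GUI overhead doesn't skew timing
import matplotlib.pyplot as plt

def plot_workload(n_series=50, n_points=200_000):
    """Generate and plot many noisy series: a CPU-bound stand-in."""
    x = np.linspace(0.0, 10.0, n_points)
    fig, ax = plt.subplots()
    for i in range(n_series):
        y = np.sin(x * (i + 1)) + np.random.normal(0.0, 0.1, n_points)
        ax.plot(x, y, linewidth=0.5)
    fig.savefig("bench.png", dpi=150)
    plt.close(fig)

start = time.perf_counter()
plot_workload()
print(f"elapsed: {time.perf_counter() - start:.1f} sec")
```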
  • Reply 42 of 176

    The #1 selling Apple computer is the iPhone. Interaction (as referenced in the article) is most important between iPhone, iPad, Apple Watch, Apple TV and HomePod.

In terms of units sold, Macs (which are mostly used by creative professionals and the enterprise) account for about 6% of total Apple devices sold. Moving away from Intel would be stupid unless Apple made a MAJOR leap in processor performance over Intel’s offerings. That leap would be necessary to justify the software upgrade cycle that would necessarily follow the change in processor architecture. I don’t see this happening by 2020 or even 2030.

  • Reply 43 of 176
maltz Posts: 454 member
As a NeXT/Apple alum, I can tell you folks are blatantly ignorant of the meaning of Fat Binary. Fat binaries were NeXTSTEP/OpenStep binaries built so the OS could run natively on different hardware architectures and instruction sets.

That Apple continues shoring up the custom ARM-based CPUs of its own design, and still licenses the IP needed to produce them, has nothing to do with leaving macOS to fend for itself on ARM-only instruction sets.

More importantly, even with decades of x86/PPC/Moto/SPARC expertise, the effort to create OS X took five years to get a limping version out the door, and that started from a platform already native on x86. Rosetta was a compatibility layer on top of it.

The logical solution moving forward is for Apple to license IP from AMD, have AMD build custom ASIC designs of SoC APUs, and use AMD's discrete CPUs/GPUs along with the upcoming Thunderbolt licensing [now royalty-free] to put an Apple-designed Thunderbolt controller on boards compatible with AMD's x86 chipsets, thus freeing Apple from relying solely on Intel.
Before you throw stones, you'd better look at your history and see what else the term was also applied to.

I'm aware of your definition, and the usage you cite. However, there are more.
The term FAT binary is actually patented by NeXT, now owned by Apple. That term is the only use we should ever discuss.
    At the risk of lowering myself to your pedantic level, you cannot patent a term.

Anyway, the term "fat binary" is a generic term -- from my brief Googling, NeXT's official name for such applications was "Multi-Architecture Binaries". Regardless, Apple absolutely (also) used the term "fat binary" to describe applications compiled with both 68k and PPC code during that transition, so there's no reason to deny that usage. During the PPC to Intel transition, Apple used the term "Universal Binary", I suppose to avoid confusion with the earlier 68k/PPC apps.
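
Whatever you call them, the mechanics are the same: a multi-architecture Mach-O file begins with a fat header listing one slice per architecture. Here's a minimal sketch that reads that header (the example path is illustrative; whether a given system binary is actually fat depends on the macOS version):

```python
# Minimal sketch: list the architecture slices in a fat (multi-architecture)
# Mach-O binary. The fat header is stored big-endian: a magic number and a
# count, followed by one fat_arch record per embedded architecture slice.
import struct

FAT_MAGIC = 0xCAFEBABE  # magic for the 32-bit fat header

def list_fat_slices(path):
    with open(path, "rb") as f:
        magic, nfat_arch = struct.unpack(">II", f.read(8))
        if magic != FAT_MAGIC:
            print("not a fat binary (likely single-architecture Mach-O)")
            return
        for _ in range(nfat_arch):
            # struct fat_arch: cputype, cpusubtype, offset, size, align
            cputype, cpusubtype, offset, size, align = struct.unpack(
                ">iiIII", f.read(20))
            print(f"cputype={cputype} cpusubtype={cpusubtype} "
                  f"offset={offset} size={size}")

list_fat_slices("/bin/ls")  # example path; may or may not be fat on your system
```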
  • Reply 44 of 176
tipoo Posts: 1,142 member
    I'm curious what becomes of their dedicated GPU models. Is this just to replace the CPU, or will they take the whole banana with their now fully custom GPU? 
  • Reply 45 of 176
melgross Posts: 33,510 member
As a NeXT/Apple alum, I can tell you folks are blatantly ignorant of the meaning of Fat Binary. Fat binaries were NeXTSTEP/OpenStep binaries built so the OS could run natively on different hardware architectures and instruction sets.

That Apple continues shoring up the custom ARM-based CPUs of its own design, and still licenses the IP needed to produce them, has nothing to do with leaving macOS to fend for itself on ARM-only instruction sets.

More importantly, even with decades of x86/PPC/Moto/SPARC expertise, the effort to create OS X took five years to get a limping version out the door, and that started from a platform already native on x86. Rosetta was a compatibility layer on top of it.

The logical solution moving forward is for Apple to license IP from AMD, have AMD build custom ASIC designs of SoC APUs, and use AMD's discrete CPUs/GPUs along with the upcoming Thunderbolt licensing [now royalty-free] to put an Apple-designed Thunderbolt controller on boards compatible with AMD's x86 chipsets, thus freeing Apple from relying solely on Intel.
    That’s one way. But it still leaves Apple vulnerable to others. If that’s why they want to leave Intel (assuming this rumor is true, as we’ve heard it before), then this is just tying them to AMD. Not in the same way, but still an outside source.

I still think that if Apple wants to do this, and I have no doubt they’re exploring it, the best way is to not be tied to anyone other than ARM, for now. But they can’t just up the performance of their A series, because emulation will kill them. As I’ve been suggesting for the past several years, though, they could add some instructions from x86 to their chips. From what I know, individual instructions aren’t patented or copyrighted. The point is that recent research has shown that 80% of the slowdown caused by emulation comes from a very small number of instructions; a dozen or so are responsible for most of it.

If Apple could do that, and have the chip hand over to those instructions when running x86 software, then most of the slowdown would just disappear. That would allow Apple to have a chip that ran x86 software directly, including the OS. As we saw in the transition from the 680xx chips to the PPC chips, while the new chips at first ran emulated software more slowly, after a couple of years, with newer chips, even those emulated programs ran faster than the 680xx series had, and new software ran much faster. I bought Apple’s 601 card for my Quadra 950 back then. Very interesting results.
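
To make that arithmetic concrete, a toy sketch (the opcode names and percentages below are invented for illustration, not taken from the research mentioned above):

```python
# Toy illustration of the "few hot instructions" claim. The profile numbers
# are hypothetical; the point is Amdahl's-law arithmetic, not real data.
profile = {
    # opcode -> fraction of total emulation slowdown (invented numbers)
    "cmpxchg": 0.25,
    "lock add": 0.20,
    "cpuid": 0.15,
    "rep movsb": 0.12,
    "pushf/popf": 0.08,
    "everything else": 0.20,
}

# Suppose the chip handled just the five hot opcodes natively:
assisted = {"cmpxchg", "lock add", "cpuid", "rep movsb", "pushf/popf"}
removed = sum(cost for op, cost in profile.items() if op in assisted)
print(f"slowdown eliminated by assisting {len(assisted)} opcodes: {removed:.0%}")
# -> 80%: a dozen or fewer instructions can account for most of the cost
```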
  • Reply 46 of 176
melgross Posts: 33,510 member
    macxpress said:
    I've been able to run Macs for work because of the ability to virtualize Windows. Not sure this bodes well for the future of Macs in certain business segments unless emulation performance under these new chips will be acceptable.
    There is a technology called x86 on ARM...perhaps this is your answer. Microsoft looks to be supporting this as well. 
    Except that it requires a rewrite of the software. Not the best way to do it.
  • Reply 47 of 176
MacPro Posts: 19,727 member
    I've been able to run Macs for work because of the ability to virtualize Windows. Not sure this bodes well for the future of Macs in certain business segments unless emulation performance under these new chips will be acceptable.
I've always wondered, if Apple did this, whether there could be a BTO option on higher-end Macs that includes an Intel CPU. But I'm not sure the cost would work out, as Apple's buying power for Intel chips would be greatly diminished.


  • Reply 48 of 176
MacPro Posts: 19,727 member

    Presumably they’ll unify iOS and macOS to a degree that the chipsets powering the entire product line are largely the same?
    I'd tend to agree with Soli on this.
  • Reply 49 of 176
melgross Posts: 33,510 member

    tipoo said:
    creemail said:
This means that ARM has almost caught x86. We can expect the A11X to be significantly better in performance than the A11; I would guess at least 40-50% improved. This sets the stage for the A12 and A12X, then the A13/A13X (2020), by which point Apple may rename the processors. My guess is that by 2020 we should expect at least 6,500-7,500 single-core with a multi-core score of 45,000-50,000. This is bound to happen...

5x in just two years sounds pretty optimistic, even with how impressive they've been. Say they manage another 80% multi-core boost every year: we're more like at ~3x, around 30K multi-core, and that's if it manages very good scaling. And we've more often seen alternating years, one with a big boost, one with a minor boost.

Now, this is all without active cooling so far. So if we're talking about a clamshell or desktop with active cooling on top of their already impressive uArch designs, maybe. I wonder what a Xeon W equivalent A series would look like...
If Apple does do this, I would think that unless they can get a Xeon-equivalent chip at the same time as a less powerful version, they would have to start with the MacBook, possibly, and work up from there over a period of several years. The MacBook uses a couple of versions of Intel’s “m” series, between 4.5 and 6 watts. That would be the place to start.
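
As a sanity check of the growth arithmetic quoted above (assuming a round 10,000 multi-core starting score for the A11 generation):

```python
# Sanity check of the projection math (the starting score is an assumption).
base = 10_000        # assumed A11-class multi-core score
yearly_gain = 0.80   # the hypothetical 80% boost per year
years = 2            # 2018 -> 2020

projected = base * (1 + yearly_gain) ** years
print(f"projected: {projected:,.0f}")      # ~32,400 -> the ~3x / 30K figure
print(f"growth: {projected / base:.1f}x")  # ~3.2x, well short of the 5x guess
```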
  • Reply 50 of 176
melgross Posts: 33,510 member
    6toecat said:
    good to know now, as I can start planning my migration from Apple hardware / Final Cut Pro to Resolve on another platform. 
    Goody for you.
  • Reply 51 of 176
Mike Wuerthele Posts: 6,861 administrator
As a NeXT/Apple alum, I can tell you folks are blatantly ignorant of the meaning of Fat Binary. Fat binaries were NeXTSTEP/OpenStep binaries built so the OS could run natively on different hardware architectures and instruction sets.

That Apple continues shoring up the custom ARM-based CPUs of its own design, and still licenses the IP needed to produce them, has nothing to do with leaving macOS to fend for itself on ARM-only instruction sets.

More importantly, even with decades of x86/PPC/Moto/SPARC expertise, the effort to create OS X took five years to get a limping version out the door, and that started from a platform already native on x86. Rosetta was a compatibility layer on top of it.

The logical solution moving forward is for Apple to license IP from AMD, have AMD build custom ASIC designs of SoC APUs, and use AMD's discrete CPUs/GPUs along with the upcoming Thunderbolt licensing [now royalty-free] to put an Apple-designed Thunderbolt controller on boards compatible with AMD's x86 chipsets, thus freeing Apple from relying solely on Intel.
Before you throw stones, you'd better look at your history and see what else the term was also applied to.

I'm aware of your definition, and the usage you cite. However, there are more.
The term FAT binary is actually patented by NeXT, now owned by Apple. That term is the only use we should ever discuss.
    You don't get to decide that. The term was used by Apple for both the migration to PPC, and the migration to Intel.
  • Reply 52 of 176
cgWerks Posts: 2,952 member
    tipoo said:
Hopefully they can decouple themselves from Intel's blunders and delays and really distinguish themselves that way. Look at everyone pooh-poohing the 16GB limit because Intel doesn't support LPDDR4, while A series chips have for a few years, yet Apple gets the blame for it. And the fully Apple GPU will be just as interesting.
Yes, a bit... but Apple's problems go far beyond Intel not advancing as some might like. I'd be perfectly happy (and would have been for years already) with a computer Apple ***COULD*** easily build using Intel chips currently available.

    While there is some truth to it, it's more an excuse for Apple's failure to execute.
  • Reply 53 of 176
racerhomie3 Posts: 1,264 member
    I've been able to run Macs for work because of the ability to virtualize Windows. Not sure this bodes well for the future of Macs in certain business segments unless emulation performance under these new chips will be acceptable.
    The number who will buy a Mac for $500-900 will more than make up for that.
  • Reply 54 of 176
melgross Posts: 33,510 member
    maltz said:
As a NeXT/Apple alum, I can tell you folks are blatantly ignorant of the meaning of Fat Binary. Fat binaries were NeXTSTEP/OpenStep binaries built so the OS could run natively on different hardware architectures and instruction sets.

That Apple continues shoring up the custom ARM-based CPUs of its own design, and still licenses the IP needed to produce them, has nothing to do with leaving macOS to fend for itself on ARM-only instruction sets.

More importantly, even with decades of x86/PPC/Moto/SPARC expertise, the effort to create OS X took five years to get a limping version out the door, and that started from a platform already native on x86. Rosetta was a compatibility layer on top of it.

The logical solution moving forward is for Apple to license IP from AMD, have AMD build custom ASIC designs of SoC APUs, and use AMD's discrete CPUs/GPUs along with the upcoming Thunderbolt licensing [now royalty-free] to put an Apple-designed Thunderbolt controller on boards compatible with AMD's x86 chipsets, thus freeing Apple from relying solely on Intel.
Before you throw stones, you'd better look at your history and see what else the term was also applied to.

I'm aware of your definition, and the usage you cite. However, there are more.
The term FAT binary is actually patented by NeXT, now owned by Apple. That term is the only use we should ever discuss.
    At the risk of lowering myself to your pedantic level, you cannot patent a term.

Anyway, the term "fat binary" is a generic term -- from my brief Googling, NeXT's official name for such applications was "Multi-Architecture Binaries". Regardless, Apple absolutely (also) used the term "fat binary" to describe applications compiled with both 68k and PPC code during that transition, so there's no reason to deny that usage. During the PPC to Intel transition, Apple used the term "Universal Binary", I suppose to avoid confusion with the earlier 68k/PPC apps.
    No, but you can trademark it.
  • Reply 55 of 176
racerhomie3 Posts: 1,264 member

    The #1 selling Apple computer is the iPhone. Interaction (as referenced in the article) is most important between iPhone, iPad, Apple Watch, Apple TV and HomePod.

In terms of units sold, Macs (which are mostly used by creative professionals and the enterprise) account for about 6% of total Apple devices sold. Moving away from Intel would be stupid unless Apple made a MAJOR leap in processor performance over Intel’s offerings. That leap would be necessary to justify the software upgrade cycle that would necessarily follow the change in processor architecture. I don’t see this happening by 2020 or even 2030.

The pro machines (priced $2,000-$20,000) will use Intel.
The consumer level ($500-$1,500) should use the A series.
  • Reply 56 of 176
The CPU is one thing; the desktop display is another. I don't plan on using large screens or multi-display configurations the mobile way. Multiple apps side by side, processing at the same time, is still the way of working with desktop apps. And a single-screen app is efficient for mobile, not a desktop app UI design.
  • Reply 57 of 176
cgWerks Posts: 2,952 member

    tipoo said:
5x in just two years sounds pretty optimistic, even with how impressive they've been. Say they manage another 80% multi-core boost every year: we're more like at ~3x, around 30K multi-core, and that's if it manages very good scaling. And we've more often seen alternating years, one with a big boost, one with a minor boost.
I think Intel is up against physics, aside from just adding more cores (which requires software advances before the average user can take advantage of them).
    Apple will hit that wall soon enough as well. Then we're down to arguments over architecture (remember, Apple didn't switch to Intel because it was a better architecture).

    Soli said:
    Presumably they’ll unify iOS and macOS to a degree that the chipsets powering the entire product line are largely the same?
    I would expect the chips to be distinct, and that iOS and macOS will never be unified. There will be crossover as we've seen from the start with macOS being stripped down and then rebuilt back up into iOS and efficiencies from iOS being brought into macOS, but I think that sharing—not unification—will continue to be at the core of these discrete OSes.
    I really, really hope you're right! That was Apple's great advantage (vs what Microsoft tried), but I've been getting worried Apple has been wavering on that.

    Basically, I see two scenarios for the Mac:
    1) The Mac has fallen behind because of Apple growth pains and bad decisions. They've recognized this and will now fix it. The future looks bright.

    2) The idea is to move on to the new iOS-thing and 'the future' and just put enough effort into the Mac to stave off chaos. In that case, the future looks bleak.

The data doesn't seem to fit either scenario cleanly, though. This chip thing, while important, is more a behind-the-scenes thing, IMO. The important decisions are going to be made about the platform overall, software direction and quality, and whether they build the right hardware with whatever chips they use. While we might quibble over pricing and such, Intel has some perfectly capable chips right now that Apple hasn't been widely using (cf. the iMac Pro). Or, I'd be giddy for a bigger Mac mini with a quad-core i7, updated tech, and adequate cooling. None of this even requires much 'innovation.' It simply requires a willingness.
  • Reply 58 of 176
Just so you know, Apple does not own ARM, nor does it own the IP for many of the inventions it relies on. ARM is owned by SoftBank, and Apple has to pay license fees to SoftBank. Read the news; judging from some of the ignorant posts here, people may think that Apple owns many of these designs. Well, no.
  • Reply 59 of 176
cgWerks Posts: 2,952 member
    DuhSesame said:
Maybe there’s finally a chance to get rid of the thermal throttling. Intel’s x86 chips always run higher than their TDP rating if you push them to the maximum, which is never friendly for a thinner device.
    The problem is more with the other components. That's a design problem for Apple, not Intel. I'll take a thicker device with adequate cooling.

    netrox said:
It's bound to happen with the incredible progress the ARM processor has made. It's extremely efficient and blazing fast. So it would be nice to have a huge boost in performance once the software is recompiled for ARM.
Is it really, though, given the same work we'd do with our desktops? The fastest seem to match the short-term speeds of Apple's mid-range Macs. I'm not sure I'd call that blazing fast and efficient. They are efficient given what they are designed for, but how much more efficient will they really be in an apples-to-apples comparison?

    cropr said:
The moment we can no longer run an Intel-based Linux server in a VM on a Mac at a decent speed will be the day my company stops using Macs as development machines. This is an absolute showstopper. If we don't have this functionality locally on a development machine, all my developers will move to Intel machines with Linux (e.g. Dell XPS).
    The question is... does Apple really care? If they can replace those sales with a higher number of sales to some other market, possibly not. Apple doesn't seem to be thinking so long-term, big-picture anymore.

    Soli said:
    I have to assume that Intel-based Linux VMs on Macs are statistically insignificant to Apple.
    Possibly statistically insignificant on those pie-charts. However, far from insignificant in terms of big-picture impact. (whether the powers that be are thinking beyond the pie-chart, though, remains to be seen)

    spacekid said:
    Improve security like preventing software not sold through the App Store from running?
    That's one way to do it.
  • Reply 60 of 176
wood1208 Posts: 2,913 member
Apple can easily control its destiny and not be at Intel's processor mercy. Apple can refresh the Mac every year with upgraded performance.