Stop panicking about Apple's rumored switch from Intel to its own chips in the Mac


Comments

  • Reply 81 of 246
    rain22 said:
    jbdragon said:

    Maybe Apple is working on it, but they're not just going to move to it without a solid plan for everything. I think people are worrying a little too much. 
    The Rosetta solution for the Intel switch was kinda brutal.
    In a lot of cases, you had to repurchase all your software again and buy new peripherals for the drivers. 
    It took a few years for software to catch up - some never did.
    Like Keynote for example, version 9 had a lot of features that were lost in the re-write and never came back.
    I think it was what killed off Quark as well.

    It was a very disruptive workflow in general.
    The problem with keeping Pro Macs on Intel is that they will get fewer and fewer resources and less support, to the point where it just makes sense to switch to Windows, like a lot of printing and design houses did with the Intel switch.

    The big question is the Adobe factor. 
    Quark killed Quark. Due to their hubris they refused to update and Adobe ate their lunch. I missed it for about 5 seconds, but honestly it was outdated across the board. 
  • Reply 82 of 246
    knowitall Posts: 1,028, member
    melgross said:
    The idea that old, Intel Macs would keep working, offered as a reason to not be concerned about a shift from Intel chips seems shortsighted to me. It only delays the day when Macs will no longer be viable for those of us with a need for Intel compatibility. 

    I *have to* run Linux and Windows VMs and *want to* use a Mac; it's not the other way around for me and the loss of the ability to do the former would mean my very reluctant move away from Macs. Being able to run Docker containers on my Mac has further underscored my need for Intel compatibility. This is an upsetting thought to me, as I love using macOS as my desktop OS. Nearly every network and software engineer that I know who is a Mac user is in the same boat as I am. 

    It may well be that this will come to pass, or it may be that it's just a rumour. Perhaps (and this is my hope) Apple will use its own CPUs for low-end systems and Intel CPUs in Pro machines. In any case, I think it's a mistake to underestimate the loss to the Apple community of what I think will be an enormous number of developers. Being dismissive of people's concerns is a little heartless. 
    I bet it won't cause an enormous loss of developers, just as the last two shifts didn't, nor any other move that promised to be the death of developers, as the switch to Xcode was heralded to be. And, like I said, the mini and the MacBook are likely the first, as the A-series processor doesn't have any super-heavy lifters at present.

    I'm also pretty sure that you realize that you are not typical of the Mac using public. You are an outlier. That's not a bad thing, mind you, but also not a giant market segment for the company.
    While it seems that people are ignoring what I’ve been saying about this in threads over the years, it isn’t necessary for there to be any incompatibility between x86 software and an A-series SoC, if Apple does what I’ve said they could: add the dozen or so instructions which have been shown to cause 80% of the slowdown when emulating one chip family on another. As far as I know, all of these instructions are open for anyone to use individually. If Apple added them to their SoC, and had an automatic switch to those instructions when x86 software called for them, then no sloppy software virtualization would be required; everything would work as usual. As Apple’s SoCs sped up over the next couple of years, x86 software would run faster. I saw that with my Quadra 950 after Apple released the PPC 601 card for the processor slot.

    For all we know, they’re working on that now. In fact, no one knows what at least 35% of the area on the SoC does; Apple doesn’t talk about more than a few areas. They could be experimenting with that now. Maybe we’ll see a “B” series of SoCs that will do this. Who knows? Otherwise, people who think that Apple will do what Microsoft is doing are wrong, by my thinking. Microsoft requires a code rewrite for software to work on their universal system. Wouldn’t it be better if NO rewrite was required? You bet!
    Note that ‘sloppy software virtualization’ is required for instructions not added to the core. 
    It’s also possible to translate the code ahead of time once and run this translated program at full speed.
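    The ahead-of-time idea can be sketched in miniature. This is a toy model with invented opcode names on both sides (not real x86 or ARM encodings): every source instruction is mapped to target instructions exactly once, before the program runs, so there is no per-instruction interpretation cost at execution time.

```python
# Toy sketch of ahead-of-time (static) binary translation.
# The "x86-like" and "ARM-like" opcode names below are invented
# for illustration; real translators work on binary machine code.

# One complex source op may expand to several simple target ops.
SOURCE_TO_TARGET = {
    "MOVX":   ["MOV"],                    # 1:1 mapping
    "ADDX":   ["ADD"],                    # 1:1 mapping
    "CPXCHG": ["LDXR", "CMP", "STXR"],    # 1:many mapping
}

def translate(source_program):
    """Translate the whole program once, before execution."""
    target_program = []
    for op in source_program:
        target_program.extend(SOURCE_TO_TARGET[op])
    return target_program

# The translation cost is paid once; the result runs natively.
print(translate(["MOVX", "ADDX", "CPXCHG"]))
# ['MOV', 'ADD', 'LDXR', 'CMP', 'STXR']
```

    This is why one-time translation can approach native speed, in contrast to an emulator that re-decodes every instruction each time it executes.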
  • Reply 83 of 246
    melgross Posts: 30,871, member
    jimh2 said:
    The one thing to keep in mind is that prices will never drop because Apple is designing their own chips, so the cost of Intel chips is irrelevant.
    That’s not true. Microprocessor Report estimates Apple’s A series as costing between $34 and $38 when they first come out. Intel’s chips, which often require external support chips as well, are much more expensive. If Apple saves even $50 in chip costs, that translates to an industry-standard 2.5 to 3.5 times multiplier on the product price. So that savings could be $125 to $175. Would Apple drop the price by that much? Maybe, maybe not. But they would likely drop it by something, possibly $100.
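    The arithmetic behind that estimate is easy to check. Note that the $50 savings and the 2.5 to 3.5 times cost-to-price multiplier are the comment's own figures, not verified numbers:

```python
# Rough check of the chip-savings-to-retail-price arithmetic above.
# Inputs are the estimates from the comment, not verified figures.
chip_savings = 50                            # estimated per-unit saving, dollars
multiplier_low, multiplier_high = 2.5, 3.5   # claimed industry cost-to-price multiplier

price_drop_low = chip_savings * multiplier_low
price_drop_high = chip_savings * multiplier_high
print(f"${price_drop_low:.0f} to ${price_drop_high:.0f}")  # $125 to $175
```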
  • Reply 84 of 246
    Soli Posts: 8,193, member
    Soli said:
    . . .
    [image]
    Need to work on your hex!
    Oops! I missed transcribing a digit. Fixed! Thanks.
    edited April 4
  • Reply 85 of 246
    Soli Posts: 8,193, member
    melgross said:
    jimh2 said:
    The one thing to keep in mind is that prices will never drop because Apple is designing their own chips, so the cost of Intel chips is irrelevant.
    That’s not true. Microprocessor Report estimates Apple’s A series as costing between $34 and $38 when they first come out. Intel’s chips, which often require external support chips as well, are much more expensive. If Apple saves even $50 in chip costs, that translates to an industry-standard 2.5 to 3.5 times multiplier on the product price. So that savings could be $125 to $175. Would Apple drop the price by that much? Maybe, maybe not. But they would likely drop it by something, possibly $100.
    Unlike other architecture transitions, this one may be unique because they can offer better entry-level and power-efficient PCs without the high barrier to entry of Intel's CULV chips, like we see in the MacBook and MacBook Air.
  • Reply 86 of 246
    melgross Posts: 30,871, member
    MmmDee said:
    Awaiting the first anti-trust/monopoly lawsuit that Apple has yet to experience. Once Apple owns all the hardware and all the software running on their products, continuing their closed "ecosystem", the legal woes and end-of-product nightmare will begin. What a bad idea... I'm definitely not going down this path of self-destruction; been there, done that. Pity some businesses don't learn from history and are therefore doomed to repeat mistakes. More temporary profit for Apple, fewer choices for consumers.
    No, that’s not true; monopoly doesn’t work that way. Apple would first need to hold a large majority of the computer market. Even adding the iPad in, that wouldn’t come close. Including the iPhone wouldn’t count, because it’s in a different category. If what you said were true, then Google’s search should already have been declared a monopoly, and Android too. Microsoft would have had Office called a monopoly, because it constitutes 95% of all office software.

    Despite the nonsense we get from ignorant Android users, iOS isn’t closed any more than Android is. In fact, if we look at an OS ecosystem properly, and consider that software availability is the major measure of closed vs. open, we’d see that iOS has far more diversity than Android does. Just because several manufacturers make Android phones and are allowed to skin them doesn’t mean it’s more open. Only if Apple had nothing but its own software could it really be called closed.
  • Reply 87 of 246
    knowitall Posts: 1,028, member
    knowitall said:

    But, here's the thing: ARM is mainstream. Every iPhone, every iPad, every Samsung, nearly every smartphone has an ARM chip in it. Xcode is already set up to be the transition tool that developers need, so the friction will be extremely low.
    Mmm...  Does Xcode run on any current ARM devices?  Could it? Should it?
    I don’t know if it currently runs on ARM (it’s highly probable that Apple has done that already), but it’s just a program like any other, so translating it is just a push of a button in Xcode; no biggie.
    Why shouldn’t it be?
    We have a 12" and 10" iPad Pros with kb cases...  I'd like to be able to run Xcode on these.
    I suspect porting Xcode to iOS isn't easy, and 12” is very little screen real estate for such a program.
  • Reply 88 of 246
    melgross Posts: 30,871, member
    Snow Leopard shipped rock solid and stable. The OS has been castrated at the UI level and dumbed down since then.

    Still think that assuming it will be an ARM chipset is a bit premature. They could be doing an Apple version of x86 just like the A series are Apple’s version of ARM. There are any number of reasons to do it. Security, power management and performance tuning for the OS come to mind.
    Snow Leopard absolutely did nothing of the sort. It was stable at 10.6.8, and crappy before 10.6.5. There is this mythical “version X-1 was better” phenomenon going on, with a lot of dismissal of the times when the favorite wasn’t so great.


    It’s the Golden Past effect. When looking back, most things, out of nostalgia, look better than they were.
  • Reply 89 of 246
    Mike Wuerthele Posts: 3,224, administrator
    rain22 said:
    jbdragon said:

    Maybe Apple is working on it, but they're not just going to move to it without a solid plan for everything. I think people are worrying a little too much. 
    The Rosetta solution for the Intel switch was kinda brutal.
    In a lot of cases, you had to repurchase all your software again and buy new peripherals for the drivers. 
    It took a few years for software to catch up - some never did.
    Like Keynote for example, version 9 had a lot of features that were lost in the re-write and never came back.
    I think it was what killed off Quark as well.

    It was a very disruptive workflow in general.
    The problem with keeping Pro Macs on Intel is that they will get fewer and fewer resources and less support, to the point where it just makes sense to switch to Windows, like a lot of printing and design houses did with the Intel switch.

    The big question is the Adobe factor. 
    Quark killed Quark. Due to their hubris they refused to update and Adobe ate their lunch. I missed it for about 5 seconds, but honestly it was outdated across the board. 
    100% this. While Quark issued two updates in the last few years, it's probably too little, too late.
  • Reply 90 of 246
    knowitall Posts: 1,028, member

    I keep wondering if it would make sense for a Mac (and macOS) to support configurations with:
    1. an ARM CPU and an Intel CPU
    2. multiple ARM CPUs
    3. multiple ARM CPUs and an Intel CPU
    Maybe it need not be all or nothing?
    I think it does. Including an Intel CPU defeats the purpose (getting rid of Intel).
  • Reply 91 of 246
    dick applebaum Posts: 12,459, member
    melgross said:

    While it seems that people are ignoring what I’ve been saying about this in threads over the years, it isn’t necessary for there to be any incompatibility between x86 software and an A-series SoC, if Apple does what I’ve said they could: add the dozen or so instructions which have been shown to cause 80% of the slowdown when emulating one chip family on another. As far as I know, all of these instructions are open for anyone to use individually. If Apple added them to their SoC, and had an automatic switch to those instructions when x86 software called for them, then no sloppy software virtualization would be required; everything would work as usual. As Apple’s SoCs sped up over the next couple of years, x86 software would run faster. I saw that with my Quadra 950 after Apple released the PPC 601 card for the processor slot.

    For all we know, they’re working on that now. In fact, no one knows what at least 35% of the area on the SoC does; Apple doesn’t talk about more than a few areas. They could be experimenting with that now. Maybe we’ll see a “B” series of SoCs that will do this. Who knows? Otherwise, people who think that Apple will do what Microsoft is doing are wrong, by my thinking. Microsoft requires a code rewrite for software to work on their universal system. Wouldn’t it be better if NO rewrite was required? You bet!
    A while back I did quite a bit of surfing about the x86 chip instruction set...  Long story short, the hardware translates CISC to RISC for execution.

    Could Apple benefit from reverse-engineering or licensing that capability (it seems Apple has some leverage)?

    Or would the dozen or so instructions you mention be good enough?
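    That hardware translation step can be modeled in miniature. Here the instruction strings and micro-ops are invented for illustration; real x86 front ends decode binary machine code, not text:

```python
# Toy model of a CISC front end decoding complex instructions into
# RISC-like micro-ops at fetch time, roughly as modern x86 cores do.
# Instruction names and micro-ops below are invented for illustration.
MICROCODE = {
    "ADD reg, [mem]": ["LOAD tmp, [mem]", "ADD reg, tmp"],
    "INC [mem]":      ["LOAD tmp, [mem]", "ADD tmp, 1", "STORE [mem], tmp"],
}

def decode(instruction):
    """Expand a complex instruction; pass simple ones through unchanged."""
    return MICROCODE.get(instruction, [instruction])

print(decode("INC [mem]"))       # a memory increment becomes three micro-ops
print(decode("MOV reg1, reg2"))  # already simple: passes through as-is
```

    The point of the sketch is that the CISC surface is a front-end translation layer; the execution core underneath is already RISC-like.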
  • Reply 92 of 246
    roake Posts: 585, member
    I used to think that I would hate this move, as I use Parallels to run Windows and felt that a change to ARM would cripple this functionality.

    Now, I realize that I keep Windows around “just in case,” but never actually use it for anything.  If anything, the Windows partition is just a brick wasting space on the storage drive.

    As Microsoft shifts focus away from their local operating system and toward cloud computing, I think they are becoming far less rabid about controlling the local OS.  I believe this means that we will see increased support for macOS from Microsoft, and will likely see Windows compatibility in some form when Apple switches to ARM.

    Short version, I’m excited about the potential move to ARM whether or not Microsoft comes along for the ride.
  • Reply 93 of 246
    Soli Posts: 8,193, member
    knowitall said:
    I keep wondering if it would make sense for a Mac (and macOS) to support configurations with:
    1. an ARM CPU and an Intel CPU
    2. multiple ARM CPUs
    3. multiple ARM CPUs and an Intel CPU
    Maybe it need not be all or nothing?
    I think it does. Including an Intel CPU defeats the purpose (getting rid of Intel).
    Why assume that Apple would be getting rid of Intel just because they wanted to use an ARM-based Mac for, say, a new MacBook Air that was basically the 12" MacBook but running an Apple-designed chip? Do you really think there's an ARM equivalent that will work for the Mac Pro? I don't see the Pro line being affected by this until most people are instead bitching that Apple isn't moving fast enough to switch their high-end machines to ARM.
  • Reply 94 of 246
    dick applebaum Posts: 12,459, member
    Soli said:
    Soli said:
    . . .
    [image]
    Need to work on your hex!
    Oops! I missed transcribing a digit. Fixed! Thanks.
    Carry On!
  • Reply 95 of 246
    melgross Posts: 30,871, member
    knowitall said:
    melgross said:
    The idea that old, Intel Macs would keep working, offered as a reason to not be concerned about a shift from Intel chips seems shortsighted to me. It only delays the day when Macs will no longer be viable for those of us with a need for Intel compatibility. 

    I *have to* run Linux and Windows VMs and *want to* use a Mac; it's not the other way around for me and the loss of the ability to do the former would mean my very reluctant move away from Macs. Being able to run Docker containers on my Mac has further underscored my need for Intel compatibility. This is an upsetting thought to me, as I love using macOS as my desktop OS. Nearly every network and software engineer that I know who is a Mac user is in the same boat as I am. 

    It may well be that this will come to pass, or it may be that it's just a rumour. Perhaps (and this is my hope) Apple will use its own CPUs for low-end systems and Intel CPUs in Pro machines. In any case, I think it's a mistake to underestimate the loss to the Apple community of what I think will be an enormous number of developers. Being dismissive of people's concerns is a little heartless. 
    I bet it won't cause an enormous loss of developers, just as the last two shifts didn't, nor any other move that promised to be the death of developers, as the switch to Xcode was heralded to be. And, like I said, the mini and the MacBook are likely the first, as the A-series processor doesn't have any super-heavy lifters at present.

    I'm also pretty sure that you realize that you are not typical of the Mac using public. You are an outlier. That's not a bad thing, mind you, but also not a giant market segment for the company.
    While it seems that people are ignoring what I’ve been saying about this in threads over the years, it isn’t necessary for there to be any incompatibility between x86 software and an A-series SoC, if Apple does what I’ve said they could: add the dozen or so instructions which have been shown to cause 80% of the slowdown when emulating one chip family on another. As far as I know, all of these instructions are open for anyone to use individually. If Apple added them to their SoC, and had an automatic switch to those instructions when x86 software called for them, then no sloppy software virtualization would be required; everything would work as usual. As Apple’s SoCs sped up over the next couple of years, x86 software would run faster. I saw that with my Quadra 950 after Apple released the PPC 601 card for the processor slot.

    For all we know, they’re working on that now. In fact, no one knows what at least 35% of the area on the SoC does; Apple doesn’t talk about more than a few areas. They could be experimenting with that now. Maybe we’ll see a “B” series of SoCs that will do this. Who knows? Otherwise, people who think that Apple will do what Microsoft is doing are wrong, by my thinking. Microsoft requires a code rewrite for software to work on their universal system. Wouldn’t it be better if NO rewrite was required? You bet!
    Note that ‘sloppy software virtualization’ is required for instructions not added to the core. 
    It’s also possible to translate the code ahead of time once and run this translated program at full speed.
    I call virtualization sloppy because it is very inefficient when applied across chip families; it’s efficient when applied to a different OS within the same chip family. You were either not around when Virtual PC and earlier virtualization software existed, or you didn’t use it, but everything ran 70-80% slower, even though, for the first few years, the PPC was significantly faster than x86.

    There are enough instructions in x86 and ARM that are either the same, or similar enough, not to require virtualization. But the instructions that are different enough have been found to cause most of the slowdown. Those are hardware instructions, and adding them to the SoC would solve a lot of problems.
  • Reply 96 of 246
    dick applebaum Posts: 12,459, member
    knowitall said:
    knowitall said:

    But, here's the thing: ARM is mainstream. Every iPhone, every iPad, every Samsung, nearly every smartphone has an ARM chip in it. Xcode is already set up to be the transition tool that developers need, so the friction will be extremely low.
    Mmm...  Does Xcode run on any current ARM devices?  Could it? Should it?
    I don’t know if it currently runs on ARM (it’s highly probable that Apple has done that already), but it’s just a program like any other, so translating it is just a push of a button in Xcode; no biggie.
    Why shouldn’t it be?
    We have a 12" and 10" iPad Pros with kb cases...  I'd like to be able to run Xcode on these.
    I suspect porting Xcode to iOS isn't easy, and 12” is very little screen real estate for such a program.
    There are MacBooks with the same size screen.
  • Reply 97 of 246
    melgross Posts: 30,871, member

    melgross said:

    While it seems that people are ignoring what I’ve been saying about this in threads over the years, it isn’t necessary for there to be any incompatibility between x86 software and an A-series SoC, if Apple does what I’ve said they could: add the dozen or so instructions which have been shown to cause 80% of the slowdown when emulating one chip family on another. As far as I know, all of these instructions are open for anyone to use individually. If Apple added them to their SoC, and had an automatic switch to those instructions when x86 software called for them, then no sloppy software virtualization would be required; everything would work as usual. As Apple’s SoCs sped up over the next couple of years, x86 software would run faster. I saw that with my Quadra 950 after Apple released the PPC 601 card for the processor slot.

    For all we know, they’re working on that now. In fact, no one knows what at least 35% of the area on the SoC does; Apple doesn’t talk about more than a few areas. They could be experimenting with that now. Maybe we’ll see a “B” series of SoCs that will do this. Who knows? Otherwise, people who think that Apple will do what Microsoft is doing are wrong, by my thinking. Microsoft requires a code rewrite for software to work on their universal system. Wouldn’t it be better if NO rewrite was required? You bet!
    A while back I did quite a bit of surfing about the x86 chip instruction set...  Long story short, the hardware translates CISC to RISC for execution.

    Could Apple benefit from reverse-engineering or licensing that capability (it seems Apple has some leverage)?

    Or would the dozen or so instructions you mention be good enough?
    Well, I suppose Apple would begin at the low end, replacing the slow, low-power M series. That would likely be easy, if they do what I’ve said; Apple’s A11 already surpasses it in direct processor testing. The next year, Apple could add more. As the ARM side became faster, the slowdowns caused by the numerous other instructions would be less and less of a problem, since, all in all, they would constitute a small portion, if that. In addition, with both instruction sets available, developers might very well, in their quest for speed, gradually enable their software to use native ARM instructions where they exist in parallel.
  • Reply 98 of 246
    wizard69 Posts: 12,536, member
    nunzy said:
    Will this be a step towards uniting iOS and OSX?
    Let's hope not! If you look at these two as tools, they are about as different as a lathe and a table saw. Both iOS and macOS are tools that allow users to get work done in different ways.

    While I can see both OSes adopting features of the other (it has actually been happening for some time), I don't see an integrated OS being a huge win for either user base. Rather, an integrated OS would be more cumbersome for both.

    Instead, I see Apple coming up with an API that makes supporting users on both systems much easier. Frankly, there is much that Apple could do to integrate the user's machines while keeping them distinct. Some of this is already done through iCloud, but that is a bit of a joke.

    In any event, interesting times ahead. Maybe in time I won't be pissed with Apple and their Mac division.
  • Reply 99 of 246
    wizard69 Posts: 12,536, member
    Oztotl said:
    First concern is the ability to run virtual Linux and Windows OSes on the new hardware. This is key to needing only a souped-up MacBook Pro to support multiple platforms and clients.
    You do realize that Linux has been running on ARM for ages now?    
  • Reply 100 of 246
    Soli Posts: 8,193, member
    knowitall said:
    I suspect porting Xcode to iOS isn't easy, 12” is very little screen estate for such a program.
    There are MacBooks with the same size screen.
    Running Xcode on a MacBook is only an option because it's running macOS, not because it's an ideal display size.