Why Apple's move to an ARM Mac is going to be a bumpy road for some


Comments

  • Reply 121 of 162
Xed Posts: 2,785 member
    wizard69 said:
    tht said:
    jdb8167 said:
This has been done before with the Intel transition. The Rosetta emulator for going from PowerPC to Intel x86 was quite effective. The problem with the ARM transition is that the ARM CPU isn't likely to be as much faster than the Intel CPU as the Intel CPU was than the PowerPC CPU. So any slowdowns will seem like very poor performance.
    The rumored Apple custom CPU for Mac is going to be fabbed on TSMC 5 nm. That's a 70% higher transistor density than the transistor densities afforded by Intel 10 nm or TSMC 7 nm, which AMD uses. And, Intel may still be fabbing desktop processors on 14 nm in the fall. That really boggles the mind and Apple at minimum should be jumping to AMD or their own custom processor ASAP.

TSMC 5 nm has a 3x to 4x density advantage over Intel 14 nm. Just as AMD's Zen 2 on TSMC 7 nm is crushing Intel 14 nm processors in perf/watt/$, Apple is going to crush AMD and Intel if they are fabbing these processors on TSMC 5 nm while AMD is on 7 nm and Intel is using whatever they are using by then.
Actually, AMD is in the loop as far as 5nm goes! Even with the 7nm node, AMD is eating the hell out of Intel.

The only real problem with 5nm is heat, which will require more care in SoC design. Otherwise 5nm is looking to be HUGE. (Couldn't resist.) In fact, if AMD gets there next year they will likely bury Intel, if Intel can't get onto a new node.

In any event, for Apple 5nm will mean a massive performance bump or a huge power savings. Likely they will split what 5nm offers down the middle to reduce power and increase performance. This would mean an A14 that could reach mid- to high-end laptop performance. That means better cooling than an iPhone, but still amazing.

I'm really excited about what the A14 will be! They could use all of that die space for far more functionality, or instead ship a very small chip for phones. It is just fascinating to think about the direction Apple will take.
Why keep assuming that it will be an A-series chip in the Mac? Just like with the other half-dozen or so chips Apple designs in-house, it seems very likely that this chip will be designed for the Mac, not simply a beefed-up iPhone chip.
    cgWerks
  • Reply 122 of 162
dewme Posts: 5,640 member
    Just to be clear, I'm working for a startup that is building an ARM server designed to run at 20% of the power of a traditional architecture server.

I've actually seen this kind of porting before, and when you have a well-architected OS with a disciplined developer base, it isn't so difficult. The example I think of is an OS called OpenVMS. It went from a VAX chip, to an Alpha chip, to an Itanium chip, and now it can run on an Intel x86 chip. Indeed, all of them can run in the same cluster at the same time.

    If you want to be more "up to date", think of all of the chip architectures that Linux can run on. (I joke about up to date, as UNIX is actually older than OpenVMS).

So, how? Because in a well-architected OS you make sure that all the processor-specific code is encapsulated in as few places as possible, and you tell developers to use system calls and not write directly for the processor. If you do that, often just a recompile will work and away your software goes! No problem.
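To make that concrete, here is a minimal sketch (my own illustration, not from the post) of "use system calls, don't write for the processor" in C. The sysconf() call is a portable OS interface (a common POSIX extension), so the same source recompiles unchanged on x86 or ARM:

    /* Minimal sketch: query the OS through a portable interface instead
     * of the processor directly; the same source builds anywhere. */
    #include <stdio.h>
    #include <unistd.h>

    int main(void) {
        /* Portable: sysconf() works on x86, ARM, POWER, ... */
        long cores = sysconf(_SC_NPROCESSORS_ONLN);
        printf("online cores: %ld\n", cores);

        /* The non-portable alternative would be issuing a raw CPUID
         * instruction here, which ties the code to x86 forever. */
        return 0;
    }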

Another interesting thing to note is that a great deal of software is now being developed in translated languages. That is, the software isn't compiled; it is translated in real time, and it is designed/built in a way that is portable. When you have that, it is only the underlying libraries that have to be modified to suit the processor, and even then it is usually just a few tweaks. This work is done by whoever provides the libraries, not the app developer.

Apple has done it before, and they can do it again. I believe it'll be less of an issue from Intel to ARM than it was from PowerPC to Intel.

As for Microsoft having developed Windows for ARM: yes, they've done that. However, most of the apps still run in 32-bit mode, and they have a translator that converts Intel code to ARM. They got that code when they did Windows NT; it was originally done to enable Windows apps to run on Alpha and was developed by Digital Equipment Corporation (which became part of Compaq, which became part of HP).

    Of course well-architected software, and especially operating system software, always wins the day if (or when) the time comes to harvest the quality attributes that drove the design of the architecture. The problem is that the folks who are typically paying for the design and construction of the architecture very rarely care to spend money today for what it takes to set themselves up for the future harvesting of benefits. This short-term investment strategy takes a toll in many cases with dead-end or duct tape and baling wire architectures. However, it's by no means universally bad, especially if you can overcome the survivorship bias that tends to forget the instances where software has been unnecessarily complexified or over-architected but never survived to harvest the long term benefits of the large upfront investment. It would be informative to see why seemingly good architectures did not survive.

There is a lot of truth in the notion that most software architectural challenges can be solved simply by adding yet another layer of abstraction ... to which I would add the clarifier "as long as the hardware architecture and performance can keep up." But to clarify one point you made about "translator" implementations... the prevalent architectural abstraction pattern over the past 20 or so years isn't translation, it's just-in-time compilation (JIT). And yes, the source code still gets compiled, but not to processor-specific machine code; rather, to an intermediate format (e.g., Java bytecode, .NET CIL) that is then compiled (JIT-ed) to processor-specific machine code at run time by the virtual machine or runtime engine that is an abstraction on top of the native machine/processor. After the initial JIT phase at application start-up, the JIT-ed code runs natively, not translated or emulated, on the target processor. Portability across processor architectures is limited only by having the required virtual machine, runtime engine, and libraries for the target processor. But again, this still requires that applications that want to take advantage of this portability buy into the abstractions provided by the runtime model rather than going straight to the native processor. That's the gotcha.
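As a toy illustration of that intermediate-format idea (my sketch, not the poster's, and a plain interpreter rather than a true JIT for brevity): the "program" below is portable bytecode, and only the small engine underneath is tied to the host processor. A real VM would compile these opcodes to native machine code at start-up instead of dispatching them in a loop.

    /* Toy sketch: the program ships as portable bytecode; only this
     * execution engine is processor-specific. A real VM would JIT the
     * opcodes to native machine code rather than interpret them. */
    #include <stdio.h>

    enum { OP_PUSH, OP_ADD, OP_MUL, OP_PRINT, OP_HALT };

    static void run(const int *code) {
        int stack[64], sp = 0;
        for (int pc = 0; ; pc++) {
            switch (code[pc]) {
            case OP_PUSH:  stack[sp++] = code[++pc]; break;
            case OP_ADD:   sp--; stack[sp - 1] += stack[sp]; break;
            case OP_MUL:   sp--; stack[sp - 1] *= stack[sp]; break;
            case OP_PRINT: printf("%d\n", stack[sp - 1]); break;
            case OP_HALT:  return;
            }
        }
    }

    int main(void) {
        /* (2 + 3) * 4 -- the same bytecode runs unchanged on x86 or ARM. */
        const int program[] = { OP_PUSH, 2, OP_PUSH, 3, OP_ADD,
                                OP_PUSH, 4, OP_MUL, OP_PRINT, OP_HALT };
        run(program);
        return 0;
    }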

    Everything is possible in software with the right amount of time, money, and accommodations in the hardware. Better yet, all software architectures are perfect if they don't have to accommodate one "little" thing: Legacy. If we could simply get everyone to dump all of their legacy crap into the Grand Canyon, write off decades of investment, ignore all sunk costs, and take a very long vacation from their businesses while we rearchitect everything with a perfect clean slate design we'd all be happy. Maybe they can tend to their herd of wild unicorns while we perfect the design and APIs on the best abstraction layer the world has ever seen. 

FYI: the initial prototypes of Windows NT targeted the Intel i860 RISC processor.
    cgWerks
  • Reply 123 of 162
dysamoria Posts: 3,430 member
I had a MacBook Air about 8 years ago, and the W key was the hottest key on the keyboard. W was, and remains today, the main key you have to press to play most video games (as it means "move forward"). I suspect the Air's Intel CPU was under the W key. I've abandoned laptops until this problem is permanently solved, and this ARM move might do the trick.
    W? Why not arrow keys? Mouse and keyboard?
  • Reply 124 of 162
cgWerks Posts: 2,952 member
    Xed said:
... but I bet by the time this comes to pass it won't even be an issue for all but a handful of pertinacious people on this forum.
Maybe among home users, but business users (who can use Macs) often have to use a VM to run certain apps. And independent professionals also often have to use various Windows apps or utilities, even if that isn't their overall workflow. Sure, they could buy a separate PC, I guess. But having that capability on board sure is nice.

    jdb8167 said:
    I wonder what happened to all the people who said the same thing during the PPC->Intel transition. 
    Yeah, but there it was a benefit (mostly*) in a couple of ways. It made the jump architecturally to multiple-cores and lower power usage. And, it gained hardware compatibility with 'the rest of the world.' (*with an initial loss of performance)

    This transition (assuming it happens) makes some hardware gains, but we end up going the opposite direction in terms of compatibility. The question is whether the tradeoffs are worth the losses. And, what those losses will entail. If we lose a huge chunk of software, utilities, etc. that won't be good for the platform.

    jdiamond said:
For me, all the pain already happened with the move to Catalina. I already lost all the software that isn't being actively developed, which for me was most of it. (And this wasn't old software either -- dang that 2006 decision to have one Mac made with a 32-bit x86.) And actively developed software shouldn't have much difficulty hitting the ARM button in Xcode. So AFAIK, Catalina is already forcing 99% of the transition. And I'm sure Apple is thinking they can plug software gaps with iOS apps. ...
    Hmm, I wouldn't lose many apps by moving to Catalina, but it has just been too much of a mess to really consider at this point. I'm hoping that stuff gets fixed. Yes, one would hope the active software packages would hit that button, but I highly doubt it is that simple. It is the 'plug the gaps with iOS apps' that has me worried, as a lot of developers might just try and take that route.
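For what it's worth, a hypothetical example (my own, not from any specific app) of why "hit that button" isn't always that simple: any code path written against x86 vector intrinsics won't compile for arm64 until someone hand-ports it, e.g. to NEON:

    /* Hypothetical illustration: SSE intrinsics have no arm64 equivalent,
     * so this file blocks a clean recompile until the #elif branch is
     * written and verified by hand. */
    #if defined(__x86_64__)
    #include <immintrin.h>  /* x86 SSE */
    void add4(float *dst, const float *a, const float *b) {
        _mm_storeu_ps(dst, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
    }
    #elif defined(__aarch64__)
    #include <arm_neon.h>   /* the NEON rewrite the ARM port would need */
    void add4(float *dst, const float *a, const float *b) {
        vst1q_f32(dst, vaddq_f32(vld1q_f32(a), vld1q_f32(b)));
    }
    #else
    #error "no vector implementation for this architecture"
    #endif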

Too many MacOS apps have already lost a lot of their Mac-magic, as more modern UI trends are mostly garbage. But UI trends on iOS are mostly dumpster fires by comparison. It isn't that the hardware lacks power, or even that iOS isn't up to the task (with a few exceptions); it's that the apps are inadequate, the workflows inefficient, and the UIs often suck.

    The thing is, though, I don't think that problem is going to go away on any platform. So if not, we're all just hanging on everywhere to the good stuff that remains.
  • Reply 125 of 162
mdriftmeyer Posts: 7,503 member
Xed said:
Why keep assuming that it will be an A-series chip in the Mac? Just like with the other half-dozen or so chips Apple designs in-house, it seems very likely that this chip will be designed for the Mac, not simply a beefed-up iPhone chip.

An A-series chip means an ARM-based design. There are no ARM-compliant designs that will come remotely near AMD's designs, and it's going to get more brutally obvious this August, never mind August 2021 when Zen 4 arrives.
  • Reply 126 of 162
Xed Posts: 2,785 member
mdriftmeyer said:
An A-series chip means an ARM-based design. There are no ARM-compliant designs that will come remotely near AMD's designs, and it's going to get more brutally obvious this August, never mind August 2021 when Zen 4 arrives.
That makes no sense. Apple has many "ARM-based designs" besides the A-series used in their iPhones and iPads.
  • Reply 127 of 162
michelb76 Posts: 676 member
Can't be more bumpy than 10.15 has been so far.
  • Reply 128 of 162
GeorgeBMac Posts: 11,421 member
    Xed said:
    Xed said:
    The concern with running Windows is very, very real.
    When I bought my grandson a MacBook Air for Christmas (because he asked for one) I made sure that it could run Windows if needed (I knew I would have to get a larger SSD, so I made sure to get a new 2017 MBA where it wasn't soldered in and could be upgraded).

    And, sure enough, after a month or two he stopped using it because he couldn't deal with MacOS -- it had too many differences from the Chrome and Windows that he knew and understood and just got frustrated with the Mac.

So, I got a 500GB Apple SSD and a product key for Windows 10 and got them installed just in time for the schools to close and shift to cyber school.

    Both he and his mom are very grateful that he doesn't have to fight with MacOS and can just do his school work.
That isn't the norm. Pretty much everyone using a Mac is using macOS. Only on tech sites will you find a vocal minority, but still clearly a minority, that needs to use Windows as a dual boot or VM.

But none of this matters for those that do need or want to run Windows, because like your purchase of an older Mac that has a removable SSD, people will simply buy the Mac HW that suits their needs. Pro Macs will surely be Intel for many years to come.

Even if and when Apple is no longer making new Intel Macs you will still be able to buy older Macs, just like you did, to get the HW options you wanted… but I bet by the time this comes to pass it won't even be an issue for all but a handful of pertinacious people on this forum.

    No, running Windows on a MacBook is not the norm.  But then Mac users aren't the norm either.  Windows is the norm.

    And, no, I did not buy an "old" Mac (meaning used).   This was a brand new machine that I scooped up partly because it was reduced $300 and partly because it had better hardware than the new ones.
You can't argue that because running Windows on a Mac isn't the norm and Macs themselves aren't the norm, running Windows on a Mac is the norm. It doesn't work that way!

You said a 2017 MacBook Air. That's an old Mac! That's not even one of the two Retina MacBook Airs that have come out since that MacBook was discontinued. Again, you bought out-of-date HW because the current HW didn't suit your needs… and that's exactly what the very tiny group of people still holding onto needing an Intel Mac will do when Intel Macs are no longer available in many, many years. At least you have a legitimate reason for wanting a Mac that is several years old with considerably inferior HW, just as people did many years ago when the Mac mini started coming with soldered RAM.


LOL... Sorry, but Windows is the norm. Deal with it.

And, as I said twice: the MacBook was bought new in December -- with 3 years of AppleCare+ (you can't get that on a used machine). Call it "old" if you want. But, to me, new is new and old is old. I guess that's another thing you'll just have to deal with. Is it "inferior hardware"? Apple didn't think so in 2020 when they scrapped the crappy butterfly keyboards and went back to the scissor style used in the 2011-2017 MBAs. They made a mistake and corrected it.

    edited April 2020
  • Reply 129 of 162
GeorgeBMac Posts: 11,421 member
    larryjw said:
Rip the bandaid off. Legacy platforms need to upgrade or die off. It's a harsh and expensive stance to have, but it needs to happen. Software needs to be agile to stay relevant; if not, it needs to die. I'm tired of all these "can't happen, I need legacy software" users. Technology moves too fast to wait for you, develop accordingly.
Legacy systems are not necessarily bad, but they need to be seen as living objects requiring "refactoring" to make improvements. My favorite saying in software is "A day without refactoring is a day without sunshine". If you were ever into Agile programming, this phrase should be familiar to you.

We're seeing demand for Cobol programmers to handle legacy systems for payroll, unemployment compensation, etc., due to the economic collapse brought on by mismanagement of the response to coronavirus.

    In my limited experience, the failure with Cobol has been that the legacy systems were initially written for the first version of Cobol pulling data from punch cards. Then they got updated, minimally, to pull data from magnetic tape, which looked and behaved like punch cards. When hard disks became available, they pulled data from hard disk files which looked like punch cards. When relational databases came on line, the programs pulled data from DB2 like it was punch cards. In the latter case, much Cobol code could be replaced by some simple SQL, replacing hundreds of lines of code with a handful of SQL. Some improvements might have been made for Y2K, but it seems minimal. 

    When newer versions of Cobol were released, with improved features, managers and programmers coded new programs using the same style and architectures of the original Cobol, and typically refused to use the newer features. And, they could have, at the same time, documented the code for later programmers. 

Doing what is necessary to keep legacy code alive is not a failure of the legacy code, but a failure of the willingness of managers and programmers to improve code that works, to make the code work better, or simply to improve the design without changing its functionality.

    Continuous improvement, a la Deming, is neither hard nor costly, but always taking shortcuts is. 

    My experience was very much different:
    COBOL was easily able to keep up with newer technologies like relational databases -- and it did.  In fact, the COBOL infrastructure that those applications were built on is considered today to be new and revolutionary.  It's called:   Cloud Based Computing!

    Instead, COBOL applications were considered mission critical and through both necessity and history were wrapped in expensive shields of data security, software security, backups and multiple fail-safe solutions -- including high levels of maintainability.

    Those applications are alive and well today because of all the fail safes built into them.

    Conversely, PC and eventually server based applications were able to be built much more quickly and cheaply -- mostly because they did not include the failsafes built into the mission critical COBOL applications.  But, in addition, because they could be built by cheap, less highly trained coders.  So, for management mostly concerned with quarterly bonuses, the quick cheap PC based solutions were highly attractive.
    cgWerks
  • Reply 130 of 162
canukstorm Posts: 2,729 member
GeorgeBMac said:
    My experience was very much different:
    COBOL was easily able to keep up with newer technologies like relational databases -- and it did.  In fact, the COBOL infrastructure that those applications were built on is considered today to be new and revolutionary.  It's called:   Cloud Based Computing!

    Instead, COBOL applications were considered mission critical and through both necessity and history were wrapped in expensive shields of data security, software security, backups and multiple fail-safe solutions -- including high levels of maintainability.

    Those applications are alive and well today because of all the fail safes built into them.

    Conversely, PC and eventually server based applications were able to be built much more quickly and cheaply -- mostly because they did not include the failsafes built into the mission critical COBOL applications.  But, in addition, because they could be built by cheap, less highly trained coders.  So, for management mostly concerned with quarterly bonuses, the quick cheap PC based solutions were highly attractive.
So which category does DevOps fall into?
  • Reply 131 of 162
Xed Posts: 2,785 member
GeorgeBMac said:
LOL... Sorry, but Windows is the norm. Deal with it.

And, as I said twice: the MacBook was bought new in December -- with 3 years of AppleCare+ (you can't get that on a used machine). Call it "old" if you want. But, to me, new is new and old is old. I guess that's another thing you'll just have to deal with. Is it "inferior hardware"? Apple didn't think so in 2020 when they scrapped the crappy butterfly keyboards and went back to the scissor style used in the 2011-2017 MBAs. They made a mistake and corrected it.
That's a weird comment for an Apple-centric website. I guess I can expect you to bitch and moan once Apple starts releasing ARM Macs without Boot Camp support.
  • Reply 132 of 162
GeorgeBMac Posts: 11,421 member
Xed said:
That's a weird comment for an Apple-centric website. I guess I can expect you to bitch and moan once Apple starts releasing ARM Macs without Boot Camp support.

    Weird?   No, just reality.
    ... But then these days, for some, it seems reality can be anything they want it to be.
    edited April 2020 cgWerks
  • Reply 133 of 162
Maybe I am one of the 2% referenced in the article, but without x64-based software being able to run smoothly on the Mac, I (and my small firm) will have to migrate back to Lenovo laptops. We have third-party proprietary software that only runs on Windows, so we use a combination of Parallels and Boot Camp to run this limited software. Hopefully, if macOS does shift to ARM, there will be some sort of efficient x64 emulation -- but I am not holding my breath. Not to mention I will lament the loss of being able to play PC games via Boot Camp on my MacBook Pro. Eh, I will still use my iOS and tvOS devices just fine.
    GeorgeBMac
  • Reply 134 of 162
    bsimpsen said:
    horvatic said:
    Unless they can keep all the features that are currently in place including Bootcamp it would be the worst mistake Apple could do.
    Let me rephrase this for you...

    Unless Apple can keep that 2% of Bootcamp users, attracting 10% more people to the Mac would be the worst mistake Apple could do.

    In articles here and elsewhere in recent years, I've read that the cost of an Intel CPU in an entry level MacBook Pro is in the vicinity of $200, or as much as 20% of the cost to build the machine. The cost of the A family CPU in an iPad Pro is about $50, and outperforms the Intel part. I imagine Apple will pocket some of the $150 savings and pass the rest to the customer, along with the performance improvement (particularly battery life). Even if the price and performance gaps are smaller than rumored, they're still going to be significant.

    If you think that improvement in the price/performance ratio for Macs won't attract more people to the platform than Apple stands to lose (2%max), I'd like you to explain why.

You make a reasonable point, but my only comment is that I would bet (I obviously don't have the data to support this -- only Apple does) that the average Boot Camp user of a Mac is: (1) buying more expensive, higher-margin Macs; (2) much more technically sophisticated and a die-hard Apple fan (or they would just buy a Windows PC and forget about it); and (3) much more likely to be higher income and own many other higher-margin Apple products.

As I stated above, I am an avid Boot Camp user. I also have over 10 Apple TVs (and we frequently buy Apple content on them) at my houses; I have 4 iPads, multiple Macs, an Apple Watch, and many sets of AirPods in my family. Not to mention the firm that I own uses Macs (but we NEED Boot Camp functionality).

The counterpoint, though, is that even if I have to stop using Mac computers, I will probably still continue to use all of the other (higher-margin) Apple products, so maybe Apple isn't that concerned.
    cgWerks
  • Reply 135 of 162
Xed Posts: 2,785 member
Maybe I am one of the 2% referenced in the article, but without x64-based software being able to run smoothly on the Mac, I (and my small firm) will have to migrate back to Lenovo laptops. We have third-party proprietary software that only runs on Windows, so we use a combination of Parallels and Boot Camp to run this limited software. Hopefully, if macOS does shift to ARM, there will be some sort of efficient x64 emulation -- but I am not holding my breath. Not to mention I will lament the loss of being able to play PC games via Boot Camp on my MacBook Pro. Eh, I will still use my iOS and tvOS devices just fine.
Run the numbers. Do you think Apple cares about losing you to Lenovo a decade from now when they can make better Macs at a lower cost and/or higher profit margin, all on a schedule based on what Apple wants? All this works in Apple's favor, and by the time you're faced with the decision of having to move to an ARM Mac I'm betting this won't be a concern for you.
  • Reply 136 of 162
cgWerks Posts: 2,952 member
Maybe I am one of the 2% referenced in the article, but without x64-based software being able to run smoothly on the Mac, I (and my small firm) will have to migrate back to Lenovo laptops. We have third-party proprietary software that only runs on Windows, so we use a combination of Parallels and Boot Camp to run this limited software. Hopefully, if macOS does shift to ARM, there will be some sort of efficient x64 emulation -- but I am not holding my breath. Not to mention I will lament the loss of being able to play PC games via Boot Camp on my MacBook Pro. Eh, I will still use my iOS and tvOS devices just fine.
I'm in a similar camp. But, I think that 2% is baloney. My hunch is that it covers Boot Camp only, not x86 reliance.
  • Reply 137 of 162
Mike Wuerthele Posts: 6,910 administrator
    cgWerks said:
I'm in a similar camp. But, I think that 2% is baloney. My hunch is that it covers Boot Camp only, not x86 reliance.
    No hunch required -- this is addressed in the article.

    FTA: "According to AppleInsider data gleaned from our relationships with service departments, approximately 2% of Macs brought in for servicing at Apple have Windows installed on Boot Camp."
    fastasleep
  • Reply 138 of 162
Xed Posts: 2,785 member
Mike Wuerthele said:
    No hunch required -- this is addressed in the article.

    FTA: "According to AppleInsider data gleaned from our relationships with service departments, approximately 2% of Macs brought in for servicing at Apple have Windows installed on Boot Camp."
My hunch is that Boot Camp is an atypical setup and is not the core reason why Macs are sold at all, like some here believe, but I will say that when I have had to send my Mac in for servicing I have either removed the drive (and replaced it with another that is blank or has a basic install) or wiped the drive (after backing up) to help protect its contents. This would mean that my machine wouldn't be counted in that 2%… but it really shouldn't be, since I always set up Windows as an option with a new Mac but then NEVER use it. But even if you did count people like me I doubt it would move the needle from that 2%.
    cgWerks
  • Reply 139 of 162
cgWerks Posts: 2,952 member
    Mike Wuerthele said:
    No hunch required -- this is addressed in the article.

    FTA: "According to AppleInsider data gleaned from our relationships with service departments, approximately 2% of Macs brought in for servicing at Apple have Windows installed on Boot Camp."
    Yeah, but that's what I mean.
Not that it's baloney in the sense of inaccurate data (as far as it goes), but in what it means for this discussion. So, say the 2% is accurate for Boot Camp. Then what about the additional X% (6, 12, 23?) that run Windows via Parallels, VMware, etc.? We'd have to know both and add them together for it to be relevant. If anything, counting only Boot Camp and then making a point based on that might lead us to the wrong conclusion.
  • Reply 140 of 162
Mike Wuerthele Posts: 6,910 administrator
    cgWerks said:
    Yeah, but that's what I mean.
Not that it's baloney in the sense of inaccurate data (as far as it goes), but in what it means for this discussion. So, say the 2% is accurate for Boot Camp. Then what about the additional X% (6, 12, 23?) that run Windows via Parallels, VMware, etc.? We'd have to know both and add them together for it to be relevant. If anything, counting only Boot Camp and then making a point based on that might lead us to the wrong conclusion.
Boot Camp is the most friction-free of those solutions for most users, and delivers the most performance. Even if you assume 5x as many people are using other virtualization solutions, that's still only about one in ten.

I do have other data, but it is a small data set. It isn't even close to 5x the number of Boot Camp users using virtualization solutions; it's not even another 2%. But, for the sake of argument, I'm happy to use the 5x number.

    Will a shift to ARM antagonize these people? Sure.

Is it anywhere near a majority? Not even remotely.

Will it disproportionately affect AppleInsider readers versus the larger Apple consumer and enterprise user base? Also yes. The "but what about me?" argument that gets tossed around here is valid for that use case, and that use case alone. It isn't valid for the larger user base, including enterprise, and that gets forgotten a lot around here.
    edited April 2020