Apple's shift to ARM Mac at WWDC will define a decade of computing

Comments

  • Reply 21 of 75
    elijahg Posts: 2,759 member
    elijahg said:
    mjtomlin said:
    Rayz2016 said:
    So if only 2% of Mac users are dual booting, then it’s possible that Apple isn’t too fussed about running Windows. 

    I don't think Apple gives a rats ass about the virtualization crowd... they can either buy a WIntel machine or continue using their current Mac for it. Apple's not going to hold its future roadmap hostage over this single issue.
    That's my issue with the modern Apple, around macOS at least. There're so many groups of people they supposedly don't give a rat's ass about that eventually most people that use their Mac for more than a Facebook machine fall into one of those groups. There are a lot of these "single issues" that Apple shouldn't give a crap about according to forum-goers here, but all those single issues add up. I seem to be getting hit by a lot of them right now. They can stomp over their iOS user base and no one cares because iOS devices don't need much compatibility or openness, and the size of the iOS market means it's worth devs keeping up. The same can't be said for macOS. Power users on Macs don't like being shat on by the company they've supported for several decades just because it's convenient for Apple to do so. The machines are getting more and more expensive but with less and less software and hardware features. And as much as I hate to say it, Windows is nowhere near as terrible as it used to be.
    I don't think Apple is trying to be all things to all people.  I could be wrong but I think the core markets for their Macs are software developers (specifically developers that develop for Apple's ecosystem), creative professionals, enterprise market.
    The risk is they end up offering nothing more than a Facebook machine. Apple hates enterprise users, so that's another group they are quite happy to put in the bin. IBM are pretty much the only major company using Macs; the ridiculous debacle around FCP X (Apple arrogantly telling studios how their workflow should be) made lots of studios switch to Avid on a PC. The BBC used to be all Macs, now there're barely any. I use open source software that's essential for my job, a lot of which uses OpenGL. Apple is slowly nixing OpenGL support, and without OpenGL and without virtualisation, the open source software I use will no longer work on a Mac. And if that happens, when replacement time for my Mac comes around, I won't be replacing it with another Mac - despite being a devoted Mac user for 25 years. There will be a Windows PC sat on my desk instead.
  • Reply 22 of 75
    karmadave Posts: 369 member
    I'm still skeptical that Apple's ARM designs will 'replace' and not 'augment' Intel in the Macintosh. However, one thing ARM does give Apple is the ability to design its own iOS- and iPadOS-compatible CPUs without having to pay licensing fees or royalties to chip 'monopolies' such as Intel. IDK. Maybe Apple will use this as an opportunity to 're-imagine' the computing platform for the next 10-15 years. Certainly leveraging the mobile platform is a good start, and incorporating 5G into ALL their platforms could be very interesting as well. Like a lot of people, I will be watching the WWDC20 keynote with keen interest this year.
  • Reply 23 of 75
    swineone said:
    First of all, I could barely believe this is a DED piece; the only giveaway was its size, but still, I read through half of it before I went back to the start to check the author. A very welcome change from the usual rant-filled, foaming-at-the-mouth display of anger at anyone who doesn't unconditionally praise Apple. I may even read more articles by him in the future (usually, at the first sign of the above characteristics, I go back to the start, confirm it's a DED piece and close the tab).

    The transition to ARM has certainly been possible for a few years already, so one has to question what prompted it at this time, especially considering the decision certainly hasn't been made this year, or even last year (for those not familiar with it, it takes quite a few years to develop hardware; it's not like software where you can release a crappy version and update it later). Surely Intel's lack of progress in the last few years was a major motivator, but without an x86 emulation layer, the first couple of years are going to be rocky for the early adopter. I expect a repeat of the Intel transition where major apps (like Adobe's) took a while to be released. Now there's an extra issue with the loss of x86 virtualization, which despite what many people say, is going to be a factor to a large segment of the user base. Let's face it, there are many apps that are Windows only, from in-house apps to specialized apps such as a lot of stuff in the engineering field (e.g. SolidWorks, Altium Designer, etc.) Despite the general opinion, there are pro users who do things other than media editing.

    I'll go on a limb here and say that part of why Apple took so long is that they're perfecting an x86 emulation layer that's going to run, if not at close to native speed, fast enough to entice users of such software to stay on the Apple ecosystem. Additionally it's going to help with sales in the first few years, while certain important apps haven't yet migrated to ARM.

    Recall that Apple licensed the Rosetta layer for the PowerPC-to-Intel migration, but it was a much smaller company back then. They're huge now, and they even have their own compiler infrastructure in the form of LLVM; the expertise there might help to develop such an emulation layer.

    In closing, I wager a part of the announcement is going to be an x86 emulation layer with never-seen-before performance. I guess we'll see soon enough.
    This move has been in the works for over a decade, since development began on the A-series chips. Unlike the transition to Intel, Apple already has a robust hardware and software platform running on ARM: iPad, iPhone, Apple TV, Apple Watch and HomePod.

    I don't see x86 playing much of a role in their priorities. Expect to be redirected to third-party virtualization software like Parallels and VMware, which is where I already run Windows for basic PC technical support.

    The only thing in doubt at this point is how far along they are on scaling ARM up for the desktop. The A14 already matches a 2020 13" MBP (10th-gen quad-core i7) in speed, so it's probably safe to assume they've already scaled that up to replace the Intel chips in every Mac, with the possible exception of the Mac Pro (potentially using multiple chips in that machine). I would also expect them to eventually scale up the graphics to eliminate the need for third-party GPUs below the Mac Pro, the only Mac with PCIe slots. If anything is certain, it's that Apple wants to bring all of the hardware development in-house.

    As with their last two processor transitions, I would expect to see a full conversion to ARM within five years. The new hardware will likely come sooner rather than later and then accelerate, but OS support will continue for both platforms until the Intel user base has hit vintage status. As for third-party developers, I would expect Microsoft Office and Adobe Creative Suite within the first year, along with Apple's full line of software.
  • Reply 24 of 75
    karmadave said:
    I'm still skeptical that Apple's ARM designs will 'replace' and not 'augment' Intel in the Macintosh.
    What are you trying to say? Augment? The T2 chip has probably taken that as far as it can go, unless they are going to make ARM dGPUs.

    Remaining on Intel is a curse and a blessing. It's also one of the reasons a lot of software that should be on the Mac is still Windows-only.

    If A-series benchmarks are right, Apple could get double the performance of a 2020 13" MBP quad-core i7 within the same thermal and power envelope as the current machine, probably for less than they are paying Intel now. That would make a compelling case for consumers, businesses and developers. They could even offer replaceable CPUs and GPUs for the high-end configurations, or just the high-end desktops.

    Whatever they do, it will have to be compelling. Apple certainly has the means now that they dwarf Intel.
  • Reply 25 of 75
    canukstorm Posts: 2,700 member
    elijahg said:
    elijahg said:
    mjtomlin said:
    Rayz2016 said:
    So if only 2% of Mac users are dual booting, then it’s possible that Apple isn’t too fussed about running Windows. 

    I don't think Apple gives a rats ass about the virtualization crowd... they can either buy a WIntel machine or continue using their current Mac for it. Apple's not going to hold its future roadmap hostage over this single issue.
    That's my issue with the modern Apple, around macOS at least. There're so many groups of people they supposedly don't give a rat's ass about that eventually most people that use their Mac for more than a Facebook machine fall into one of those groups. There are a lot of these "single issues" that Apple shouldn't give a crap about according to forum-goers here, but all those single issues add up. I seem to be getting hit by a lot of them right now. They can stomp over their iOS user base and no one cares because iOS devices don't need much compatibility or openness, and the size of the iOS market means it's worth devs keeping up. The same can't be said for macOS. Power users on Macs don't like being shat on by the company they've supported for several decades just because it's convenient for Apple to do so. The machines are getting more and more expensive but with less and less software and hardware features. And as much as I hate to say it, Windows is nowhere near as terrible as it used to be.
    I don't think Apple is trying to be all things to all people.  I could be wrong but I think the core markets for their Macs are software developers (specifically developers that develop for Apple's ecosystem), creative professionals, enterprise market.
    The risk is they end up offering nothing more than a Facebook machine. Apple hates enterprise users, so that's another group they are quite happy to put in the bin. IBM are pretty much the only major company using Macs; the ridiculous debacle around FCP X (Apple arrogantly telling studios how their workflow should be) made lots of studios switch to Avid on a PC. The BBC used to be all Macs, now there're barely any. I use open source software that's essential for my job, a lot of which uses OpenGL. Apple is slowly nixing OpenGL support, and without OpenGL and without virtualisation, the open source software I use will no longer work on a Mac. And if that happens, when replacement time for my Mac comes around, I won't be replacing it with another Mac - despite being a devoted Mac user for 25 years. There will be a Windows PC sat on my desk instead.
    OpenGL is done on the Mac.  You can bank on that.
  • Reply 26 of 75
    swineone said:

    I'll go on a limb here and say that part of why Apple took so long is that they're perfecting an x86 emulation layer that's going to run, if not at close to native speed, fast enough to entice users of such software to stay on the Apple ecosystem. Additionally it's going to help with sales in the first few years, while certain important apps haven't yet migrated to ARM.

    Recall that Apple licensed the Rosetta layer for the PowerPC-to-Intel migration, but it was a much smaller company back then. They're huge now, and they even have their own compiler infrastructure in the form of LLVM; the expertise there might help to develop such an emulation layer.

    In closing, I wager a part of the announcement is going to be an x86 emulation layer with never-seen-before performance. I guess we'll see soon enough.
    Hmmm. Howsabout doing x86/x64 translation on the fly with an FPGA (or ASIC), similar to the Afterburner graphics accelerator in the Mac Pro?

    Plus, Apple’s pretty chummy with AMD (who has a cross license with Intel for x86 compatibility). Perhaps they could contract them to design such a lobotomized x86/x64 “coprocessor” for use in an Arm system. Apple could easily provide whatever low-level interconnection bus the two agree upon.

    Since Apple has an Architecture-Class Arm license, they could even extend the Arm instruction set to add instructions to enter/leave x86/64 mode, etc. Those instructions would be used by the OS for low-latency “environment switching”.

    Such a system might actually have full x86/x64 compatibility, at “hardware” speeds...
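    For anyone curious what the software flavour of this looks like, here's a toy sketch in C (not anything Apple has announced) of the kind of per-instruction mapping an x86-to-Arm translation layer performs. Real translators like the original Rosetta JIT-translate whole blocks and handle register allocation, x86 flag semantics, memory ordering and caching; the two opcodes and the eax-to-w0 register mapping below are purely illustrative assumptions.

        /* Toy illustration of x86-to-Arm instruction translation.
         * Real binary translators handle register mapping, x86 flag
         * semantics, memory ordering and translation caching; this just
         * maps two simple encodings to AArch64 mnemonics to show the idea. */
        #include <stdio.h>
        #include <stdint.h>
        #include <stddef.h>

        /* Translate one x86 instruction starting at code[i];
         * returns the number of bytes consumed, or 0 if unrecognized. */
        static size_t translate_one(const uint8_t *code, size_t i)
        {
            switch (code[i]) {
            case 0x90:                        /* x86: nop */
                printf("nop\n");              /* AArch64: nop */
                return 1;
            case 0x05: {                      /* x86: add eax, imm32 */
                uint32_t imm = code[i+1] | code[i+2] << 8 |
                               code[i+3] << 16 | (uint32_t)code[i+4] << 24;
                printf("add w0, w0, #%u\n", imm);  /* eax mapped to w0 (arbitrary choice) */
                return 5;
            }
            default:
                return 0;                     /* a real layer would fall back to an interpreter */
            }
        }

        int main(void)
        {
            const uint8_t code[] = { 0x90, 0x05, 0x2a, 0x00, 0x00, 0x00 }; /* nop; add eax, 42 */
            for (size_t i = 0; i < sizeof code; ) {
                size_t n = translate_one(code, i);
                if (n == 0) break;
                i += n;
            }
            return 0;
        }

    Whether that mapping happens in software, in an FPGA, or via extra instructions as suggested above, the hard part is the same: doing it fast while preserving x86 semantics exactly.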


  • Reply 27 of 75
    nubus Posts: 382 member
    This is going to be a long road to nowhere. Apple presented the Intel Mac in 2005 - it took Microsoft 6 years to release a native "Cocoa" version of Office which was truly faster than anything before. The same for Adobe for Photoshop... even Apple Final Cut Pro was first released in a fully native version in 2011. It took Apple 6 years. Before that we had "Carbon" which brought compatibility but with limited performance improvements compared to running on PowerPC.
     
    The benefit of x86 is that we stopped discussing the CPU when comparing to Windows. Before that, Apple had to battle "the megahertz myth" and being incompatible. With Intel the Mac became a question of quality/design/integration, not about the CPU or big-endian vs. little-endian, etc. Apple could move us to AMD Ryzen and get reduced power consumption and faster systems while keeping us on x86. Apple could move more functionality to a T2 to make an "x86+" platform, or make their own GPU with more AI, without doing a full CPU switch.

    And why bother? Apple has been on 4-year update cycles for all their desktops and did 5 years of butterfly keyboards moving us from noisy to sticky. Apple has long lost the passion for the Mac. These days even FileMaker is getting more love and updates from Apple than most Macs. By moving the Mac to ARM they will kill the Hackintosh, make an excuse for hiding the Unix part, block us from installing our own software, and then merge iPad and Mac. Microsoft used to work like that.

    What I would like to see is a platform that solves how knowledge workers work now, not in 1984. That should be the thing for Apple to solve. But they didn't care about keyboards for 5 years, removed docking stations, removed MagSafe (do people at Apple sit at their desks all day?), and have given up on helping users manage loads of documents (the Finder and the desktop idea were designed for 400K disks and local storage) - and running web applications is just no better on a Mac. The CPU won't solve any of that.

    So why are they doing it? Probably someone asked "what would Steve do"... but remember that the first architectural change wasn't by Jobs - he was the one that decided that for Mac the CPU shouldn't be the thing to discuss and moved to x86.
  • Reply 28 of 75
    elijahg said:
    elijahg said:
    mjtomlin said:
    Rayz2016 said:
    So if only 2% of Mac users are dual booting, then it’s possible that Apple isn’t too fussed about running Windows. 

    I don't think Apple gives a rats ass about the virtualization crowd... they can either buy a WIntel machine or continue using their current Mac for it. Apple's not going to hold its future roadmap hostage over this single issue.
    That's my issue with the modern Apple, around macOS at least. There're so many groups of people they supposedly don't give a rat's ass about that eventually most people that use their Mac for more than a Facebook machine fall into one of those groups. There are a lot of these "single issues" that Apple shouldn't give a crap about according to forum-goers here, but all those single issues add up. I seem to be getting hit by a lot of them right now. They can stomp over their iOS user base and no one cares because iOS devices don't need much compatibility or openness, and the size of the iOS market means it's worth devs keeping up. The same can't be said for macOS. Power users on Macs don't like being shat on by the company they've supported for several decades just because it's convenient for Apple to do so. The machines are getting more and more expensive but with less and less software and hardware features. And as much as I hate to say it, Windows is nowhere near as terrible as it used to be.
    I don't think Apple is trying to be all things to all people.  I could be wrong but I think the core markets for their Macs are software developers (specifically developers that develop for Apple's ecosystem), creative professionals, enterprise market.
    The risk is they end up offering nothing more than a Facebook machine. Apple hates enterprise users, so that's another group they are quite happy to put in the bin. IBM are pretty much the only major company using Macs; the ridiculous debacle around FCP X (Apple arrogantly telling studios how their workflow should be) made lots of studios switch to Avid on a PC. The BBC used to be all Macs, now there're barely any. I use open source software that's essential for my job, a lot of which uses OpenGL. Apple is slowly nixing OpenGL support, and without OpenGL and without virtualisation, the open source software I use will no longer work on a Mac. And if that happens, when replacement time for my Mac comes around, I won't be replacing it with another Mac - despite being a devoted Mac user for 25 years. There will be a Windows PC sat on my desk instead.

    I thought this was a discussion regarding a CPU switch. Apple dropping OpenGL (OpenGL ES on Arm) support has absolutely nothing to do with OpenGL.

    Same thing with virtualization. Arm chips do support hypervisor-assisted virtualization, so that's yet another strawman.

    https://developer.arm.com/docs/100942/latest/aarch64-virtualization
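    For what it's worth, macOS already exposes hardware-assisted virtualization to developers through the Hypervisor framework on Intel Macs, and the Arm EL2 support described at that link is the equivalent building block. Here's a minimal C sketch for checking whether the running Mac advertises that support; "kern.hv_support" is the sysctl key Apple's documentation mentions, but treat the exact key name as an assumption rather than gospel.

        /* Minimal macOS check for Hypervisor framework support.
         * Assumes the documented "kern.hv_support" sysctl key. */
        #include <stdio.h>
        #include <sys/types.h>
        #include <sys/sysctl.h>

        int main(void)
        {
            int hv_support = 0;
            size_t len = sizeof hv_support;
            if (sysctlbyname("kern.hv_support", &hv_support, &len, NULL, 0) != 0) {
                perror("sysctlbyname");   /* key missing or not supported */
                return 1;
            }
            printf("Hypervisor support: %s\n", hv_support ? "yes" : "no");
            return 0;
        }

    What it can't tell you, of course, is whether an x86 guest OS will run on an Arm host; hardware virtualization only virtualizes the instruction set you already have.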
  • Reply 29 of 75
    GeorgeBMac Posts: 11,421 member
    Good article!  Two thoughts:
    1)   The 90's were both highly successful and a disaster for personal computing.   The successes are now obvious.  The disaster is that inferior technologies like Windows and x86 processors prevailed (mostly through smart marketing efforts) over superior alternatives that have since died off.

    2)  As the article points out, Apple has successfully implemented a program to design their own chips and SOCs -- and that that is both very difficult and very expensive.   But, once begun it must be continued -- even though it will continue to be both very difficult and very expensive.   The danger lies in the possibility of Apple following other U.S. companies and focusing on stockholder well being over product well being.  If this program is stripped of funding (even a reduction) in order to provide better returns to stockholders, it could take down the whole company -- turn it into another RCA.
  • Reply 30 of 75
    tobian Posts: 151 member
    I believe Apple was never interested in making Intel Macs capable of running Windows; the only reason for Boot Camp was the inevitability of people trying it, so they wanted to at least control it (which is fine, since you can fry your Mac using custom drivers on Windows!). Once the x86 code goes away, Boot Camp goes away with it. Windows on ARM is useless because of its incompatibility with x86 binaries.
    elijahg said:
    Apple has never really liked gaming for some reason, which is odd since right from the Apple // days their computers were used to create some really innovative and popular games. The Oregon Trail, Marathon, Quake, Starcraft/Warcraft/WoW/Diablo to name a few that were created on MacOS or released with parity to their PC counterparts. I think Apple's almost embarrassed that gaming is so popular on iOS, which is an absolute contrast to gaming on the Mac. Inb4 people compare Apple Arcade games to immersive feature-length AAA titles, they're not even in the same ballpark. As noted previously, Valve has abandoned SteamVR for Mac, because Apple seems to be doing everything possible to put devs off gaming on the Mac.
    Steve's Apple never really liked gaming for exactly that reason - it's a waste of time. Steve admired creativity, education, the arts, anything that moves the world forward. That's not the case with gaming, nor with any other entertainment that demands your FULL attention (unlike music, which can be inspiring or calming while you DO something).
    The famous Halo presentation at the 1999 Macworld keynote was just for the sake of attracting publicity for the Mac; otherwise Steve had no respect for gaming, especially for feature-length AAA titles that have you creating nothing but emotions for days (or weeks!). Candy Crush-style games - those are something for creatives to take a few minutes' break with... and then back to work.

    I believe Steve felt great sorrow at how his iDevices vision materialised into mostly entertainment, something he didn't stand for. He would never have gone for Arcade, TV+, or Card. I personally view them as a digression as well.
  • Reply 31 of 75
    nubus Posts: 382 member
    2)  As the article points out, Apple has successfully implemented a program to design their own chips and SOCs -- and that that is both very difficult and very expensive.   But, once begun it must be continued -- even though it will continue to be both very difficult and very expensive.   The danger lies in the possibility of Apple following other U.S. companies and focusing on stockholder well being over product well being.  If this program is stripped of funding (even a reduction) in order to provide better returns to stockholders, it could take down the whole company -- turn it into another RCA.
    Apple will hit more problems by doing this. Intel stalled due to engineering problems. Microsoft did Vista - and not due to lack of funding. It can happen to anyone - Apple failed on keyboards for 5 years, and on the Mac Pro for 10. Can't blame Microsoft for any of that. This won't make MS, or Apple the next RCA or Xerox, but is it worth taking the risk? If Apple truly cared about the Mac CPU then they would do more updates to their products - the CPU is such a non-issue at this point.
  • Reply 32 of 75
    elijahg Posts: 2,759 member
    elijahg said:
    elijahg said:
    mjtomlin said:
    Rayz2016 said:
    So if only 2% of Mac users are dual booting, then it’s possible that Apple isn’t too fussed about running Windows. 

    I don't think Apple gives a rats ass about the virtualization crowd... they can either buy a WIntel machine or continue using their current Mac for it. Apple's not going to hold its future roadmap hostage over this single issue.
    That's my issue with the modern Apple, around macOS at least. There're so many groups of people they supposedly don't give a rat's ass about that eventually most people that use their Mac for more than a Facebook machine fall into one of those groups. There are a lot of these "single issues" that Apple shouldn't give a crap about according to forum-goers here, but all those single issues add up. I seem to be getting hit by a lot of them right now. They can stomp over their iOS user base and no one cares because iOS devices don't need much compatibility or openness, and the size of the iOS market means it's worth devs keeping up. The same can't be said for macOS. Power users on Macs don't like being shat on by the company they've supported for several decades just because it's convenient for Apple to do so. The machines are getting more and more expensive but with less and less software and hardware features. And as much as I hate to say it, Windows is nowhere near as terrible as it used to be.
    I don't think Apple is trying to be all things to all people.  I could be wrong but I think the core markets for their Macs are software developers (specifically developers that develop for Apple's ecosystem), creative professionals, enterprise market.
    The risk is they end up offering nothing more than a Facebook machine. Apple hates enterprise users, so that's another group they are quite happy to put in the bin. IBM are pretty much the only major company using Macs; the ridiculous debacle around FCP X (Apple arrogantly telling studios how their workflow should be) made lots of studios switch to Avid on a PC. The BBC used to be all Macs, now there're barely any. I use open source software that's essential for my job, a lot of which uses OpenGL. Apple is slowly nixing OpenGL support, and without OpenGL and without virtualisation, the open source software I use will no longer work on a Mac. And if that happens, when replacement time for my Mac comes around, I won't be replacing it with another Mac - despite being a devoted Mac user for 25 years. There will be a Windows PC sat on my desk instead.

    I thought this was a discussion regarding a CPU switch. Apple dropping OpenGL (OpenGL ES on Arm) support has absolutely nothing to do with OpenGL.

    Same thing with virtualization. Arm chips do support hypervisor-assisted virtualization, so that's yet another strawman.

    https://developer.arm.com/docs/100942/latest/aarch64-virtualization
    My reply was in the context of Apple dropping support for more and more features because "only n% of people use it", but there being so many features dropped that eventually most people are affected. "Apple dropping OpenGL support has nothing to do with OpenGL" what?

    Yes, you can virtualise an OS that uses ARM's instruction set, but that doesn't include the mainstream version of Windows. The ARM version, maybe - but the ARM version of Windows runs almost no Windows programs.
  • Reply 33 of 75
    razorpit Posts: 1,796 member
    elijahg said:
    Rayz2016 said:
    So if only 2% of Mac users are dual booting, then it’s possible that Apple isn’t too fussed about running Windows. 
    That 2% can be a comfort to people who jump ship but who aren't sure about macOS still. Also, what about virtualisation for Windows? I bet a lot more than 2% of people use virtualisation than dual-booting. 
    Agreed. This article should include the number of people that run Parallels, VMware, and VirtualBox. Other than being a hardcore gamer, I can't think of a reason to dual boot. I can run SolidWorks in Parallels, so why would I want to cut off access to all of my Mac applications by dual booting when I don't need to?
  • Reply 34 of 75
    elijahg Posts: 2,759 member

    tobian said:
    I believe Apple was never interested in making Intel Macs capable of running Windows; the only reason for Boot Camp was the inevitability of people trying it, so they wanted to at least control it (which is fine, since you can fry your Mac using custom drivers on Windows!). Once the x86 code goes away, Boot Camp goes away with it. Windows on ARM is useless because of its incompatibility with x86 binaries.
    elijahg said:
    Apple has never really liked gaming for some reason, which is odd since right from the Apple // days their computers were used to create some really innovative and popular games. The Oregon Trail, Marathon, Quake, Starcraft/Warcraft/WoW/Diablo to name a few that were created on MacOS or released with parity to their PC counterparts. I think Apple's almost embarrassed that gaming is so popular on iOS, which is an absolute contrast to gaming on the Mac. Inb4 people compare Apple Arcade games to immersive feature-length AAA titles, they're not even in the same ballpark. As noted previously, Valve has abandoned SteamVR for Mac, because Apple seems to be doing everything possible to put devs off gaming on the Mac.
    Steve's Apple never really liked gaming for exactly that reason - it's a waste of time. Steve admired creativity, education, the arts, anything that moves the world forward. That's not the case with gaming, nor with any other entertainment that demands your FULL attention (unlike music, which can be inspiring or calming while you DO something).
    The famous Halo presentation at the 1999 Macworld keynote was just for the sake of attracting publicity for the Mac; otherwise Steve had no respect for gaming, especially for feature-length AAA titles that have you creating nothing but emotions for days (or weeks!). Candy Crush-style games - those are something for creatives to take a few minutes' break with... and then back to work.

    I believe Steve felt great sorrow at how his iDevices vision materialised into mostly entertainment, something he didn't stand for. He would never have gone for Arcade, TV+, or Card. I personally view them as a digression as well.
    I'm not going to say whether he would or wouldn't have done something, but people do have to view, watch and listen to art, movies and music, which is a "waste of time" just like games. And he was quite happy to provide software and machines for TV and film studios, ultimately enabling people to "waste time" watching movies and TV. What about movies and TV shows on iTunes? That's very much entertainment consumption. Same with the iPad: until very recently it was really only for consumption of media and entertainment.

    But gaming also drives GPU development, and GPUs are used for a lot more than graphics on Macs (not so much in Apple's mobile ecosystem, where they have their own ASICs).
  • Reply 35 of 75
    razorpit Posts: 1,796 member
    elijahg said:
    mjtomlin said:
    Rayz2016 said:
    So if only 2% of Mac users are dual booting, then it’s possible that Apple isn’t too fussed about running Windows. 

    I don't think Apple gives a rats ass about the virtualization crowd... they can either buy a WIntel machine or continue using their current Mac for it. Apple's not going to hold its future roadmap hostage over this single issue.
    That's my issue with the modern Apple, around macOS at least. There're so many groups of people they supposedly don't give a rat's ass about that eventually most people that use their Mac for more than a Facebook machine fall into one of those groups. There are a lot of these "single issues" that Apple shouldn't give a crap about according to forum-goers here, but all those single issues add up. I seem to be getting hit by a lot of them right now. They can stomp over their iOS user base and no one cares because iOS devices don't need much compatibility or openness, and the size of the iOS market means it's worth devs keeping up. The same can't be said for macOS. Power users on Macs don't like being shat on by the company they've supported for several decades just because it's convenient for Apple to do so. The machines are getting more and more expensive but with less and less software and hardware features. And as much as I hate to say it, Windows is nowhere near as terrible as it used to be.
    You sure about that last part? https://www.engadget.com/microsofts-windows-10-updates-printer-bugs-000112943.html  ;)
  • Reply 36 of 75
    elijahg Posts: 2,759 member
    nubus said:
    2)  As the article points out, Apple has successfully implemented a program to design their own chips and SOCs -- and that that is both very difficult and very expensive.   But, once begun it must be continued -- even though it will continue to be both very difficult and very expensive.   The danger lies in the possibility of Apple following other U.S. companies and focusing on stockholder well being over product well being.  If this program is stripped of funding (even a reduction) in order to provide better returns to stockholders, it could take down the whole company -- turn it into another RCA.
    Apple will hit more problems by doing this. Intel stalled due to engineering problems. Microsoft did Vista - and not due to lack of funding. It can happen to anyone - Apple failed on keyboards for 5 years, and on the Mac Pro for 10. Can't blame Microsoft for any of that. This won't make MS, or Apple the next RCA or Xerox, but is it worth taking the risk? If Apple truly cared about the Mac CPU then they would do more updates to their products - the CPU is such a non-issue at this point.
    You make several very good points. The iMacs were essentially abandoned between 2015 and 2019 (the iMac had a spec bump in 2017, and the 2019 one still has no T2 chip), and apart from minor attempts to fix the MacBook keyboards, the MacBooks saw little love between 2016 and 2020. And of course the Mac Pro went 6 years before an update. So given the lack of attention toward Macs, why all of a sudden is there a CPU switch, when the current CPUs are fast enough for 99% of tasks? I wager a big reason is so Apple can increase their profit margins on Macs, by using the already-developed and paid-for tech in the A-series chips rather than giving Intel $800 per high-end CPU.

    And yes, when Apple hits the same physics wall as Intel, what happens then? Obviously ARM is a much more computationally efficient architecture than x86 so all things being equal, ARM will beat raw x86, but Apple will have to keep pouring billions into their chip R&D to keep ahead of Intel. Right now they can say they're fastest, but if they lose that crown they'll be wasting billions to catch up when they could have just stuck with Intel.
  • Reply 37 of 75
    elijahg Posts: 2,759 member
    Another point no one seems to have factored in is that Intel has tons of extensions to x86 which are essentially never compared in benchmarks against ARM: SSE1/2/3, AVX, MMX, Quick Sync, etc. Lots of cross-platform software uses these extensions to speed things up, and that code can be ported directly from the huge market that is Windows to the smaller Mac market. Software that makes use of those extensions is much, much faster than software that sticks to the general x86 instructions, because they can chew through multiple data elements per instruction (or, in Quick Sync's case, offload the work to fixed-function hardware). If Apple switches, cross-platform devs aren't going to waste time re-optimising their software just to double its speed on the tiny Mac market; Mac users will just get an inferior experience, again.
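    Worth noting that Arm has its own SIMD extension (NEON), so ported software isn't starting from zero, but the intrinsics are different, which is exactly why cross-platform devs would have to spend time on it. A minimal sketch of the same four-wide float add written against both instruction sets (the function name and the scalar fallback are mine, purely illustrative):

        /* The same 4-wide float addition via SSE intrinsics (x86) and
         * NEON intrinsics (Arm), with a plain-C fallback. Real codebases
         * often hide this behind a portability layer or rely on the
         * compiler's auto-vectorization instead. */
        #include <stdio.h>

        #if defined(__SSE__)
        #include <xmmintrin.h>
        static void add4(const float *a, const float *b, float *out) {
            __m128 va = _mm_loadu_ps(a);            /* load 4 floats, unaligned */
            __m128 vb = _mm_loadu_ps(b);
            _mm_storeu_ps(out, _mm_add_ps(va, vb)); /* 4 adds in one instruction */
        }
        #elif defined(__ARM_NEON)
        #include <arm_neon.h>
        static void add4(const float *a, const float *b, float *out) {
            float32x4_t va = vld1q_f32(a);          /* load 4 floats */
            float32x4_t vb = vld1q_f32(b);
            vst1q_f32(out, vaddq_f32(va, vb));      /* 4 adds in one instruction */
        }
        #else
        static void add4(const float *a, const float *b, float *out) {
            for (int i = 0; i < 4; i++) out[i] = a[i] + b[i]; /* scalar fallback */
        }
        #endif

        int main(void)
        {
            const float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40};
            float out[4];
            add4(a, b, out);
            printf("%g %g %g %g\n", out[0], out[1], out[2], out[3]);
            return 0;
        }

    Whether developers bother to maintain both paths just for the Mac's share of the market is exactly the question.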
  • Reply 38 of 75
    elijahg Posts: 2,759 member
    razorpit said:
    elijahg said:
    mjtomlin said:
    Rayz2016 said:
    So if only 2% of Mac users are dual booting, then it’s possible that Apple isn’t too fussed about running Windows. 

    I don't think Apple gives a rats ass about the virtualization crowd... they can either buy a WIntel machine or continue using their current Mac for it. Apple's not going to hold its future roadmap hostage over this single issue.
    That's my issue with the modern Apple, around macOS at least. There're so many groups of people they supposedly don't give a rat's ass about that eventually most people that use their Mac for more than a Facebook machine fall into one of those groups. There are a lot of these "single issues" that Apple shouldn't give a crap about according to forum-goers here, but all those single issues add up. I seem to be getting hit by a lot of them right now. They can stomp over their iOS user base and no one cares because iOS devices don't need much compatibility or openness, and the size of the iOS market means it's worth devs keeping up. The same can't be said for macOS. Power users on Macs don't like being shat on by the company they've supported for several decades just because it's convenient for Apple to do so. The machines are getting more and more expensive but with less and less software and hardware features. And as much as I hate to say it, Windows is nowhere near as terrible as it used to be.
    You sure about that last part? https://www.engadget.com/microsofts-windows-10-updates-printer-bugs-000112943.html  ;)
    Yeeeeah that is a pretty stupid one :P At least you can uninstall the update to fix it.
  • Reply 39 of 75
    spheric Posts: 2,560 member
    elijahg said:
    nubus said:
    2)  As the article points out, Apple has successfully implemented a program to design their own chips and SOCs -- and that that is both very difficult and very expensive.   But, once begun it must be continued -- even though it will continue to be both very difficult and very expensive.   The danger lies in the possibility of Apple following other U.S. companies and focusing on stockholder well being over product well being.  If this program is stripped of funding (even a reduction) in order to provide better returns to stockholders, it could take down the whole company -- turn it into another RCA.
    Apple will hit more problems by doing this. Intel stalled due to engineering problems. Microsoft did Vista - and not due to lack of funding. It can happen to anyone - Apple failed on keyboards for 5 years, and on the Mac Pro for 10. Can't blame Microsoft for any of that. This won't make MS, or Apple the next RCA or Xerox, but is it worth taking the risk? If Apple truly cared about the Mac CPU then they would do more updates to their products - the CPU is such a non-issue at this point.
    You make several very good points. The iMacs were essentially abandoned between 2015 and 2019 (the iMac had a spec bump in 2017, and the 2019 one still has no T2 chip), and apart from minor attempts to fix the MacBook keyboards, the MacBooks saw little love between 2016 and 2020. And of course the Mac Pro went 6 years before an update. So given the lack of attention toward Macs, why all of a sudden is there a CPU switch, when the current CPUs are fast enough for 99% of tasks?
    I'm sure I'm not alone among users who can easily redline a 3500€ laptop bought in late 2016 (a couple of instances of Alchemy and an Abbey Road saturator in MainStage can suffice) and force it to eventually throttle down on hot days — not ideal for stage use in crowded clubs, or on festival stages in summer. 

    For us, it's not even strictly about the power itself, but the tradeoffs between heat and power. Oh, and the fact that I usually can't get anywhere *near* ten hours of battery life. 

    I'm hoping that Apple's own designs can strike a better balance between thermals and horsepower. 
  • Reply 40 of 75
    canukstorm Posts: 2,700 member
    elijahg said:
    nubus said:
    2)  As the article points out, Apple has successfully implemented a program to design their own chips and SOCs -- and that that is both very difficult and very expensive.   But, once begun it must be continued -- even though it will continue to be both very difficult and very expensive.   The danger lies in the possibility of Apple following other U.S. companies and focusing on stockholder well being over product well being.  If this program is stripped of funding (even a reduction) in order to provide better returns to stockholders, it could take down the whole company -- turn it into another RCA.
    Apple will hit more problems by doing this. Intel stalled due to engineering problems. Microsoft did Vista - and not due to lack of funding. It can happen to anyone - Apple failed on keyboards for 5 years, and on the Mac Pro for 10. Can't blame Microsoft for any of that. This won't make MS, or Apple the next RCA or Xerox, but is it worth taking the risk? If Apple truly cared about the Mac CPU then they would do more updates to their products - the CPU is such a non-issue at this point.
    You make several very good points. The iMacs were essentially abandoned between 2015 and 2019 (the iMac had a spec bump in 2017, and the 2019 one still has no T2 chip), and apart from minor attempts to fix the MacBook keyboards, the MacBooks saw little love between 2016 and 2020. And of course the Mac Pro went 6 years before an update. So given the lack of attention toward Macs, why all of a sudden is there a CPU switch, when the current CPUs are fast enough for 99% of tasks? I wager a big reason is so Apple can increase their profit margins on Macs, by using the already-developed and paid-for tech in the A-series chips rather than giving Intel $800 per high-end CPU.

    And yes, when Apple hits the same physics wall as Intel, what happens then? Obviously ARM is a much more computationally efficient architecture than x86 so all things being equal, ARM will beat raw x86, but Apple will have to keep pouring billions into their chip R&D to keep ahead of Intel. Right now they can say they're fastest, but if they lose that crown they'll be wasting billions to catch up when they could have just stuck with Intel.
    The supposed benefits of Apple designing its own custom processors for the Mac aren't only about raw processing power. A lot of it has to do with adding features to the SoC that Apple can't get by staying with Intel, and that macOS apps will eventually be able to take advantage of: things like integrated AI/ML (the Neural Engine), security (the Secure Enclave), and cellular. These are the features we know about for now. Who knows what other types of co-processors Apple can eventually integrate onto the SoC.