ARM Mac coming in first half of 2021, says Ming-Chi Kuo


Comments

  • Reply 121 of 149
    mcdave Posts: 1,927 member
    Rev2Liv said:
    mcdave said:
    larryjw said:
    lkrupp said:
    Any ideas on how Apple will handle the X86 code of current apps to run on ARM architecture? I am not educated on this. Is ARM close enough to X86 that the transition will be easy or will it require a Rosetta-like translation framework like the move from Moto 68000 to X86 did. Will we have universal binaries again or something else during the transition?
    Well, both ARM and X86 chips are micro coded. Could Apple microcode the x86 instruction set into the ARM?
    My guess is they already have, hence the lower-than-usual performance bump when they moved to Vortex (virtual cortex).

    Technically yes, but, 

    1. It would be a nightmare to graft the x86 ISA onto an ARM core, since you’d end up with a bloated processor when an emulation layer or dynamic recompiler would suffice.

    2. Xcode 13 is processor-agnostic within the confines of the Apple ecosystem.

    3. You’d still need a PCIe 3.0 chipset controller capable of macOS-grade I/O.

    4. Lack of drivers for 3rd party devices.

    5. No AVX-512, Quick Sync, or Intel-supported SIMD extensions.


    1) I think the idea is you’d build both ISAs on common microcode though it would require a special relationship with Intel and maybe ultimately produce an Apple CPU ISA to complement their GPU & NPU ISAs. Any performance hit could drive native adoption.
    2) Yep.
    3) If AMD can do this, I think Apple can/have.
    4) Not with native x86 support.
    5) Overrated. Apple has custom silicon engines & a GPU which outperform these.
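    Point 5 above is the one place where a port genuinely touches code rather than build settings. A minimal sketch (everything here is invented for illustration, not from the thread) of how SIMD-dependent code is usually kept portable across x86 and ARM:

    ```c
    /* Toy example: code that leans on vendor SIMD intrinsics is one of the
     * few places an x86 -> ARM port needs real work. The usual pattern is
     * a per-architecture path per ISA, plus a portable fallback. */
    #include <stdio.h>

    #if defined(__ARM_NEON)
    #include <arm_neon.h>
    static void add4(const float *a, const float *b, float *out) {
        /* NEON path: load, add, and store four floats at once */
        vst1q_f32(out, vaddq_f32(vld1q_f32(a), vld1q_f32(b)));
    }
    #elif defined(__SSE__)
    #include <xmmintrin.h>
    static void add4(const float *a, const float *b, float *out) {
        /* SSE path: same operation with Intel intrinsics */
        _mm_storeu_ps(out, _mm_add_ps(_mm_loadu_ps(a), _mm_loadu_ps(b)));
    }
    #else
    static void add4(const float *a, const float *b, float *out) {
        /* Portable fallback the compiler can auto-vectorize */
        for (int i = 0; i < 4; i++)
            out[i] = a[i] + b[i];
    }
    #endif

    int main(void) {
        float a[4] = {1, 2, 3, 4}, b[4] = {10, 20, 30, 40}, out[4];
        add4(a, b, out);
        printf("%.0f %.0f %.0f %.0f\n", out[0], out[1], out[2], out[3]);
        return 0;
    }
    ```

    Callers of `add4` never see which path was compiled in, which is why intrinsics-heavy hotspots, not whole apps, are what need per-architecture work.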
  • Reply 122 of 149
    melgross Posts: 33,510 member
    asdasd said:
    melgross said:
    melgross said:
    asdasd said:
    melgross said:
    lkrupp said:
    Any ideas on how Apple will handle the X86 code of current apps to run on ARM architecture? I am not educated on this. Is ARM close enough to X86 that the transition will be easy or will it require a Rosetta-like translation framework like the move from Moto 68000 to X86 did. Will we have universal binaries again or something else during the transition?
    This is the problem I’ve been wondering about for some time. While some people dismiss this as an issue, or in most cases don’t even think about it (aren’t aware it is an issue), it’s the biggest issue Apple will need to deal with. In previous changeovers, even Apple was very lax in getting their own big apps out. It took a year for them. It took a long time for Adobe and Microsoft, with their massive software, to come over too.

    ARM is not close to x86. It’s optimized for battery life over performance. Apple and ARM have made significant advances on that front, but the instruction sets are different enough. We know from previous attempts at emulation that a processor family needs to be about 5 times as powerful to run software at the same speed as the family it’s emulating. This hasn’t changed. Microsoft supposedly does it now, with their “universal” SDK. But they don’t, really. They require software to be rewritten and recompiled for ARM. And there have still been issues with performance, specific features, and bugs.

    I’m not saying it can’t be done, because obviously it can. But if Apple is really going to release a device next year, there will either be significant limitations, or they’ve figured out a way around them. My suggestion, which no one here has ever commented on, from my memory, is to add a dozen x86 instructions to the chip. It’s been found that 80% of the slowdown between chip families comes from about a dozen instructions. The chip, or OS, could hand native x86 software over to those when it needs them. Individual instructions aren’t patented or copyrighted, as far as I know. If true, that would give Apple a way around the problem.
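    To make the emulation-cost point above concrete, here is a toy sketch (the guest "ISA" is entirely made up for illustration): a naive interpreter spends a fetch, a decode branch, and bookkeeping on the host for every single guest instruction, which is roughly where the "several times as powerful" rule of thumb comes from. Dynamic recompilers exist to amortize exactly this overhead.

    ```c
    /* Toy interpreter for an invented guest ISA. Each guest instruction
     * costs a fetch, a decode branch, and operand handling on the host;
     * that per-instruction dispatch is the basic source of emulation
     * overhead. Real emulators translate hot code to native instead. */
    #include <stdio.h>

    enum { OP_LOAD_IMM, OP_ADD, OP_HALT };  /* hypothetical opcodes */

    static long run(const int *code) {
        long regs[4] = {0};
        for (int pc = 0;;) {
            switch (code[pc]) {              /* decode: one branch per op */
            case OP_LOAD_IMM:                /* regs[r] = imm */
                regs[code[pc + 1]] = code[pc + 2];
                pc += 3;
                break;
            case OP_ADD:                     /* regs[a] += regs[b] */
                regs[code[pc + 1]] += regs[code[pc + 2]];
                pc += 3;
                break;
            case OP_HALT:
                return regs[0];              /* result convention: r0 */
            }
        }
    }

    int main(void) {
        /* guest program: r0 = 40; r1 = 2; r0 += r1; halt */
        const int prog[] = {OP_LOAD_IMM, 0, 40,
                            OP_LOAD_IMM, 1, 2,
                            OP_ADD, 0, 1,
                            OP_HALT};
        printf("%ld\n", run(prog));  /* prints 42 */
        return 0;
    }
    ```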
    I can’t emphasise enough that the vast majority of apps produced for the Mac right now will be a recompile and some won’t even need that. The compiler is doing the work if you use the Apple tool chain. 

    Dont confuse the compiled machine code with the higher level frameworks that might be used. 
    You shouldn’t, because it’s a myth. Yes, small apps can be recompiled, and will often work without revision other than fixing the bugs that always creep in when recompiling. But anything else needs to be rewritten. It seems that people here forget the announcements some developers have made during these transition periods in Apple demonstrations. They come out and announce how easy it was to get their massive app up and running in just a weekend. But then it actually takes six months, or more, before that app is released. Why? Because for the demo, they showed a few chosen features that worked, after frenzied work. But the rest needed a good deal of work to function properly.

    It’s incredibly naive to think that this will be easy.
    I laugh when I read you making these insane claims.  If it takes a company 6 months to move only from one CPU to another CPU, where the APIs are identical, and they're not adding features/rewriting their applications in any meaningful way, then they're grossly incompetent.  Consider that when a new CPU/system comes out, they don't want to look bad compared to others, and it's a great time to add new features to show off the new CPU, assuming it gives any added power.  Short of them coding to the lowest-level CPU instructions for multimedia encoding/decoding or the like, assembly language is a major waste of developer time: incredibly few applications will get anything resembling the time/effort/cost back by rewriting anything in assembly language, because compilers more often than not do a better job with optimizations these days than the majority of the best hand-coded assembly language implementations.

    When you talk of developers needing to rewrite their applications because of a new CPU, even a new family/ISA where all that's changed is the CPU architecture? NOBODY REWRITES ANY MAJOR CODE FOR THAT!  That's insanely stupid for many reasons (including economically) and plain WRONG.  No major operating system is unable to be readily rebuilt for other CPUs with only a very tiny amount of low-level assembly, and 99.99% (admittedly, a number that's guessed, but no more than your no-proof data) of user applications in the last 10 years won't have any hand-coded assembly, regardless of their size.

    The lowest-level language you'll find used in that percentage of apps these days is C.  While you can write it in a not-fully-portable manner because the C/C++ standards leave some details to the implementation of the vendors, it's actually very straightforward, even if you do convert between CPU architectures, to make the required changes to have the code work as it did before: it's not rocket science, it's not even interesting computer science.  If they do find a need to fix it to be portable, then they should ensure they never need to fix that again.

    For most applications, it truly is as simple as flipping an Xcode selection, as long as you've written in a reasonably portable way.  It's not hard in Objective-C/C/C++ to do so, and Swift makes it easier.  Switching from 32-bits to 64-bits had the biggest change with Apple changing from regular floats to double floats for CGFloats, but that's not nearly the hardest thing to fix.  You don't know what you're talking about.
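    A hedged illustration of that portability point (the type names here are invented; only `CGFloat` itself is Apple's): what breaks in a CPU move is code relying on implementation-defined widths, and the fix is to state widths explicitly, which is also how the float/double `CGFloat` change was handled.

    ```c
    /* Sketch of the portability argument above (FileOffset and MyFloat
     * are made-up names for illustration). Code that assumes "long is
     * 8 bytes" or "a pointer fits in an int" relies on implementation-
     * defined details; <stdint.h> widths are the portable spelling. */
    #include <stdint.h>
    #include <stdio.h>

    /* Non-portable habit: long is 8 bytes on 64-bit macOS/Linux but
     * 4 bytes on 64-bit Windows. A fixed-width type says what it means
     * on every target. */
    typedef int64_t FileOffset;

    /* Mirrors Apple's CGFloat: float on 32-bit targets, double on
     * 64-bit, selected at compile time rather than assumed in source. */
    #if UINTPTR_MAX > 0xFFFFFFFFu
    typedef double MyFloat;
    #else
    typedef float MyFloat;
    #endif

    int main(void) {
        printf("FileOffset is %zu bytes on every target\n", sizeof(FileOffset));
        printf("MyFloat is %zu bytes on this target\n", sizeof(MyFloat));
        return 0;
    }
    ```

    Source written this way recompiles for a new architecture without edits, which is the case the "flip an Xcode selection" claim is really about.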

    Yes, it took Apple a YEAR to bring Final Cut over. More than half a year to bring Logic Pro over. It took both Adobe and Microsoft a year as well. It took WolframAlpha (the company doing the demo) about 7 months to move over. There are numerous other examples.

    Of course, by your thinking, they are all incompetent.
    You realise this isn’t the same thing, right? In the OS 9 to OS X days, devs had to change the API they were calling (a process called carbonisation) from the old OS 9 code to a subset of that API, which for some devs was difficult because they used some low-level and often dangerous features that were removed.

    In the move to 64-bit, some lower-level C-type structures had to be changed. That’s done now.

    To move to ARM today, if the developers are using the Apple compilers, there should be no change in the API. No change in the API means no work.
    Oh my lord. It’s amazing how dreamy eyed some people are. It’s always work.
  • Reply 123 of 149
    melgross Posts: 33,510 member

    mcdave said:
    melgross said:

    mcdave said:
    larryjw said:
    lkrupp said:
    Any ideas on how Apple will handle the X86 code of current apps to run on ARM architecture? I am not educated on this. Is ARM close enough to X86 that the transition will be easy or will it require a Rosetta-like translation framework like the move from Moto 68000 to X86 did. Will we have universal binaries again or something else during the transition?
    Well, both ARM and X86 chips are micro coded. Could Apple microcode the x86 instruction set into the ARM?
    My guess is they already have, hence the lower-than-usual performance bump when they moved to Vortex (virtual cortex).
    mcdave said:
    ...yet more proprietary lockdown upgrade fatigue...?
    The ARM AArch64 ISA is no more proprietary than x86. What’s your point?
    No. It was less because companies are running out of headroom in chip development. You’ve kept up with what’s happening, haven’t you? You know that Moore’s Law is pretty much kaput. Apple isn’t immune from that.

    x86 is proprietary, as is ARM. You need to read more about all of this instead of just making things up from your imagination.
    Upon reflection, A10-A13 single-core performance gains have been tapering consistently despite node reductions. I’m assuming power takes priority over performance.

    You realise you’re agreeing with my second point.
    melgross said:
    You do realize that you’ve just given in to my points?
  • Reply 124 of 149
    melgross Posts: 33,510 member

    mcdave said:
    melgross said:
    I expect Apple has pretty solid data about what applications Mac users actually use.  I'm sure a decent chunk of the market (such as my wife) use no applications beyond a Web browser, the applications Apple provides, and Word, PowerPoint, and Excel.  And push comes to shove, many people in this category could do without the last 3.

    One approach for Apple would be to target this market with a high-quality, low-price, super portable MacOS laptop (that just happens to have very little third party support initially).  But is there enough of a market for this since the same audience can do just fine with an iPad and external keyboard.
    Office is extremely important on the Mac. It’s so important, that Apple never tried to have iWork rival it. They just gave up. We had hoped Apple would enhance it to become a real rival, which they could have, but they didn’t.

    In fact, in the late 1990s, when Apple found Microsoft had their hand in the QuickTime cookie jar, one of the requirements Apple made of Microsoft was that Microsoft upgrade Office for the Mac for at least 5 years.
    Google Docs proved Office was overkill. It also taught everyone that Pages is adequate & if your output is PDF, the creation tool is irrelevant.
    That’s totally untrue. For an online service, it’s doing well. But Microsoft’s 365 is doing very well.
  • Reply 125 of 149
    melgross said:
    […] Office is extremely important on the Mac. It’s so important, that Apple never tried to have iWork rival it. They just gave up. We had hoped Apple would enhance it to become a real rival, which they could have, but they didn’t.
    They should have.
  • Reply 126 of 149
    tht said:
    wizard69 said:
    hexclock said:
    red oak said:
    An "A15X" type 5 nm chip in a laptop is going to blow the doors off anything offered by Intel

    Batteries in the MBP are 2-2.5x the size of iPad Pros.   Plus the thermal envelope of the MBPs are much greater.    It will allow Apple to dramatically increase the number of cores plus boost the clock frequency.    It is going to be something to behold 

    This is the laptop I want 

    How about a dual processor Mac Mini?
    The future is chiplet technology, putting as many cores as you need in one package. We don’t need a dual-socket Mini; we just need Apple to pay attention to the Mini, because even today the Mini is a joke.

    In any event, if you are not up on the latest tech, read up on chiplet tech and the way AMD is using it to beat Intel senseless. AMD’s high-end Threadripper now comes with up to 64 cores in one package. That is 64 cores and 128 threads. Those chips will not fit into the Mini, but they highlight what chiplet tech can do.

    Frankly, the newest Ryzen mobile chips could do wonders in a Mini, as these are 8-core/16-thread at the top end with a very good GPU, all at 45 watts. No chiplets in Ryzen mobile either, just one well-designed processor chip. AMD has a whole range of desktop chips too, but the point is the Mini could easily move forward if Apple really wanted it to, and that is on x86. On ARM we could probably see even more CPU cores. The interesting thing about ARM is that Apple can tailor it any way Apple sees fit. I would not be surprised at all to see far more AI acceleration cores than regular CPU cores. It is an interesting time in processor design, and CPU cores are now just a portion of the overall chip. Apple could easily put out a 16-core ARM-based processor if they wanted; at this point it comes down to whether they want to allocate the die space.
    I don’t think anyone is going to put two sockets into a Mac mini, nor are they are going to put a lot of cores in it. Apple should have updated it with Core i7-9700 and 9900 processors, 8-core 65W processors at varying frequencies, and dropped the 4-core from the lineup for the holiday quarter, or at least next month, and that’s basically just about what it can do with what 65 W TDP processors the Mac mini is designed for. Who knows, maybe it’ll happen next month. Don’t think Intel is releasing 10 nm desktop chips until the fall.

    AMD’s big advantage over Intel is that they are using a mature TSMC 7 nm fab while Intel is struggling to get their 10 nm to yield. The two are about equivalent in transistor density; it’s just that TSMC’s has actually been working for about 2 years now. They are about a factor of 2x in transistors over Intel’s 14 nm fab, which is now in its 5th year, I think. 6th? That’s a huge advantage that is pretty hard to overcome, if not impossible.

    Apple already does MCM packaging with their SoCs. The RAM and SoC are layered on top of each other in iPhone SoC packaging. If Apple is making macOS/ARM laptops, I’m thinking they will design them like iPad chips and logic boards. They are going to make it as much of a System-in-Package as possible, and RAM could be in-package. It will only be in the high-end machines that require a lot of RAM, discrete GPUs, and/or slots where they’ll break it out of the packaging.

    Apple isn’t in the business of selling server chips, yet I guess, so all the fancy MCM integration techniques like AMD’s chiplets, switched fabrics, in-package IO or Intel’s Foveros/EMIB, really aren’t going to come into play. They aren’t going to be making >500 mm^2 server chips or equivalent with MCM packaging. AMD’s MCM strategy is to not make gigantic chips either, as they don’t have their own fabs, and they need to maximize their production efficiency.

    I don’t think Apple needs to do what AMD is doing. I think they will be fine with multi-million unit runs of 8-core w/x GPU performance and 16-core w/2x GPU performance for laptops and lower end desktops. For the higher end desktops, there needs to be some thought. If they sell like 1 million high end machines a year, a special run with a monolithic die and a discrete GPU could be perfectly reasonable for them.

    So, an extension of what they do between iPhone SoCs and high end iPad SoCs. They’ll add CPU core complexes, GPU cores, memory channels, neural engine cores, other IO as required for the machine it is going into. The die sizes aren’t going to be bigger than 250 mm^2, maybe 300 mm^2.


    edited February 2020
  • Reply 127 of 149
    tht said:
    […] Who knows, maybe it’ll happen next month. Don’t think Intel is releasing 10 nm desktop chips until the fall.
    According to some rumors, Intel is skipping 10 nm for the desktop and going straight to 7 nm.
  • Reply 128 of 149
    mcdave Posts: 1,927 member
    melgross said:
    mcdave said:
    […] You realise you’re agreeing with my second point.
    melgross said:
    You do realize that you’ve just given in to my points?
    I wasn’t trying to be contrary in the first place. I think you agreed with my second point first.
  • Reply 129 of 149
    tht Posts: 5,443 member
    melgross said:
    […] Office is extremely important on the Mac. It’s so important, that Apple never tried to have iWork rival it. They just gave up. We had hoped Apple would enhance it to become a real rival, which they could have, but they didn’t.
    They should have.
    Once Google released Docs, making it free for consumers (and whatever it costs for a business license for their suite of office automation stuff), there was no room for iWork, or any other office suite besides MS Office, which is the “standard” for most large businesses and schools, to be anything other than a free suite of apps for Apple customers. Keynote is the best presentation software, though.

    I don’t think there was a real window to begin with. Petabytes, or whatever absurdly high amount, of files are in MS Office formats. That legacy drives Office usage in perpetuity. Not sure how people get off that train other than MS going out of business for other reasons. Making iWork popular would be a pretty difficult if not impossible road to take. If Google decides to stop doing GSuite stuff, that would open a window too, but it isn’t going to happen unless something bad happens to Google.

    Apple must get MS to port Office to macOS/ARM. Office is the standard office automation tool for large businesses. They won’t be selling ARM Macs to these places without it because all those old files are in Office file formats.

    It’s the same story for Adobe. An absurdly high amount of storage is in Adobe app formats. Compatibility is hugely important, and it is imperative for them to get Adobe to port as fast as possible.
  • Reply 130 of 149
    asdasd Posts: 5,686 member
    melgross said:
    asdasd said:
    melgross said:
    melgross said:
    asdasd said:
    melgross said:
    lkrupp said:
    Any ideas on how Apple will handle the X86 code of current apps to run on ARM architecture? I am not educated on this. Is ARM close enough to X86 that the transition will be easy or will it require a Rosetta-like translation framework like the move from Moto 68000 to X86 did. Will we have universal binaries again or something else during the transition?
    This is the problem I’ve been wondering about for some time. While some people dismiss this as an issue, or in most cases, don’t even think about it (aren’t aware it is an issue), it’s the biggest issue apple will need to deal with. In previous changeovers, even Apple was very lax in getting their own big apps out. It took a year for them. It took a long time for Adobe and Microsoft, with their massive software, to come over too.

    ARM is not close to x86. It’s optimized for battery life over performance. Apple and ARM have made significant advances on that front, but the instruction sets are different enough. We know from previous attempts at emulation, that a processor family needs to be 5 times as powerful in order to be able to run software at the same speed as the family they’re emulating. This hasn’t changed. Microsoft supposedly does it now, with their “universal” sdk. But they don’t, really. They require software to be rewritten, and recompiled for ARM. And there have still been issues with performance, specific features and bugs.

    im not saying it can’t be done, because obviously it can. But if Apple is really going to release a device next year, there will either be significant limitations, or they’ve figured out a way around them. My suggestion, which no one here has ever commented on, from my memory, is to add a dozen x86 instructions to the chip. It’s been found that 80% of the slowdown between chip families is from about a dozen instructions. The chip, or OS, could hand that over to those when native x86 software needs them. Individual instructions aren’t patented, or copyrighted, as far as I know. If true, that would give Apple a way around the problem.
    I can’t emphasise enough that the vast majority of apps produced for the Mac right now will be a recompile and some won’t even need that. The compiler is doing the work if you use the Apple tool chain. 

    Dont confuse the compiled machine code with the higher level frameworks that might be used. 
    You shouldn’t, because it’s a myth. Yes, small apps can be recompiled, and will often work without more revision other than to fix bugs that always creep in when recompiling. But anything else needs to be rewritten.  It seems that people here forget the announcements that some developers have made during these transition periods in Apple demonstrations. They come out and announce how easy it was to get their massive app up and running in just a weekend. But then, it actually takes six months, or more, before that app is released. Why, because for the demo, they showed a few chosen features that worked out, after frenzied work. But the rest needed a good deal of work to function properly.

    its incrediably naive to think that this will be easy.
    I laugh when I read you making these insane claims.  If it takes a company 6 months to move only from one CPU to another CPU, where the APIs are identical, and they're not adding features/rewriting their applications in any meaningful way, then they're grossly incompetent.  Consider that when a new CPU/system comes out, they don't want to look bad compared to others, and it's a great time to add new features to show off the new CPU, assuming it gives any added power.  Short of them coding to the lowest-level CPU instructions for multimedia encoding/decoding or the like, assembly language is a major waste of developer time: incredibly few applications will get anything resembling the time/effort/cost back by rewriting anything in assembly language, because compilers more often than not do a better job with optimizations these days than the majority of the best hand-coded assembly language implementations.

    When you talk of developers needing to rewrite their applications because of a new CPU, even a new family/ISA where all that's changed is the CPU architecture? NOBODY REWRITES ANY MAJOR CODE FOR THAT!  That's insanely stupid for many reasons (including economically) and plain WRONG.  No major operating system is unable to be readily  rebuilt for other CPUs with only a very tiny amount of low-level assembly, and 99.99%  (admittedly, a number that's guessed, but no more than your no-proof data) of user applications in the last 10 years won't have any hand-coded assembly, regardless of their size. 

    The lowest-level language you'll find used in that percentage of apps these days is C.  While you can write it in a not-fully-portable manner because the C/C++ standards leave some details to the implementation of the vendors, it's actually very straightforward, even if you do convert between CPU architectures, to make the required changes to have the code work as it did before: it's not rocket science, it's not even interesting computer science.  If they do find a need to fix it to be portable, then they should ensure they never need to fix that again.

    For most applications, it truly is as simple as flipping an Xcode selection, as long as you've written in a reasonably portable way.  It's not hard in Objective-C/C/C++ to do so, and Swift makes it easier.  Switching from 32-bits to 64-bits had the biggest change with Apple changing from regular floats to double floats for CGFloats, but that's not nearly the hardest thing to fix.  You don't know what you're talking about.

Yes, it took Apple a YEAR to bring Final Cut over, and more than half a year to bring Logic Pro over. It took both Adobe and Microsoft a year as well. It took Wolfram Research (the company doing the demo) about 7 months to move over. There are numerous other examples.

Of course, by your thinking, they are all incompetent.
You realise this isn’t the same thing, right? In the OS 9 to OS X days devs had to change the API they were calling (a process called carbonisation) from the old OS 9 code to a subset of that API, which for some devs was difficult because they used some low-level and often dangerous features that were removed.

In the move to 64-bit, some lower-level C-type structures had to be changed. That’s done now.

To move to ARM today, if the developers are using the Apple compilers, there should be no change in the API. No change in the API means no work.
Oh my lord. It’s amazing how dreamy-eyed some people are. It’s always work.
Well yes. There’s a switch in Xcode, so there’s that work for apps not already uploaded as bitcode.

This thread shows a major problem with the world: opinions presented as facts, and the idea that every opinion is equally worthwhile.

Please show your working. What exact API do you expect developers will have to change when building in Xcode? I’m saying the compiler will do the work, or Apple will have failed.

    So what APIs won’t port? 
    edited February 2020
  • Reply 131 of 149
    tht said:
    melgross said:
    I expect Apple has pretty solid data about what applications Mac users actually use.  I'm sure a decent chunk of the market (such as my wife) use no applications beyond a Web browser, the applications Apple provides, and Word, PowerPoint, and Excel.  And push comes to shove, many people in this category could do without the last 3.

    One approach for Apple would be to target this market with a high-quality, low-price, super portable MacOS laptop (that just happens to have very little third party support initially).  But is there enough of a market for this since the same audience can do just fine with an iPad and external keyboard.
    Office is extremely important on the Mac. It’s so important, that Apple never tried to have iWork rival it. They just gave up. We had hoped Apple would enhance it to become a real rival, which they could have, but they didn’t.

    in fact, in the late 1990’s, when Apple found Microsoft had their hand in the QuickTime cookie jar, one of the requirements Apple made to Microsoft was that Microsoft upgrade Office for the Mac for at least 5 years.
    They should have.
Once Google released Docs, free for consumers and licensed for businesses, there was no room for iWork, or any office suite other than MS Office (the “standard” for most large businesses and schools), to be anything other than a free suite of apps for Apple customers. Keynote is the best presentation software, though.

I don’t think there was a real window to begin with. Petabytes, or whatever absurdly high amount, of files are in MS Office formats. That legacy drives Office usage in perpetuity. I’m not sure how people get off that train other than for MS to go out of business for other reasons. Making iWork popular would be a pretty difficult, if not impossible, road to take. If Google decided to stop doing G Suite stuff, that would open a window too, but it isn’t going to happen unless something bad happens to Google.

    Apple must get MS to port Office to macOS/ARM. Office is the standard office automation tool for large businesses. They won’t be selling ARM Macs to these places without it because all those old files are in Office file formats.

    It’s the same story for Adobe. An absurdly high amount of storage is in Adobe app formats. Compatibility is hugely important, and it is imperative for them to get Adobe to port as fast as possible.
Microsoft has always made money on Office for Mac, so no persuasion is required beyond Microsoft looking out for Microsoft.

”Porting” will be as easy as clicking a checkbox to also build for ARM (Apple will likely require fat binaries anyway), building it, running a BVT, and shipping it. Trivial. Microsoft has experience making code cross-CPU portable by default, so unless Apple does something stupid by changing API behaviors from Intel macOS, this is less involved than investigating a crash bug in Word, if they get one.
  • Reply 132 of 149
melgross Posts: 33,510member
    asdasd said:
    melgross said:
    asdasd said:
    melgross said:
    melgross said:
    asdasd said:
    melgross said:
    lkrupp said:
    Any ideas on how Apple will handle the X86 code of current apps to run on ARM architecture? I am not educated on this. Is ARM close enough to X86 that the transition will be easy or will it require a Rosetta-like translation framework like the move from Moto 68000 to X86 did. Will we have universal binaries again or something else during the transition?
    This is the problem I’ve been wondering about for some time. While some people dismiss this as an issue, or in most cases, don’t even think about it (aren’t aware it is an issue), it’s the biggest issue apple will need to deal with. In previous changeovers, even Apple was very lax in getting their own big apps out. It took a year for them. It took a long time for Adobe and Microsoft, with their massive software, to come over too.

    ARM is not close to x86. It’s optimized for battery life over performance. Apple and ARM have made significant advances on that front, but the instruction sets are different enough. We know from previous attempts at emulation, that a processor family needs to be 5 times as powerful in order to be able to run software at the same speed as the family they’re emulating. This hasn’t changed. Microsoft supposedly does it now, with their “universal” sdk. But they don’t, really. They require software to be rewritten, and recompiled for ARM. And there have still been issues with performance, specific features and bugs.

I’m not saying it can’t be done, because obviously it can. But if Apple is really going to release a device next year, there will either be significant limitations, or they’ve figured out a way around them. My suggestion, which as far as I remember no one here has commented on, is to add a dozen x86 instructions to the chip. It’s been found that 80% of the slowdown between chip families comes from about a dozen instructions. The chip, or OS, could hand execution over to those when native x86 software needs them. Individual instructions aren’t patented, or copyrighted, as far as I know. If true, that would give Apple a way around the problem.
    I can’t emphasise enough that the vast majority of apps produced for the Mac right now will be a recompile and some won’t even need that. The compiler is doing the work if you use the Apple tool chain. 

    Dont confuse the compiled machine code with the higher level frameworks that might be used. 
    You shouldn’t, because it’s a myth. Yes, small apps can be recompiled, and will often work without more revision other than to fix bugs that always creep in when recompiling. But anything else needs to be rewritten.  It seems that people here forget the announcements that some developers have made during these transition periods in Apple demonstrations. They come out and announce how easy it was to get their massive app up and running in just a weekend. But then, it actually takes six months, or more, before that app is released. Why, because for the demo, they showed a few chosen features that worked out, after frenzied work. But the rest needed a good deal of work to function properly.

It’s incredibly naive to think that this will be easy.
    Well yes. There’s a switch in Xcode so theres that work for apps not already uploaded as bitcode. 

    This thread shows a major problem with the world; opinions presented as facts.  The idea that every opinion is equally worthwhile. 

    Please show your workings out. What exact api do you expect developers will have to change when building in Xcode? I’m saying the compiler will do the work, or Apple will have failed. 

    So what APIs won’t port? 
It’s not APIs so much as basic chip instructions. ARM lacks a number of x86 instructions that will need to be emulated in software, just as we found in past changeovers. This is nothing new.
  • Reply 133 of 149
tht Posts: 5,443member
    Microsoft has always made money on Office for Mac, so no persuasion required beyond Microsoft looking out for Microsoft.

    ”Porting” will be as easy as clicking a checkbox to also build for ARM (Apple will likely require fat binaries anyway) building it, run a BVT and shipping it. Trivial.  Microsoft has experience making code cross-CPU portable by default so unless Apple does something stupid by changing behaviors of APIs from Intel MacOS this is less involved than investigating a crash bug in Word if they get one.
It’s an Xcode target switch for apps that use Apple’s frameworks and platform-neutral code that doesn’t directly deal with hardware. For Microsoft, Adobe, et al., it’s not that trivial. Their app code bases are built on their own custom libraries or frameworks, with some code that is decades old. These app architectures inevitably carry ISA-level optimizations and assumptions that mean significant tweaking of the code base is necessary when moving to another ISA. It’s very far from an application rewrite, and it will be easier to move Office for macOS from x86 to ARM, but it’s work. These are also large apps with very large testing and quality-assurance processes. It will take them many months to a year.

If it were trivial, CPU architecture changes would be much more common: full Office would already be available on Windows/ARM machines, for example. MS isn’t doing binary translation and emulation of parts of the ARM version of Office because that’s what they want; they do it that way because the Office platform code base has a lot of x86 assumptions in it, and they don’t want to do the work either to make the code fully portable or to optimize it for ARM. This is software entirely in MS’s control. They control the whole stack, yet still don’t provide full Office on Windows/ARM. It will be another layer removed for macOS/ARM and Office. Either that, or we live for a while with the neutered, more portable versions of Office, like UWP Office or the backend of iOS Office.

The only way Apple gets ahead of this is to give them alphas and betas of macOS/ARM months before the public announcement. They really should be handing out modified, unlocked Apple TV 4K pucks with USB-C, with at least the A10X, like candy. Then they need actual commitment from MS and Adobe, which waxes and wanes for whatever reason.
  • Reply 134 of 149
asdasd Posts: 5,686member
    melgross said:
It’s not APIs so much as basic chip instructions. ARM lacks a number of x86 instructions that will need to be emulated in software, just as we found in past changeovers. This is nothing new.
Why would an application compiled to ARM need x86 instructions? And why do you think higher-level developers would need to care, anyway?
  • Reply 135 of 149
asdasd Posts: 5,686member

    tht said:
    It’s a an Xcode target switch for apps that uses Apple’s frameworks, and platform neutral code that don’t directly deal with hardware. For Microsoft, Adobe, et al, it’s not that trivial. Their app code bases are based on their own custom libraries or frameworks, with some code that is decades old. These types of app architectures inevitably have ISA level optimizations and assumptions that mean some significant tweaking of the code-base is necessary when moving to another ISA. It’s very far away from an application re-write, and it will be easier to move Office for macOS from x86 to ARM, but it’s work. Then, these are large apps with very large testing and quality assurance processes. It will take them lots of months to a year. 

    If it was trivial, CPU architecture changes would be much more common. Like, full Office being available on Windows/ARM machines. MS isn’t doing binary translation and emulation of various parts of the Office/ARM version because that’s really what they want. They do it that way because the Office platform code base has a lot of x86 assumptions in them, and they don’t want to do the work to either make the code fully portable or to optimize their code for ARM. This is software in MS’s entire control. They control the entire stack, yet, still don’t provide full Office on Windows/ARM. It will be another layer removed for macOS/ARM and Office. Either that, or we live with the neutered versions of Office that is much more portable, like UWP Office or the backend of iOS Office for awhile.

    The only way Apple gets ahead of this is for them to give them alphas and betas of macOS/ARM months ahead of the public announcement for macOS/ARM. They really should be handing out modified, unlocked Apple TV 4K pucks with USBC, with at least the A10X, like candy. Then, they need actual commitment from MS and Adobe, which waxes and wanes for whatever reason.
Your claim about Microsoft is almost certainly unfounded. If they had decades-old code it would have failed already on the transition to OS X and then to 64-bit. And if it were emulating x86, how would that have even worked on PowerPC?

Wikipedia says it’s a mixture of an Objective-C front end and a C++ back end (meaning internal libraries), which is eminently portable.


  • Reply 136 of 149
    tht said:
    tht said:
    melgross said:
    I expect Apple has pretty solid data about what applications Mac users actually use.  I'm sure a decent chunk of the market (such as my wife) use no applications beyond a Web browser, the applications Apple provides, and Word, PowerPoint, and Excel.  And push comes to shove, many people in this category could do without the last 3.

    One approach for Apple would be to target this market with a high-quality, low-price, super portable MacOS laptop (that just happens to have very little third party support initially).  But is there enough of a market for this since the same audience can do just fine with an iPad and external keyboard.
    Office is extremely important on the Mac. It’s so important, that Apple never tried to have iWork rival it. They just gave up. We had hoped Apple would enhance it to become a real rival, which they could have, but they didn’t.

    in fact, in the late 1990’s, when Apple found Microsoft had their hand in the QuickTime cookie jar, one of the requirements Apple made to Microsoft was that Microsoft upgrade Office for the Mac for at least 5 years.
    They should have.
    Once Google released Docs, making it free for consumers, and whatever it costs for a business license for their suite of office automation stuff, the room for iWork, or any other office suite other than MS Office which is the “standard” for most large businesses and schools, there was no room for iWork to be anything other than a free office suite of apps for Apple customers. Keynote is the best presentation software though.

    I don’t think there was a real window to begin with. Petabytes, whatever absurdly high amount, of files is in MS Office formats. That legacy drives Office usage in perpetuity. Not sure how people get off that train other than for MS to go out of business for other reasons. Making iWork popular would be a pretty difficult if not impossible road to take. If Google decides to stop doing GSuite stuff, that would open a window too, but it isn’t going to happen unless something bad happens to Google.

    Apple must get MS to port Office to macOS/ARM. Office is the standard office automation tool for large businesses. They won’t be selling ARM Macs to these places without it because all those old files are in Office file formats.

    It’s the same story for Adobe. An absurdly high amount of storage is in Adobe app formats. Compatibility is hugely important, and it is imperative for them to get Adobe to port as fast as possible.
    Microsoft has always made money on Office for Mac, so no persuasion required beyond Microsoft looking out for Microsoft.

    ”Porting” will be as easy as clicking a checkbox to also build for ARM (Apple will likely require fat binaries anyway), building it, running a BVT (build verification test), and shipping it. Trivial.  Microsoft has experience making code cross-CPU portable by default, so unless Apple does something stupid by changing the behavior of APIs from Intel macOS, this is less involved than investigating a crash bug in Word, if they get one.
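To illustrate the fat-binary idea being described (a hypothetical sketch, not anything from Apple’s actual toolchain or Office): the same source is compiled once per architecture slice, and predefined compiler macros tell each slice what it was built for. The `isa_name` function here is an invented example.

```c
#include <string.h>

/* Returns the ISA this translation unit was compiled for.
   A universal ("fat") binary carries one compiled copy of this code
   per slice, so the answer depends on which slice the loader runs. */
const char *isa_name(void) {
#if defined(__x86_64__)
    return "x86_64";
#elif defined(__aarch64__) || defined(__arm64__)
    return "arm64";
#else
    return "other";
#endif
}
```

Xcode’s equivalent of the “checkbox” is adding `arm64` to the architectures build setting; `lipo` then stitches the per-architecture slices into a single Mach-O file.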
    It’s an Xcode target switch for apps that use Apple’s frameworks and platform-neutral code that doesn’t directly deal with hardware. For Microsoft, Adobe, et al, it’s not that trivial. Their app code bases are built on their own custom libraries or frameworks, with some code that is decades old. These types of app architectures inevitably have ISA-level optimizations and assumptions, which means some significant tweaking of the code base is necessary when moving to another ISA. It’s very far from an application rewrite, and moving Office for macOS from x86 to ARM will be easier than that, but it’s work. Then, these are large apps with very large testing and quality assurance processes. It will take them many months to a year. 

    If it were trivial, CPU architecture changes would be much more common, and full Office would already be available on Windows/ARM machines. MS isn’t doing binary translation and emulation of various parts of the Windows/ARM version of Office because that’s really what they want. They do it that way because the Office platform code base has a lot of x86 assumptions in it, and they don’t want to do the work either to make the code fully portable or to optimize it for ARM. This is software entirely in MS’s control. They control the whole stack, yet still don’t provide full Office on Windows/ARM. It will be another layer removed for macOS/ARM and Office. Either that, or we live for a while with the neutered, much more portable versions of Office, like UWP Office or the backend of iOS Office.
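A minimal sketch of the kind of x86 assumption being discussed (hypothetical code, not anything from Office): a hand-tuned SSE path that simply doesn’t compile on ARM, forcing the porting team to either rewrite it with NEON intrinsics or fall back to portable C.

```c
#include <stddef.h>

#if defined(__SSE__)
#include <xmmintrin.h>
/* x86-only fast path: horizontal sum of 4 floats with SSE intrinsics. */
float sum4(const float *v) {
    __m128 x = _mm_loadu_ps(v);              /* load [a, b, c, d] */
    x = _mm_add_ps(x, _mm_movehl_ps(x, x));  /* [a+c, b+d, ...] */
    x = _mm_add_ss(x, _mm_shuffle_ps(x, x, 1)); /* (a+c)+(b+d) in lane 0 */
    return _mm_cvtss_f32(x);
}
#else
/* Portable fallback the compiler can auto-vectorize on arm64. */
float sum4(const float *v) {
    float s = 0.0f;
    for (size_t i = 0; i < 4; i++) s += v[i];
    return s;
}
#endif
```

A code base with thousands of blocks like the first one is why an ISA move is real work rather than a recompile, even though no single block is hard.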

    The only way Apple gets ahead of this is to give these developers alphas and betas of macOS/ARM months ahead of the public announcement of macOS/ARM. They really should be handing out modified, unlocked Apple TV 4K pucks with USB-C, with at least the A10X, like candy. Then, they need actual commitment from MS and Adobe, which waxes and wanes for whatever reason.
    I won’t speak for Adobe, but I personally know someone on the Mac Office team; he’s a coworker.  There’s nothing ISA-specific in Microsoft Office: there’s nothing that hand-written, ISA-specific assembly could be used to optimize where the gains would be measurable.  The majority of Office execution time is spent waiting on incredibly slow humans, and nothing is real-time enough to worry about it.

    Word, as an example, was fast enough on computers 20 years ago that it would autocorrect faster than you could type on the hardware of the day, and Apple hasn’t sold an ARM ISA chip that slow in the last decade, so I can safely say you don’t know what you’re talking about.
  • Reply 137 of 149
    thttht Posts: 5,443member
    I won’t speak for Adobe, but I personally know of someone in the Mac Office team, and he’s a coworker.  There’s nothing that is ISA-specific in Microsoft Office, as there’s absolutely nothing that ISA-specific instructions could be used to optimize anything using hand-written assembly where the gains are measurable.  The majority of the time for Office execution is spent waiting on incredibly slow humans, and nothing is real-time enough to worry about it.

    Word as an example was fast enough on computers 20 years ago that it’d do autocorrection faster than you could type on the hardware from then, and Apple hasn’t sold an ARM ISA chip that slow in the last decade, so I can safely say you don’t know what you’re talking about.
    Can you ask him why full Office is not available for Windows/ARM?
  • Reply 138 of 149
    mattinozmattinoz Posts: 2,316member
    I know Office is an important app to many people, but it already has an iPad app, so it already runs on Apple ARM. It seems odd to concentrate discussion on it, given we know it works on whatever Apple is doing.

    The interesting things are the productivity apps that people lease each year for many thousands of dollars. If Apple scares those markets away, they will have issues.
  • Reply 139 of 149
    mattinoz said:
    I know Office is an important app to many people, but it already has an iPad app, so it already runs on Apple ARM. It seems odd to concentrate discussion on it, given we know it works on whatever Apple is doing.

    The interesting things are the productivity apps that people lease each year for many thousands of dollars. If Apple scares those markets away, they will have issues.
    The iPad version of Office pales in comparison, functionality wise, to the Mac version of Office.
  • Reply 140 of 149
    ...yet more proprietary lockdown upgrade fatigue...?
    What’s the matter? Upset Apple didn’t deign to go tacky with RGB all over the place?