Leaked plan shows Intel will try to be more efficient than M1 Max by late 2023


Comments

  • Reply 21 of 37
    Hey Intel, why don't you catch up with AMD first?
  • Reply 22 of 37
    Intel is a dysfunctional company.  
  • Reply 23 of 37
    MplsP Posts: 4,038 member
    stevegee said:
    man, i can feel the butthurt at Intel from here…and despite all their efforts i don’t expect great things from them. 
    Yup. It’s like the comment their CEO made a year or so ago about creating best-in-class processors, which essentially implied they weren’t really trying before. I love how Intel was perfectly happy to create horribly inefficient processors until Apple came along and showed them what was possible. Now suddenly they’re claiming they’ll be better… in 3 years. 
  • Reply 24 of 37
    Alder Lake already beats M1 Max at many tasks.
    On what tasks? Alder Lake does OK on a lot of benchmarks (if you ignore power consumption). But on real-world tasks the M1 is usually significantly faster, not because of math performance, but because of fundamentals like memory latency and bandwidth.

    The power thing is critical - Apple is holding back by making sure the CPU/GPU rarely draw a significant amount of power. On benchmarks like Cinebench, where Alder Lake just barely beats the M1 Max, the M1 Max is only drawing about 11 watts according to AnandTech. Which means it could run the benchmark for about *9 hours* on battery. I'm not sure what an Alder Lake CPU draws, but I wouldn't be surprised if it's 10x more, maybe even starting at 20x more before thermal throttling kicks in.
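
    As a rough sanity check on that 9-hour figure, here is a back-of-the-envelope sketch; both numbers are assumptions (roughly a 100 Wh battery in the 16-inch MacBook Pro, and the ~11 W package figure cited above), not measurements:

    ```python
    # Back-of-the-envelope only; both inputs are assumptions, not measurements.
    battery_wh = 100       # approx. capacity of a 16-inch MacBook Pro battery
    package_watts = 11     # M1 Max package power in that Cinebench run, per the figure above

    hours = battery_wh / package_watts
    print(f"~{hours:.1f} hours of sustained load")  # ~9.1 hours, ignoring display and other draw
    ```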

    The thing people defending Intel forget is that Apple is perfectly capable of selling a Mac that lasts 2 hours under heavy load. They've been selling those Macs for years with Intel chips. Apple has simply decided to stop doing that, so the comparisons aren't really fair.

    My prediction is when Apple starts shipping proper desktops on Apple Silicon, ones with a 980W power supply, then we'll really find out what Apple's chipset design team can do. And I don't think it's going to be pretty for Intel.
  • Reply 25 of 37
    wood1208 said:
    Intel is gunning at the wrong target, because Apple is and will be out of Intel's range, far ahead. What Intel needs to fear is losing market share to AMD and NVIDIA rather than Apple. Moreover, Apple is in its own league, with huge financial and tech resources to stay ahead of Intel or others in the chip area.
    Apple is showing the way forward.  Competing with Apple will naturally provide an advantage over AMD, etc.  That is, assuming AMD and others don't also follow Apple's direction.

    So, in other words, Intel is admitting they're an easy 2 years behind Apple in chip development. Because, you know, Apple's 2023 M-series chips will be... more powerful and efficient than the current crop...

    Where was Intel's motivation when Apple was a dedicated customer all of those years? How long ago did the rumors of Apple's plans to make computer SoCs start flying? Why didn't Intel get into gear then? They had plenty of time to try to retain Apple as a customer, but kept dragging their feet. This is starting to sound a little like the BlackBerry story, or was it called the Blueberry? Those are such a distant memory now. 
    Yeah, there are things Intel can do to help narrow the gap.  They can work with TSMC to get on their best process as well.  The problem is, we don't know that Intel is committed to the SoC path that relies on shared / unified memory, etc.  That would be a drastic departure for Intel and their existing customers' expectations.  Likewise, I really doubt Intel is committed to going all in on such solutions.

    To your point, yeah, where was Intel when Apple was their customer?  They were complacent and stagnant.  They desperately needed a swift kick in the rear to wake up and try to be competitive once again.

    My take: Arrow Lake-P seems to be designed to match the performance per watt of Apple’s M-series chips, while Arrow Lake-H seems to be designed to leave Apple in the dust. But only time will tell. The beneficiary of this competition will be us, the consumers. 
    Anyone can make plans and roadmaps.  Successfully executing on those plans is another thing, and Intel has not displayed much competence in their execution in recent years.  They've fallen behind with their manufacturing process.  Their only hope is to partner with TSMC, as Apple has, to get on their advanced nodes.  That alone won't be enough, though.  The x86/x64 ISA will never be as efficient as more modern RISC-based designs.  Yes, Intel decodes to micro-ops now and leverages RISC techniques, but their decoder will always be handicapped compared to ARM and others due to variable-length CISC instructions.  Finally, unless Intel commits to fully leveraging shared / unified memory, they will always lag in terms of efficiency.  Right now, Intel is NOT competitive with Apple.  Sure, they've increased their single-core performance to be ahead of M1, but at the cost of extreme power and heat.  No Intel-based laptop is competitive with an Apple MacBook Pro at this point.  They can match performance when plugged in, but when you go unplugged, performance drops by about half.  As for the desktop, Apple hasn't played its Mac Pro desktop performance card yet.  Expect dual and quad M1 Max configurations, which will put Intel at a disadvantage for desktops as well.
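
    To illustrate the decoder point with a toy sketch (made-up instruction lengths, not real x86 or ARM encodings): with fixed-width instructions every boundary is known up front, while with variable-length instructions each boundary depends on sizing the instruction before it, which is part of what makes a very wide x86 front end expensive.

    ```python
    # Toy illustration only: made-up instruction lengths, not real encodings.

    def fixed_boundaries(n, width=4):
        """Fixed-width ISA (e.g. 4-byte instructions): all start offsets known immediately."""
        return [i * width for i in range(n)]

    def variable_boundaries(lengths):
        """Variable-length ISA: each start offset is known only after sizing the previous one."""
        offsets, pos = [], 0
        for length in lengths:          # inherently sequential walk
            offsets.append(pos)
            pos += length
        return offsets

    print(fixed_boundaries(4))                 # [0, 4, 8, 12]
    print(variable_boundaries([1, 3, 6, 15]))  # [0, 1, 4, 10]
    ```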
  • Reply 26 of 37
    It’s early enough in the M series that Intel can compete. 

    Though benchmarks artificially inflate the performance of Intel's Alder Lake, the performance lead is still there. But the efficiency lead definitely is not. 

    Apple has taught the industry, AMD included, how to architect a chip that has the best of both worlds in terms of performance and efficiency. 

    Intel has done what they can with that on the x86 architecture. There aren't a ton of directions to go from Alder Lake. 

    Meanwhile, Apple has grown the M series into a monster in the very first generation. 

    Second generation isn’t even here yet. 

    Something Intel may not be considering is that Apple did not close the books when the M1 Max launched. That was just getting started in the performance arena. Meanwhile, Intel is redlining with molten Alder Lake.

    And Apple already has third-generation M-series foundations being formulated this year. Intel can come out with whatever they want; they’ll be beaten. And who’s to say Intel can keep up with its own dreams? That’s not been a staple of their brand. 

    Beyond that, let’s be real: the most impressive thing about Alder Lake systems is not even Intel’s CPU; it’s Nvidia's GPU. And Apple is fighting a war on two fronts, CPU and GPU. It’s winning the CPU war for the most part. It won’t be too long before the GPU war swings Apple’s way too.  

    The other thing to consider is that Apple is free to radically change its architectures at the flip of a coin and still have everything run smoothly. Nobody else can do that. 

    But they won’t have to. They’ve started on the right foundation and the sky is the limit. Actually, there are no limits. 

    Intel wants the cachet of being the top silicon provider again. But those days are ending. Looking to the past, it would have been unthinkable - but it’s Apple’s era now. Not only in software and industrial design, but in the silicon that makes everything tick. 

    It’s funny how we’ve become used to Apple being successful now. But putting ourselves back 5-10 years, this would seem like pie in the sky. And now it’s reality. 

    Wow. 
  • Reply 27 of 37
    Don't count Intel out yet. Yes, they have been a complete mess for most of the past decade, but they are making interesting noises now. They have new management and they have the funding to make it happen. Unlike Apple/NVIDIA/Microsoft/etc., Intel has its own chip fabrication plants. That means that if Intel is able to produce even partially competitive CPUs/GPUs, it will be like pouring water onto parched land. Apple should be investing a hundred billion dollars in custom fabrication unless it wants to vanish should a regional conflict disrupt critical chip production it does not control.
  • Reply 28 of 37
    pascal007 said:
    It took Apple's departure to wake Intel up. All the years before, Intel lazily stood still.
    True. 

    Reminiscent of the old battle between 3DFX and Nvidia. 

    3DFX was on top, with Nvidia nipping at their heels. 3DFX was usually less efficient; Nvidia was usually the more efficient of the two, but only slightly behind in performance. 3DFX got comfortable, and one day Nvidia eclipsed them. 

    What did 3DFX do? They pulled an Alder Lake: they mashed a bunch of cores onto a graphics board and required multiple power supplies. This allowed them to recapture the performance lead once again, but at the cost of a messy, ridiculously power-hungry setup. 

    Then Nvidia once again took the crown and 3DFX had to find another way - two graphics boards with multiple cores and operating in SLI mode - an inefficient way to combine the power of multiple graphics boards. 

    But that was something the competition could do too. And with Nvidia deciding they could use their superior tech and also ape 3DFX’s brute-force approach, 3DFX was eventually curb-stomped (and embarrassingly acquired by their competitor) and relegated to the dustbin of history. 

    Those who have not learned from history are destined to repeat it. Fortunately, Intel seems to be trying to learn, but this playbook between Intel and Apple sure looks very similar to the way things went with 3DFX and Nvidia. 
  • Reply 29 of 37
    Beyond that, let’s be real: the most impressive thing about Alder Lake systems is not even Intel’s CPU; it’s Nvidia's GPU. And Apple is fighting a war on two fronts, CPU and GPU. It’s winning the CPU war for the most part. It won’t be too long before the GPU war swings Apple’s way too.  
    The two are not directly related.  Nvidia just happens to make the best video cards suitable for Intel-based PCs.  There also isn't much of a battle with the CPU.  CPUs are inherently scalable in performance: if either Apple or Intel wants to scale up performance, they can do so by adding more cores.  No new technology is really needed for that.  The big difference is that Apple is using shared / unified memory, so it doesn't have to waste energy or performance by sending data from CPU memory to GPU memory and back again. 
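
    As a rough, illustration-only sketch of the copy overhead unified memory avoids (the buffer size and link bandwidth below are assumptions, not measurements of any real system):

    ```python
    # Assumed numbers: a 1 GiB buffer shuttled over ~25 GiB/s of effective
    # host<->device bandwidth on a discrete-GPU system.
    buffer_bytes = 1 * 1024**3
    link_bytes_per_s = 25 * 1024**3

    copy_seconds = buffer_bytes / link_bytes_per_s
    print(f"one-way CPU->GPU copy: ~{copy_seconds * 1000:.0f} ms")  # ~40 ms each way

    # With unified memory, the CPU and GPU address the same physical pages, so this
    # transfer time (and the energy spent driving the link) is simply avoided.
    ```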

    Intel is building their own GPUs as well.  There is no technical reason Intel couldn't wholly adopt Apple's SoC approach and try to compete on that level.  

    9secondkox2 said:
    3DFX was on top, with Nvidia nipping at their heels. 3DFX was usually less efficient; Nvidia was usually the more efficient of the two, but only slightly behind in performance. 3DFX got comfortable, and one day Nvidia eclipsed them. 

    What did 3DFX do? They pulled an Alder Lake: they mashed a bunch of cores onto a graphics board and required multiple power supplies. This allowed them to recapture the performance lead once again, but at the cost of a messy, ridiculously power-hungry setup. 
    The 3DFX history was interesting for different reasons.  At the time, you mostly only had 2D video card acceleration.  3DFX came along with an additional card that was 3D only.  Your 2D card would output to the 3DFX card, and from there it would connect to the monitor.  3DFX also had the early jump, with game developers targeting their APIs specifically.  

    Over time, once all video cards had 3D acceleration and games were written for common APIs such as OpenGL or eventually Direct 3D, they lost their advantage.
  • Reply 30 of 37
    y2an Posts: 229 member
    blastdoor said:
    … There's no way Intel wins on equivalent process because x86 requires much higher clock speeds (and therefore voltage) to beat Apple's core design on single thread performance.
    Lower voltages allow higher clock speeds. 
  • Reply 31 of 37
    techconc said:
    Beyond that, let’s be real: the most impressive thing about Alder Lake systems is not even Intel’s CPU; it’s Nvidia's GPU. And Apple is fighting a war on two fronts, CPU and GPU. It’s winning the CPU war for the most part. It won’t be too long before the GPU war swings Apple’s way too.  
    The two are not directly related.  Nvidia just happens to make the best video cards suitable for Intel-based PCs.  There also isn't much of a battle with the CPU.  CPUs are inherently scalable in performance: if either Apple or Intel wants to scale up performance, they can do so by adding more cores.  No new technology is really needed for that.  The big difference is that Apple is using shared / unified memory, so it doesn't have to waste energy or performance by sending data from CPU memory to GPU memory and back again. 

    Intel is building their own GPUs as well.  There is no technical reason Intel couldn't wholly adopt Apple's SoC approach and try to compete on that level.  

    9secondkox2 said:
    3DFX was on top, with Nvidia nipping at their heels. 3DFX was usually less efficient; Nvidia was usually the more efficient of the two, but only slightly behind in performance. 3DFX got comfortable, and one day Nvidia eclipsed them. 

    What did 3DFX do? They pulled an Alder Lake: they mashed a bunch of cores onto a graphics board and required multiple power supplies. This allowed them to recapture the performance lead once again, but at the cost of a messy, ridiculously power-hungry setup. 
    The 3DFX history was interesting for different reasons.  At the time, you mostly only had 2D video card acceleration.  3DFX came along with an additional card that was 3D only.  Your 2D card would output to the 3DFX card, and from there it would connect to the monitor.  3DFX also had the early jump, with game developers targeting their APIs specifically.  

    Over time, once all video cards had 3D acceleration and games were written for common APIs such as OpenGL or eventually Direct 3D, they lost their advantage.
    The CPU and GPU NOT being related was my point. 

    On the 3DFX point, I think I summed it up fairly and accurately. 
  • Reply 32 of 37
    lkrupp Posts: 10,557 member
    Bottom line? Apple will not be going back to Intel no matter what. All the love for Intel being expressed by some will not change a thing. 
  • Reply 33 of 37
    y2an said:
    blastdoor said:
    … There's no way Intel wins on equivalent process because x86 requires much higher clock speeds (and therefore voltage) to beat Apple's core design on single thread performance.
    Lower voltages allow higher clock speeds. 
    Show us you know nothing about how electronics work without telling us you know nothing about how electronics work.

    Wait, you already did!

    Lower power usage enables pushing clock speeds higher without exceeding what can be done without melting: voltage is only one component of that, and it isn’t the most important one; total current is.  Current is a result of how many things are being powered during execution.  You could make the voltage much higher and it might still be possible to design things so that the chip uses less power, which is voltage times current, and in an electronic device that isn’t DC there are many interesting factors that affect power usage and efficiency, including the required voltage and the operating frequency.  RC time constants factor heavily into processing speed, as do any inductive effects.  The faster you want values to become stable against those RC time constants, the higher the voltage needs to be to drive them to a stable-enough state, because the C part (capacitance) takes time to charge to a given voltage; capacitance resists changes in voltage.  Meanwhile, the resistive properties don’t change, so power usage goes up much faster, with a lot more of it dissipated in the form of waste heat.

    As such, the key to running with the least power, and thus the least voltage, at a given clock speed (which varies from one processor to another and isn’t directly comparable) is to keep as little of the circuitry as possible powered on and changing state, because changing state always requires more power by its nature.
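
    For reference, the usual first-order expression for CMOS switching power that this argument leans on is P ≈ α·C·V²·f (activity factor, switched capacitance, supply voltage, clock frequency). Here is a small sketch with made-up numbers, just to show how power-gating idle circuitry (lowering α) and reducing voltage/frequency each move the total; none of these figures describe a real chip:

    ```python
    def dynamic_power(alpha, cap_farads, volts, freq_hz):
        """First-order CMOS switching power: P ~ alpha * C * V^2 * f."""
        return alpha * cap_farads * volts**2 * freq_hz

    # Made-up, illustration-only operating points.
    base  = dynamic_power(alpha=0.5,  cap_farads=1e-9, volts=1.1, freq_hz=3.2e9)
    gated = dynamic_power(alpha=0.25, cap_farads=1e-9, volts=1.1, freq_hz=3.2e9)  # half the circuitry switching
    slow  = dynamic_power(alpha=0.5,  cap_farads=1e-9, volts=0.9, freq_hz=2.4e9)  # lower V and f together

    print(f"base {base:.2f} W, gated {gated:.2f} W, lower V/f {slow:.2f} W")
    ```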
  • Reply 34 of 37
    One of the most important things to remember is that Apple now controls their own processors in the Macintosh, which is something they could never do before. They always relied on Motorola, IBM, and Intel. Now they have the freedom to design their own chips without having to make severe modifications for heat and power requirements. 

    Intel has its own problem with the consumer perception that buyers would be better off using an AMD CPU than an Intel one. Too many years of Celeron and budget Intel CPUs have turned off the bargain shoppers. Sure, high-end users still use Intel, but if you look at history, high-end CPUs like the DEC Alpha, MIPS, and other RISC designs were all eventually replaced by cheaper CPUs from Intel.  Now Intel could be a victim of its own playbook. It all depends on what Microsoft and the Linux distros do in the future. 

    Once SoftBank sells off ARM, whoever buys it could put a severe hurt on Intel, that is, as long as Microsoft allows it. Microsoft was one of the driving forces that propelled Intel to the top. They killed off the PPC versions of Windows because Intel asked them to back in the '90s. 
    Your extraordinary claim regarding Microsoft killing off the PPC version of Windows at Intel’s request requires extraordinary proof: I call total BS on that claim.

    The ugly truth of the matter is that Windows has never flourished commercially on any platform that didn’t have the sales numbers (computers sold, and thus operating system seats) to justify it.  Only the more recent ARM versions of the client have even enough users to justify the effort, and that’s with Microsoft selling the ARM-based devices to ensure there are enough of them.  Also, don’t forget the version of the Xbox that was its own PPC creature.

    The PPC machines with the largest numbers of seats, bar none, have been Macintoshes and their clones, and compared to PC sales of the time, they were a rounding error in units.  Apple didn’t want Windows running on them, either.  The rest of the PPC machines were expensive workstations/servers, more commonly bundled with expensive UNIX variants.  Somewhere in that range in the late '90s and early 2000s were the BeBox variants.  Again, only a rounding error's worth of PPC machines existed to run any version of Windows on, and if you don’t have native code to execute on them (again, a chicken-and-egg numbers issue) and emulation is either extremely slow or nonexistent, why would you buy them?

    Windows versions didn’t stop being produced due to your unfounded conspiracy theory, but rather fell victim to market forces.
  • Reply 35 of 37
    Don't count Intel out yet. Yes, they have been a complete mess for most of the past decade, but they are making interesting noises now. They have new management and they have the funding to make it happen. Unlike Apple/NVIDIA/Microsoft/etc., Intel has its own chip fabrication plants. That means that if Intel is able to produce even partially competitive CPUs/GPUs, it will be like pouring water onto parched land. Apple should be investing a hundred billion dollars in custom fabrication unless it wants to vanish should a regional conflict disrupt critical chip production it does not control.
    And their own fabs, to say nothing of the new ones they want to milk the taxpayers to build, have been, along with a sorry excuse for engineering management, the principal obstacles to getting commercial yields on any new (smaller) process until this past year, after three years of failures in that effort. So it’s no surprise their target for two years from now is where Apple + TSMC were a year ago (given the difficulties with bringing new models into production in 2020-2021, drought and other issues in Taiwan, and so on). Not exactly Wayne Gretzky hockey.
  • Reply 36 of 37
    tht Posts: 5,685 member
    One of the most important things to remember is that Apple now controls their own processors in the Macintosh, which is something they could never do before. They always relied on Motorola, IBM, and Intel. Now they have the freedom to design their own chips without having to make severe modifications for heat and power requirements. 

    Intel has its own problem with the consumer perception that buyers would be better off using an AMD CPU than an Intel one. Too many years of Celeron and budget Intel CPUs have turned off the bargain shoppers. Sure, high-end users still use Intel, but if you look at history, high-end CPUs like the DEC Alpha, MIPS, and other RISC designs were all eventually replaced by cheaper CPUs from Intel.  Now Intel could be a victim of its own playbook. It all depends on what Microsoft and the Linux distros do in the future. 

    Once SoftBank sells off ARM, whoever buys it could put a severe hurt on Intel, that is, as long as Microsoft allows it. Microsoft was one of the driving forces that propelled Intel to the top. They killed off the PPC versions of Windows because Intel asked them to back in the '90s. 
    Your extraordinary claim regarding Microsoft killing off the PPC version of Windows at Intel’s request requires extraordinary proof: I call total BS on that claim.

    The ugly truth of the matter is that Windows has never flourished commercially on any platform that didn’t have the sales numbers (computers sold, and thus operating system seats) to justify it.  Only the more recent ARM versions of the client have even enough users to justify the effort, and that’s with Microsoft selling the ARM-based devices to ensure there are enough of them.  Also, don’t forget the version of the Xbox that was its own PPC creature.

    The PPC machines with the largest numbers of seats, bar none, have been Macintoshes and their clones, and compared to PC sales of the time, they were a rounding error in units.  Apple didn’t want Windows running on them, either.  The rest of the PPC machines were expensive workstations/servers, more commonly bundled with expensive UNIX variants.  Somewhere in that range in the late '90s and early 2000s were the BeBox variants.  Again, only a rounding error's worth of PPC machines existed to run any version of Windows on, and if you don’t have native code to execute on them (again, a chicken-and-egg numbers issue) and emulation is either extremely slow or nonexistent, why would you buy them?

    Windows versions didn’t stop being produced due to your unfounded conspiracy theory, but rather fell victim to market forces.
    Hmm, I would hazard a guess that the PPC Xbox 360 and PS3, individually, not combined, may have outsold PPC Macs. The PPC Mac era had some really lean years where they only sold 5 to 7 million units per year. Perhaps those are not the "seats" you mean, though.

    Anyway, SoftBank messed up and bet on the wrong horse. They bought into the Internet of Things and didn't put enough resources into server, phone, laptop, and desktop architectures. Now that we know, and they know, that the Internet of Things is basically shit, they are concentrating more on servers. Losing a design cycle (2 years), or two cycles, has got to really hurt though. They effectively lost the cycle that had the highest gear ratio, when TSMC was plowing through 14nm, 10nm, and 7nm like clockwork while Intel was stuck on 14nm. If they had ridden TSMC's fabrication wave, ARM servers and PCs would be much more common now, imo. I'd even argue Apple started its ARM transition 2 years too late as well.

    Do agree with you on there not being any conspiracy. Nothing touched Intel, or x86, for basically 30 years. It's not the ISA, it's the fabrication process. Intel had the best fabs in the world for basically 30 years. As long as they had that, they had a huge advantage in terms of perf/$ and production capacity over everyone else. Some chip might outperform them for a period of time, but the fab advantage made it easy for them to catch up and surpass any challenger, and do it cheaper. No competitor could get enough sales to gain a foothold in the market.

    The key part of this article is that Intel considered, or is considering, using TSMC 3nm in 2024 in order to compete with Apple on perf/W. That's Intel not using their own fabs, but TSMC's. When you hear this, there should be some skepticism. Intel has to be desperate enough to front a lot of money to jump the line for TSMC 3nm capacity. They aren't jumping in front of Apple, as Apple likely has most of TSMC's 3nm capacity in 2023 and well into 2024, but if Intel gets TSMC 3nm capacity in 2024, that means they've paid to be in front of AMD (for both CPUs and GPUs), Qualcomm, Nvidia, and maybe even Apple, plus all the 2nd-tier customers. Fronting this money may not be worth it for Intel, and I'm not sure TSMC would even let them jump the line.
  • Reply 37 of 37
    tht said:
    One of the most important things to remember is that Apple now controls their own processors in the Macintosh, which is something they could never do before. They always relied on Motorola, IBM, and Intel. Now they have the freedom to design their own chips without having to make severe modifications for heat and power requirements. 

    Intel has its own problem with the consumer perception that buyers would be better off using an AMD CPU than an Intel one. Too many years of Celeron and budget Intel CPUs have turned off the bargain shoppers. Sure, high-end users still use Intel, but if you look at history, high-end CPUs like the DEC Alpha, MIPS, and other RISC designs were all eventually replaced by cheaper CPUs from Intel.  Now Intel could be a victim of its own playbook. It all depends on what Microsoft and the Linux distros do in the future. 

    Once SoftBank sells off ARM, whoever buys it could put a severe hurt on Intel, that is, as long as Microsoft allows it. Microsoft was one of the driving forces that propelled Intel to the top. They killed off the PPC versions of Windows because Intel asked them to back in the '90s. 
    Your extraordinary claim regarding Microsoft killing off the PPC version of Windows at Intel’s request requires extraordinary proof: I call total BS on that claim.

    The ugly truth of the matter is that Windows has never flourished commercially on any platform that didn’t have the sales numbers (computers sold, and thus operating system seats) to justify it.  Only the more recent ARM versions of the client have even enough users to justify the effort, and that’s with Microsoft selling the ARM-based devices to ensure there are enough of them.  Also, don’t forget the version of the Xbox that was its own PPC creature.

    The PPC machines with the largest numbers of seats, bar none, have been Macintoshes and their clones, and compared to PC sales of the time, they were a rounding error in units.  Apple didn’t want Windows running on them, either.  The rest of the PPC machines were expensive workstations/servers, more commonly bundled with expensive UNIX variants.  Somewhere in that range in the late '90s and early 2000s were the BeBox variants.  Again, only a rounding error's worth of PPC machines existed to run any version of Windows on, and if you don’t have native code to execute on them (again, a chicken-and-egg numbers issue) and emulation is either extremely slow or nonexistent, why would you buy them?

    Windows versions didn’t stop being produced due to your unfounded conspiracy theory, but rather fell victim to market forces.
    Hmm, I would hazard a guess that the PPC Xbox 360 and PS3, individually, not combined, may have outsold PPC Macs. The PPC Mac era had some really lean years where they only sold 5 to 7 million units per year. Perhaps those are not the "seats" you mean, though.

    Anyway, SoftBank messed up and bet on the wrong horse. They bought into the Internet of Things and didn't put enough resources into server, phone, laptop, and desktop architectures. Now that we know, and they know, that the Internet of Things is basically shit, they are concentrating more on servers. Losing a design cycle (2 years), or two cycles, has got to really hurt though. They effectively lost the cycle that had the highest gear ratio, when TSMC was plowing through 14nm, 10nm, and 7nm like clockwork while Intel was stuck on 14nm. If they had ridden TSMC's fabrication wave, ARM servers and PCs would be much more common now, imo. I'd even argue Apple started its ARM transition 2 years too late as well.

    Do agree with you on there not being any conspiracy. Nothing touched Intel, or x86, for basically 30 years. It's not the ISA, it's the fabrication process. Intel had the best fabs in the world for basically 30 years. As long as they had that, they had a huge advantage in terms of perf/$ and production capacity over everyone else. Some chip might outperform them for a period of time, but the fab advantage made it easy for them to catch up and surpass any challenger, and do it cheaper. No competitor could get enough sales to gain a foothold in the market.

    The key part of this article is that Intel considered, or is considering, using TSMC 3nm in 2024 in order to compete with Apple on perf/W. That's Intel not using their own fabs, but TSMC's. When you hear this, there should be some skepticism. Intel has to be desperate enough to front a lot of money to jump the line for TSMC 3nm capacity. They aren't jumping in front of Apple, as Apple likely has most of TSMC's 3nm capacity in 2023 and well into 2024, but if Intel gets TSMC 3nm capacity in 2024, that means they've paid to be in front of AMD (for both CPUs and GPUs), Qualcomm, Nvidia, and maybe even Apple, plus all the 2nd-tier customers. Fronting this money may not be worth it for Intel, and I'm not sure TSMC would even let them jump the line.
    Until the Intel-based Macintosh was revealed, I seem to remember Apple never really made the Macintosh specification open enough for it to be a suitable target for a commercial operating system, and even if my memory is incorrect, yes, Apple had a number of lean years, especially before OS X: before then, their OS wasn’t even in the same room as other operating systems that had protected memory and pre-emptive multitasking, regardless of how user-friendly it was.  Combine that with prices that quickly weren’t competitive with Intel boxes of the era, and not as much software, and things were dire indeed.  The best and worst thing they did was allow clones: it showed their potential market was notably bigger than they thought, but Apple wasn’t filling it at prices people were willing to pay, and the clone makers thrived for the short period that era existed.  Before that crazy era with a massive proliferation of Macintosh models, I could readily tell someone all the models that had existed and how they fit into the scheme of things, but Apple diluted the lineup and tried to create a machine for every niche, often overlapping in unclear ways: they provided too much choice without enough value for the effort, and I suspect they had horrible economies of scale as a result.

    Now for product naming, while there are a fair number of different SKUs, customers differentiate them more readily by how they vary, and the naming scheme isn’t overly complex.  Apple has largely become the In-N-Out of computers for the masses: BTO from a deliberately clean selection of options that are all easily built from component parts shared amongst all their devices.  This is how you achieve profitable economies of scale that get more profitable with less research and development required for the results, and far simpler inventory and assembly costs.

    And this is something Intel is now suffering from: they’ve got so many variations of their processors that it’s hard to keep track of them all, while Apple has simplified to basic cores and added coprocessors that are shared across all their devices. With iPhones alone they get economies of scale on par with the whole PC-compatible industry, so the slight specialization to level up to laptops, desktops, and servers is, relatively speaking, a marginal added cost. All that, with no need to make variants to hit parts of the market Apple doesn’t care about, makes Apple a hard target to beat at their own game.

    Another poster mentioned that Apple achieved the performance of the M1 on their first try. I was watching the progress of the A series and noting its performance before Apple announced their transition, observing that they were beating Intel processors per clock. I submit that poster is horribly wrong: the M1 is far from a first generation because of all the A-series processors Apple evolved over all those years, where the biggest distinction with the M1 is the built-in GPU cores and a little more that is unique to the Mac line.

    Reportedly, the iPad was developed before the iPhone and was waiting in the wings.  The problems Apple would have had if they’d released it before the iPhone are:
    1. They would have had no economies of scale for the parts they were using.
    2. Like the total market failure of Windows NT for PPC, there was no existing software to give people a remotely viable reason to buy it, due to the chicken-and-egg problem.
    3. Because of those first two problems, it would have needed to be far more expensive at the start just to possibly break even, but that would have doomed it out of the gate and reinforced the chicken-and-egg problem, giving it the same problem Microsoft has had with their tablets throughout the decades (yes, I used a Windows tablet in 1993, developing a multimedia viewer title for a Sumitomo molding machine repair manual for a clean-room environment).

    Putting the iPhone out first was the best strategy: the OS and user interface had no external software at the start, but it was a useful device people could justify carrying even if no other applications were ever developed, as a phone, a camera, a music player, and a GPS navigator, and it wasn’t an extra imposition to carry. About two years after the SDK was released, the iPad came out: the first touchscreen tablet with a huge software collection, mostly optimized for it, available at launch. The iPhone was the trailblazer for the iPad’s success. In the same way, to tie it all together, both the iPad and the iPhone were the trailblazers for the Apple Silicon Mac's success, and even though I don’t know how many users take advantage of it, you can run native iPhone/iPad software on it without an emulator. The iPhone/iPad line was a testing ground for their most expensive devices.

    The interesting thing is I suspect Apple will never build their own chip foundry: until we hit the limit of process-node scaling combined with economics, I expect they’ll always let others shoulder that burden. Apple went through turbulent times in the past when they actually assembled their own computers and had massive swings in demand, and the capital expense of an up-to-date chip foundry makes that look cheap by comparison if you can’t keep it busy. This in general is how the electronics industry has gone: instead of everyone owning factories to build their own things, they’ve subcontracted out to a few very large factories that build stuff for all the others in the same buildings with the same workers, so even if one seller has low demand, the workers stay busy building for the others, with far fewer major swings and layoffs of assembly workers overall. As such, all the competitors are largely competing on the features they sell, because they can’t make much difference on assembly labor costs. The problem Intel has here is that they’re now the CPU version of those massive subcontractors absorbing all those costs: they’re commodities.