Apple's in-house chip design is the 'secret weapon' behind industry-beating performance

Posted in Future Apple Hardware

Apple executives believe that by designing its own Apple Silicon chips and AI hardware, the company now has a significant advantage over traditional chipmakers that have to cater to a wide range of markets and customers.

Apple is outpacing the rate of overall industry progress with its M-series chips. Image credit: Apple



Apple's Vice President of Mac Product Marketing Tom Boger and Vice President of Platform Architecture Tim Millet talked about the new M4 line of chips used in recent Apple product updates in an interview with The Indian Express. The company believes doing its own chip design gives it "a tremendous strategic advantage," said Millet.

"We are not a merchant silicon company," he added in encapsulating Apple's advantage. "We do not build chips and sell them to other [companies]."

By creating chips custom-built for the devices they will go into, the company avoids compromises in overall performance. Boger added that "no other platform can touch our power performance per watt. That's the tangible benefit to users."

Industry-leading chip innovation



Boger noted the dramatic increases in performance year-over-year as successive generations of Apple Silicon are released, outpacing the progress rate of the rest of the industry. The new M4, Apple says, brings customers "the world's fastest CPU core, delivering the industry's best single-threaded performance."

The two executives say there is more to the success of Apple Silicon than just delivering speed with minimal energy usage. "We take advantage of the three major components -- the architecture, the design, and the process technology," said Millet.

"Our fourth tool, really our secret weapon, I think, is our ability to co-design these amazing chips with the system teams and the product designers as they are imagining possibilities." Millet pointed to the new M4 Mac mini as an example of this.

"The opportunity was for us and for the design team to be able to come together and build this incredible new platform," he told the newspaper. "There is no way that machine could have come to life without that collaboration. And that is really what Apple is all about."

Millet noted that competing chip manufacturers "can't just go to the latest cutting edge technology like the second generation, three nanometer, but we (Apple) benefit from it in a way that we believe it is worth it. It delivers for us and our products and our customers; we are trying to leave nothing on the table."

Boger added that it was rare to see the "pace of innovation year after year after year," noting that the first Apple Silicon chip debuted just four years ago. "That is the promise. That is a commitment we make to our teams to deliver innovations as they are available to us," he said.

The rise of the Neural Engine



Commenting on the rise of artificial intelligence in PCs and Apple's response with Apple Intelligence, Boger noted that there have been "intelligent" features in Macs for years. He noted that Apple first included a Neural Engine in its iPhone chip designs in 2017.

The Neural Engine and M-series chips attract more intensive workload customers. Image credit: Apple



Millet added that "this was inspired by our recognition of the importance of computational photography. We were seeing the amazing research that folks up in the University of Toronto were demonstrating [that] these new neural networks were capable of doing image recognition beyond the capacity of humans, or at least matching, and they were headed on a trajectory that was clear."

"And so we pounced on the opportunity to build that embedded capability into our camera processors for the phone," Millet said. Boger added that the Neural Engine was a core part of the first M1 chip.

"We have a great architecture for AI, and we also have developers taking advantage of Apple silicon to offer our customers intelligent features," he said. "So the M Series chips were always built for AI."

Boger said that his team saw "an interesting paper" in 2017 that discussed transformer networks, now the engine behind the Large Language Models (LLMs) used in AI. Boger's team saw that the technology could have a major impact on the Neural Engine, and introduced support for them in the first M-series chips.

"It shows you the diligence; we spend all our time trying to figure out where the ball is moving," said Millet. "We try to make sure we are there before it gets there."

Innovation driven by users



Boger said in the interview that Apple Silicon continues to push boundaries in performance and energy efficiency "because that's what our customers do." He used the M4 MacBook Pro line as an example.

"For instance, you run the most demanding workload while you have it plugged in, and then you unplug it [and] it is going to give you the exact same performance."

In noting the addition of the M4 Pro and M4 Max chips, Millet said that the memory bandwidth is a key differentiator from the regular M4. "M4 Max has effectively about twice the memory bandwidth of M4 Pro, [which] will help someone who was really pushing the edge for a very, very large model."

Millet said that Apple works closely with software partners "to look for all the best opportunities to accelerate not just generic benchmarks, which we often get judged by, but more importantly by the workloads that we are actually delivering to our customers."

"We know what the hardware system and thermal design will look like, and we understand what the process technology nodes are, and we aggressively pursue our best silicon options," Millet said. "I have been doing this for more than 30 years [and] it is the best situation to be in."



Read on AppleInsider


Comments

  • Reply 1 of 19
    mpantone Posts: 2,274 member
    It's not a "secret weapon." It was crystal clear to the entire semiconductor industry back in 2013 when Apple debuted the A7 SoC, the first 64-bit chip widely distributed in a smartphone.

    It was a mic-drop moment for Apple. The rest of the industry was speechless because 64-bit wasn't expected for a couple more years. Insightful observers theorized that Apple could be on the path to a desktop processor.

    So seven years later when Apple announced Apple Silicon, it wasn't that much of a surprise to the rest of the industry or people who paid attention to such things.

    It has been very clear that control over the hardware, software, and service stack gives Apple an advantage that none of their competitors in various markets has. This is not "secret" or hidden. It has been in plain daylight for well over a decade.

    It's amazing that even in 2024 there are journalists and technologists who still do not get this.

    And remember that consumer technology innovation is driven by smartphones, the primary computing modality of consumers in the present day. It doesn't come from PCs. The A-series SoCs begat the M-series SoCs, not the other way around. And all of the major technology innovations we take for granted today -- wireless communications (including WiFi, Bluetooth, 5G+), NFC contactless payments, biometric identification, location services, digital cameras, computational photography/videography, display panel technology, battery technology, performance+efficiency cores, cloud services, etc. -- all have been driven by smartphones.
    edited November 2024
  • Reply 2 of 19
    In just 4 short years, Apple went from a crazy upstart coming in and sending a scare into the big CPU guys to absolutely and unapologetically DOMINATING them. The fact that Apple now has the fastest CPU cores in the world is insane. They are steadily approaching Nvidia's best GPUs as well. Before too long, it appears Apple will surpass Nvidia's next-gen efforts. Seriously amazing stuff.
  • Reply 3 of 19
    AppleZulu Posts: 2,248 member
    This should be unsurprising, because it’s at the heart of Apple’s business model, and is the same concept as Apple’s exclusive relationship between its hardware and operating systems. Boiled down, it’s about limiting unnecessary variables. 

    Honestly, it’s surprising how few people seem to get that, and even more surprising that no competing company has tried to replicate that approach. 


  • Reply 4 of 19
    mpantone Posts: 2,274 member
    AppleZulu said:
    This should be unsurprising, because it’s at the heart of Apple’s business model, and is the same concept as Apple’s exclusive relationship between its hardware and operating systems. Boiled down, it’s about limiting unnecessary variables. 

    Honestly, it’s surprising how few people seem to get that, and even more surprising that no competing company has tried to replicate that approach. 
    Samsung tried and failed, notably on the software side with their Tizen operating system. Of course, they do continue to design smartphones and Exynos SoCs. And yes, they do have some of their own services like the Galaxy Store (smartphone apps).

    Google's business model is to track everything you do and sell that data to advertisers. They would benefit the most by having control over smartphone hardware, which is evidenced by their continued work on Pixel smartphones. In fact, recent Pixel models are now using a Google Tensor G4 SoC, their fourth-generation silicon.

    My guess is that there are a couple of companies who have prototype smartphone SoCs in their labs. I know little about the Chinese smartphone market but most likely there are several companies (like Huawei) working on it.

    It's possible that Amazon is trying to develop their own silicon for handhelds but it may never see the light of day in a shipping device.

    Microsoft completely blew it. They were a dominant smartphone platform before the iPhone emerged on the scene. Microsoft fumbled it all away and now they are destined to stand on the sidelines and wistfully watch others lead the way. Consumer technology innovation hasn't been driven by PCs for well over a decade. Remember that Apple was not designing their own smartphone silicon when the iPhone debuted in 2007.

    Facebook partially blew it. There was a Facebook smartphone but Facebook abandoned further development over a decade ago.

    https://www.cnet.com/tech/mobile/heres-why-the-facebook-phone-flopped/

    And now there is evidence Meta is trying to develop custom SoCs, mostly for their cloud servers. But rather than be a major player in the smartphone industry, Meta is basically a few apps/services that are popular in some markets and relatively unpopular in others. There's plenty of evidence that shows that Meta's focus is misdirected. They sunk a lot of money into a VR black hole and their inane "metaverse" B.S. instead of trying to make a dent in smartphones.
    edited November 2024
  • Reply 5 of 19
    radarthekat Posts: 3,904 moderator
    mpantone said:
    It's not a "secret weapon." It was crystal clear to the entire semiconductor industry back in 2013 when Apple debuted the A7 SoC, the first 64-bit chip widely distributed in a smartphone.

    It was a mic-drop moment for Apple. The rest of the industry was speechless because 64-bit wasn't expected for a couple more years. Insightful observers theorized that Apple could be on the path to a desktop processor.

    So seven years later when Apple announced Apple Silicon, it wasn't that much of a surprise to the rest of the industry or people who paid attention to such things.

    It has been very clear that control over the hardware, software, and service stack gives Apple an advantage that none of their competitors in various markets has. This is not "secret" or hidden. It has been in plain daylight for well over a decade.

    It's amazing that even in 2024 there are journalists and technologists who still do not get this.

    And remember that consumer technology innovation is driven by smartphones, the primary computing modality of consumers in the present day. It doesn't come from PCs. The A-series SoCs begat the M-series SoCs, not the other way around. And all of the major technology innovations we take for granted today -- wireless communications (including WiFi, Bluetooth, 5G+), NFC contactless payments, biometric identification, location services, digital cameras, computational photography/videography, display panel technology, battery technology, performance+efficiency cores, cloud services, etc. -- all have been driven by smartphones.
    I think most folks these days recognize terms like ‘secret sauce’ and ‘secret weapon’ as figures of speech rather than literal descriptors.   I give the article author a pass.  
  • Reply 6 of 19
    y2an Posts: 231 member
    Sorry but this goes back to the PA Semi acquisition in 2008. A clear move to become wholly market leading in chip design. 
  • Reply 7 of 19
    eriamjh Posts: 1,782 member
    I have to say that iPhone profits made this custom CPU possible. We all saw it coming with the announcement of the A7, the first mobile 64-bit CPU (I think).

    We all know it took them until the A12Z to make the Mac mini with Apple Silicon. I really wonder how well macOS ran on the A7, which I'm guessing was among the first prototypes on which Apple tried it.
  • Reply 8 of 19
    y2an said:
    Sorry but this goes back to the PA Semi acquisition in 2008. A clear move to become wholly market leading in chip design. 
    It’s almost as if Apple execs always regarded the switch to Intel as a stopgap measure, i.e. being dependent upon Intel wasn’t any better than being dependent on IBM and Motorola, in the grand scheme of things, but it was just technologically expedient for the time being. Steve and his minions, and their successors alike, have always been playing a really long game of 4D chess.
  • Reply 9 of 19
    dk49 Posts: 287 member
    Wasn't that "secret" stolen by ex Apple engineers who started Nuvia (now Qualcomm)? XElite apparently has better performance per watt.
  • Reply 10 of 19
    danox Posts: 3,479 member
    y2an said:
    Sorry but this goes back to the PA Semi acquisition in 2008. A clear move to become wholly market leading in chip design. 
    The best $1.1 billion Apple spent was acquiring NeXT (Steve Jobs), PA Semi, Intrinsity, and Anobit, and that measured frugality with tech acquisitions has paid off over the years for Apple.

    The largest acquisition ever for Apple is still only three billion dollars. Over the years there have been many clueless analysts who keep insisting that Apple is gonna buy some giant multi-billion-dollar company, which is what the competition does. Google, Microsoft, and Meta have, over the years, tossed away mind-blowing billions of dollars on ill-fated acquisitions which usually end up with thousands of people being laid off.
  • Reply 11 of 19
    dk49 said:
    Wasn't that "secret" stolen by ex Apple engineers who started Nuvia (now Qualcomm)? XElite apparently has better performance per watt.
    That's what I'm also thinking. How did the X Elite chip catch up so quickly, and how does it have a far more powerful iGPU than Apple's chips? I'm sure that extra power must come at some cost in terms of battery life or thermals. I'm still hoping ARM Holdings can delay the release of those Qualcomm chips due to a licensing dispute, but I bet Qualcomm will simply end up paying the licensing fee. Qualcomm always wins.
  • Reply 12 of 19
    dewme Posts: 5,824 member
    The one thing that Apple has been far ahead of their competitors on is staying true to their convictions, especially around providing an integrated solution and controlling their own destiny. Many of Apple's competitors seem to be always seeking a silver bullet and flopping around chasing after whatever the next big thing is or appears to be.  Apple has a vision, puts plans in place to achieve that vision, and sticks to those plans unless they see clearly that they're not going to achieve their goals in the time frame they need to.

    Sometimes the stock analysts and market gurus mistake Apple's steady perseverance towards achieving goals as an indicator that they are lagging behind. It's always more exciting and enticing for speculators to see companies making bold moves and promises to deliver the moon. Too often they deliver a rock, a slice off the moon, or a beta moon that hangs around for far too long. Apple typically delivers concrete solutions that have immediate consumable value. But Apple isn't immune from making big promises at times and trickling out partial solutions, especially on the software side of things. They're still much better on the hardware side and seem to have been sticking to a plan and execution process that is playing out remarkably well.

    Apple Intelligence is, imo, looking more like a trickle-out delivery with a lot of uncertainty about what it really brings to people's everyday lives. I have no doubt that it will be a huge step forward, eventually. By limiting its rollout to only the newest iPhone hardware, it will take a lot of slick demos, hands-on user experience feedback, and personal stories to compel owners of iPhone 14 and older iPhones to upgrade sooner based on the availability of Apple Intelligence. When the iPhone 4S arrived with Siri, I remember a bunch of us tech nerds gathered around asking Siri silly questions. The amusement factor wore off pretty quickly, to the point where having Siri on all the newer iPhones, iPads, and Macs was not really a big deal at all. It was still okay, but I expect a fair number of people like myself use it rarely and only for the most trivial things, like skipping a music track or sending a quick text while driving. Apple Intelligence must do better.
  • Reply 13 of 19
    Marvin Posts: 15,515 moderator
    dk49 said:
    Wasn't that "secret" stolen by ex Apple engineers who started Nuvia (now Qualcomm)? XElite apparently has better performance per watt.
    How did the X Elite chip catch up so quickly and has a far more powerful iGPU than what Apple chips have?
    In the higher performance models, they run the X Elite chip at >100W, which needs a copper heatsink to keep it cool and it performs like M3 Pro at this level. Max chips run faster than X Elite at the same or lower power.

    The X Elite GPU is < 5TFLOPs, M4 Max is ~18TFLOPs:
    https://www.notebookcheck.net/Qualcomm-Adreno-X1-85-4-6-TFLOPS-GPU-Benchmarks-and-Specs.850228.0.html

    https://gfxbench.com/device.jsp?benchmark=gfx50&did=119508313
    https://gfxbench.com/device.jsp?benchmark=gfx50&did=123984693

    A lot of the performance-per-watt comes from TSMC hardware, so the same node of chips should get close to the same efficiency, but Apple has an advantage with end-to-end hardware and software. They control the OS, drivers, APIs, and hardware; no other manufacturer has this.
    mainyehc said:
    y2an said:
    Sorry but this goes back to the PA Semi acquisition in 2008. A clear move to become wholly market leading in chip design. 
    It’s almost as if Apple execs always regarded the switch to Intel as a stopgap measure, i.e. being dependent upon Intel wasn’t any better than being dependent on IBM and Motorola, in the grand scheme of things, but it was just technologically expedient for the time being. Steve and his minions, and their successors alike, have always been playing a really long game of 4D chess.
    They originally wanted Intel to make the chips for the iPhone. If Intel had delivered on this, they probably would have stuck with them, but there were too many issues with cost, development speed, IP, and performance-per-watt.

    https://news.softpedia.com/news/Ex-Intel-Boss-Regrets-Saying-No-to-Steve-Jobs-iPhone-354197.shtml

    It took 10 years for Apple's mobile chips to scale to replace Intel chips in Macs, maybe it was planned but I think Intel just failed to deliver and it eventually made no sense to keep using them at all.
  • Reply 14 of 19
    mpantone Posts: 2,274 member
    y2an said:
    Sorry but this goes back to the PA Semi acquisition in 2008. A clear move to become wholly market leading in chip design. 
    Yes, that's probably a good assumption. And many said it would take about five years for that acquisition to bear fruit. However, it's important to remember that many PA Semi employees left after the Apple acquisition; not all wanted to continue on at Apple. This was expected at the time.

    Remember how Steve described the iPhone: "the computer for the rest of us." Totally prescient.

    While Steve died in 2011, undoubtedly most of what Apple is today was laid out in long-term roadmaps by him (high-level things like Apple Silicon, not minutiae like what colors the Apple Watch was going to be offered in).
    edited November 2024
  • Reply 15 of 19
    danox Posts: 3,479 member
    dk49 said:
    Wasn't that "secret" stolen by ex Apple engineers who started Nuvia (now Qualcomm)? XElite apparently has better performance per watt.
    That's what I'm also thinking.  How did the X Elite chip catch up so quickly and has a far more powerful iGPU than what Apple chips have?  I'm sure that extra power must come at some cost in terms of battery life or thermals.  I'm still hoping ARM Holdings can delay the release of those Qualcomm chips due to a licensing dispute but I bet Qualcomm will simply end up paying the licensing fee.  Qualcomm always wins.

    The problem with Qualcomm's current-generation chips is that their performance isn't significantly better than the Intel or AMD chips; in fact, they are at the bottom. Qualcomm talked a lot of trash about the performance of their chips (the wattage used is not much better than Intel and AMD), but it turns out, between them and Microsoft, it was essentially a total Recall when it came to performance. And now they're back, talking up the second version of their chip, which uses about the same power? Slightly different chip but the same trash talking.

    https://www.youtube.com/watch?v=5rN6CEO31gM   The current Qualcomm processor is at the bottom in almost all of the testing. The only thing it can claim is lasting maybe 30 minutes longer than the M4 in one battery test, and it's behind every Windows laptop, and that ain't good.

    6:00 Performance per watt (M2 thru M4)

    6:37 (Power draw)

    9:00 (wildlife extreme Qualcomm is at the bottom)

    16:39 (battery life testing Wi-Fi) M2 thru M4

    In the comment section, there are many very upset Windows users whose minds are blown. The M2, M3, and M4 laptops made by Apple are literally lapping the competition.

    This YouTube channel might be the best at testing: no nonsense and to the point.





    edited November 2024
  • Reply 16 of 19
    Indeed Apple are way ahead. To me it was clear how this would play out back when Apple debuted the A4 chip in the iPhone 4, which they designed themselves in-house. That was a real "wow" moment, and you knew Apple was going to have the best silicon. First to 64-bit with the A7 as well. Then there were the T chips, which were also a breakthrough for security instead of the very poor "secure boot" rubbish that Windows uses. With the M chips they've really started to flex.

    And as Boger said, the rest of the industry was caught sleeping. They somehow failed to recognise what was happening, as well as failing to at least match it. Even the latest Snapdragon CPUs on mobile and PC are outpaced by Apple's A and M chips. Someone on Twitter ran a test involving several smartphones in rendering video and images, two separate tests, and the A18 Pro chip outperformed even the new SD 8 Elite chip substantially. It wasn't even close. Something like 40 - 50 seconds faster. Plus there are users saying the SD 8 Elite gets so hot in the hand when gaming it literally burns you after about 10 minutes.... The industry is fumbling about while Apple is racing ahead.
  • Reply 17 of 19
    I wouldn't care if Apple's performance was always lagging. There's simply no competition in the privacy department to Apple. I would take privacy any day over speed.
  • Reply 18 of 19
    entropys Posts: 4,331 member
    The only real downside is if software gets written for a vertically integrated system with minority market share. Not so much an issue for cloud services, and office software.

    But it can have a downside for certain applications. Gaming of course is obvious, but also real work that requires grunt, like CAD software. The big players just do not bother with Apple, even though the hardware would be a fantastic marriage with their software. 
  • Reply 19 of 19
    entropys said:
    The only real downside is if software gets written for a vertically integrated system with minority market share. Not so much an issue for cloud services, and office software.

    But it can have a downside for certain applications. Gaming of course is obvious, but also real work that requires grunt, like CAD software. The big players just do not bother with Apple, even though the hardware would be a fantastic marriage with their software. 
    There is some good news here though. The ARM versions of Windows and Linux can be run under virtualization on macOS (multiple solutions are available), and Windows does have a Rosetta-like solution. There are some other WINE-based solutions. Depending on the OS-based needs of the software you need to run, one of these is likely to work fine... and the M4's very impressive CPU performance means that the performance impact of such cross-translation still leaves you with quite a usable result. A lot of the software in question doesn't need anything close to the available compute power, which means that losing 20-50% doesn't actually matter. And the M4's GPU is pretty potent too, so some of the translation implementations deliver perfectly acceptable results. If modern games can run adequately, most productivity software is nowhere near as demanding. CAD software used to tax GPUs 10-20 years ago, but now the hardware is extremely powerful by comparison. If you have a program that taxes the latest GPUs when run natively, sure, the translation may be too big a hit... but in the vast majority of cases, it isn't likely to be a problem.
