Apple Silicon's success helped AMD make Ryzen AI Max chips

Posted in Current Mac Hardware

AMD's latest Ryzen AI Max chips probably wouldn't have existed without the popularity of Apple Silicon, an AMD executive has admitted.

AMD Ryzen AI Max - Image Credit: AMD



At CES, AMD introduced the Ryzen AI Max chips, an upgraded version of its Ryzen AI architecture with up to 16 CPU cores, up to 40 AMD RDNA 3.5 graphics compute units, and a neural processing unit capable of up to 50 trillion operations per second.

The chips, which pack a wide range of performance into a single focused component, have considerable echoes of the way Apple Silicon works. During the launch, AMD VP Joe Macri hinted that Apple Silicon helped get the product made and out the door, reports Engadget.

An Apple Silicon thing



"Many people in the PC industry said, well, if you want graphics, it's gotta be discrete graphics because otherwise people will think it's bad graphics," Macri offered.

He continued: "What Apple showed was consumers don't care what's inside the box. They actually care about what the box looks like. They care about the screen, the keyboard, the mouse. They care about what it does."

Apple's massive success with Apple Silicon allowed Macri to convince upper management to spend a "mind boggling" amount of resources to develop the Ryzen AI Max.

"I always knew, because we were building APUs, and I'd been pushing for this big APU forever, that I could build a system that was smaller, faster, and I could give much higher performance at the same power," Macri enthused.

Inspired, but not first



While Macri was complimentary about Apple Silicon's success and how it helped convince others that Ryzen AI Max chips were a real possibility, he stopped short of giving full credit to Apple.

He insists that AMD was working towards this scenario well ahead of Apple. "We were building APUs while Apple was using discrete GPUs," he crows, referring to chips that combined a CPU with Radeon graphics.

"They [Apple] were using our discrete GPUs. So I don't credit Apple with coming up with the idea," he continued.

Before implementing Apple Silicon, Apple did extensively use AMD Radeon GPUs as discrete graphics options in its MacBook Pro lines.

Apple may have had an interest in creating its own APU at the time AMD was working on the concept. In July 2012, former AMD chip architect John Bruno, who previously contributed to AMD's Trinity APU, was spotted on LinkedIn as having become a "System Architect at Apple."

At one point, Apple apparently considered using the original AMD Fusion APU in the Apple TV in 2010, before eventually using its A4 processor.



Read on AppleInsider

Comments

  • Reply 1 of 19
Xed Posts: 2,921 member
Apple was using integrated graphics with Intel chips long before they moved to Apple Silicon. I even remember people on this board saying that Apple should only use discrete graphics because integrated graphics weren't good blah blah blah. I'm glad that Apple could prove that integrated graphics can have real benefits.
  • Reply 2 of 19
blastdoor Posts: 3,642 member
    It’s definitely true that AMD has been focused on ‘APUs’ for a long time. It’s the reason they bought ATI. 

But it’s also true (imho) that buying ATI was a huge waste of $5 billion that they should have spent updating the Athlon to better compete with Core 2 Duo, and they should have built a new fab. That decision nearly bankrupted the company. 

The idea for the APU wasn’t wrong, it was just way too premature and diverted resources from more important priorities. Kind of like Apple introducing the Newton way too early.
  • Reply 3 of 19
tht Posts: 5,767 member
    There isn't anything to crow about here, either from Apple or AMD. Transistor budgets roughly double every 2 years or so. Having more and more dedicated function units added to the "CPU" has been a process that is ongoing for the past 40 years. NPUs are just the latest thing, with a lot of hype. In another time, they'd just be called specialized SIMD units.

    And, AMD has a lot of experience putting higher performance GPUs into an SoC. They provided the SoCs for both PS5 and XBox X-series, or in their language, APUs. There's a lot of technical heritage there.

Management likely needs to be convinced to make a high performance x86 APU for PC laptops, though. I'd bet AMD managers are thinking it is crazy to spend a lot of R&D on a high performance APU or SoC in the laptop market. AMD seems to be failing or having a downturn in this market. It's a hyper mature market where the vast majority of laptops are neutered $600 devices that don't require much of a GPU at all. They are competing for a pretty small slice with very little margins.

AMD is way, way behind in the AI accelerator market. These are like $5000 modules that have been in hyper demand for the past couple of years, and demand is slated to grow. These are really high margin products. AMD not trying to cut into that market seems like a troublesome path. Those margins will pay for everything else. 
  • Reply 4 of 19
    tht said:
    There isn't anything to crow about here, either from Apple or AMD. Transistor budgets roughly double every 2 years or so. Having more and more dedicated function units added to the "CPU" has been a process that is ongoing for the past 40 years. NPUs are just the latest thing, with a lot of hype. In another time, they'd just be called specialized SIMD units.

    And, AMD has a lot of experience putting higher performance GPUs into an SoC. They provided the SoCs for both PS5 and XBox X-series, or in their language, APUs. There's a lot of technical heritage there.

Management likely needs to be convinced to make a high performance x86 APU for PC laptops, though. I'd bet AMD managers are thinking it is crazy to spend a lot of R&D on a high performance APU or SoC in the laptop market. AMD seems to be failing or having a downturn in this market. It's a hyper mature market where the vast majority of laptops are neutered $600 devices that don't require much of a GPU at all. They are competing for a pretty small slice with very little margins.

AMD is way, way behind in the AI accelerator market. These are like $5000 modules that have been in hyper demand for the past couple of years, and demand is slated to grow. These are really high margin products. AMD not trying to cut into that market seems like a troublesome path. Those margins will pay for everything else. 
Although AMD is well behind Intel in the laptop market, they are now the top choice in the desktop chip market for new buyers. The only reason AMD is still far behind (although their market share is growing) is because Intel will put the weakest chips inside the cheapest devices, and people will buy them because they don’t do any research.
  • Reply 5 of 19
9secondkox2 Posts: 3,197 member
    LOL. 

“Apple Silicon is super rad and popular, so let’s say that similar thing we did was ahead of its time… because Apple used discrete GPUs before and we sold them to Apple. Yeah. Cool. Say that. People will believe it and like us more.”

    -recent AMD board meeting. 
  • Reply 6 of 19
blastdoor Posts: 3,642 member
    https://www.statista.com/statistics/1130315/worldwide-x86-intel-amd-laptop-market-share/

    AMD rose from about 8% to 20% from 2017 to 2020 and they've been floating around 20% since then. It's kind of amazing how well Intel has held onto the laptop market, but I'm pretty sure it has nothing to do with technical merit (although Lunar Lake is competitive in the thin and light segment). 
  • Reply 7 of 19
eriamjh Posts: 1,782 member
    Is AMD saying they copied Apple’s strategy?  

    It sure sounds like it.  
  • Reply 8 of 19
Xed Posts: 2,921 member
    eriamjh said:
    Is AMD saying they copied Apple’s strategy?  

    It sure sounds like it.  
    They copied Apple’s strategy. Also, they did it first.
  • Reply 9 of 19
blastdoor Posts: 3,642 member
    Xed said:
    eriamjh said:
    Is AMD saying they copied Apple’s strategy?  

    It sure sounds like it.  
    They copied Apple’s strategy. Also, they did it first.
    Ha, yeah, it kind of reads that way. 

    Maybe the more 'honest' description is that AMD came up with the APU idea first but botched the implementation and nearly went bankrupt because of it. Apple ended up implementing the idea far better than AMD had, and AMD is now copying the superior implementation. 

    The funny part is that Apple never had to waste $5 billion buying a video card maker in order to do this. 
  • Reply 10 of 19
    Xed said:
Apple was using integrated graphics with Intel chips long before they moved to Apple Silicon. I even remember people on this board saying that Apple should only use discrete graphics because integrated graphics weren't good blah blah blah. I'm glad that Apple could prove that integrated graphics can have real benefits.
I think the anger was that they dumped NVIDIA for Intel’s Iris Plus integrated graphics, which did make a difference, especially if you were using 3D software. 

There was also a lot of anger from Steve Jobs towards ATI and NVIDIA. He didn’t like how an ATI executive leaked that Apple would be using their chips right before a product intro, and decided to switch to NVIDIA right before the announcement from Apple. Then there is the bad blood between Apple and NVIDIA, where according to Amber Neeley’s article on AppleInsider, Steve Jobs had accused NVIDIA of stealing technology from Pixar.



  • Reply 11 of 19
AMD is talking out of their butts again. 

Apple has been doing the SoC thing way back since they combined Samsung's CPU with a PowerVR GPU in the iPhone. 

Then they made their own and dominated iPhones and tablets. Then put it in Macs - and dominated the PC space. 

Apple's been not only talking about it, DOING it for a long time. 

AMD needs to get honest about that. They copied Apple's idea. Only talking about the PC space is pretending that Apple hasn't been already doing this. They just had to wait until the tech was powerful enough to whoop on AMD and Intel before putting the phone and tablet SoCs in Macs. 
edited January 9
  • Reply 12 of 19
blastdoor Posts: 3,642 member
AMD is talking out of their butts again. 

Apple has been doing the SoC thing way back since they combined Samsung's CPU with a PowerVR GPU in the iPhone. 

Then they made their own and dominated iPhones and tablets. Then put it in Macs - and dominated the PC space. 

Apple's been not only talking about it, DOING it for a long time. 

AMD needs to get honest about that. They copied Apple's idea. Only talking about the PC space is pretending that Apple hasn't been already doing this. They just had to wait until the tech was powerful enough to whoop on AMD and Intel before putting the phone and tablet SoCs in Macs. 
    AMD bought ATI, motivated by a vision for APUs, in 2006. 

    Also, note that the idea of an APU isn’t just ‘integrated graphics,’ it’s also ‘GPU compute’ — ie, using the GPU as a big SIMD unit. 

    So I think AMD has some legit bragging rights in terms of recognizing the value of the APU concept. 

    It’s a hollow brag, though, because their implementation wasn’t great for a long time and the money spent on ATI would have been better spent elsewhere. 

    Lisa Su (and Jim Keller) saved the company — barely — with Ryzen. But AMD is a shadow of the company it could have been had they not wasted 5 billion on ATI. 
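As an editorial aside, the "GPU as a big SIMD unit" idea mentioned above can be sketched in plain NumPy. This is purely an analogy running on the CPU, not AMD or Apple code: one operation is applied across every element of an array at once, which is the same data-parallel model that GPU compute APIs such as OpenCL and Metal expose, where each element would map to one of thousands of parallel lanes.

```python
import numpy as np

# Data-parallel "saxpy" (y = a*x + y): a single operation applied
# across every element at once. On a GPU, each element would be
# processed by its own SIMD lane; NumPy expresses the same pattern
# as one vectorized expression instead of a per-element loop.
def saxpy(a, x, y):
    # NumPy broadcasts the scalar `a` across all elements of `x`
    return a * x + y

x = np.arange(4, dtype=np.float64)  # [0, 1, 2, 3]
y = np.ones(4, dtype=np.float64)    # [1, 1, 1, 1]
print(saxpy(2.0, x, y))             # [1. 3. 5. 7.]
```

The point of the pattern is that the programmer states *what* happens to every element, and the hardware (SIMD units on a CPU, shader cores on a GPU) decides how many elements to process simultaneously.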
  • Reply 13 of 19
mpantone Posts: 2,274 member
This is such a strange statement from AMD; Apple was using integrated graphics in their early Intel-powered notebooks.

    Today's Apple Silicon is a descendant of the A-series SoCs so it's not like Apple had the sudden epiphany of integrated graphics when they launched the M-series silicon.

    This AMD guy is trying to take credit for something that wasn't AMD's doing. It's worth pointing out that Apple also used discrete Nvidia GPUs years ago. His braggadocio is nonsense.

    So weird.
edited January 10
  • Reply 14 of 19
blastdoor Posts: 3,642 member
    mpantone said:
This is such a strange statement from AMD; Apple was using integrated graphics in their early Intel-powered notebooks.

    Today's Apple Silicon is a descendant of the A-series SoCs so it's not like Apple had the sudden epiphany of integrated graphics when they launched the M-series silicon.

    This AMD guy is trying to take credit for something that wasn't AMD's doing. It's worth pointing out that Apple also used discrete Nvidia GPUs years ago. His braggadocio is nonsense.

    So weird.
APU isn’t just integrated graphics. It’s GPU compute in the same memory space as the CPU. Apple first used OpenCL (ie, GPU compute) in 2009 on the Mac and released Metal in 2014. The first PowerVR GPU to support OpenCL was from 2011. 
  • Reply 15 of 19
dewme Posts: 5,824 member
    As an Apple customer, all I can say is that Apple Silicon was a masterfully engineered and superbly executed innovation that will forever be seen as yet another major inflection point in computer evolution, and it was driven by Apple. It tamed the Power Monster that looked like it would be an insurmountable brick wall to further performance improvements, unless of course we considered using liquid nitrogen cooling systems for personal computers. Apple Silicon may end up being "Peak Silicon" because the silicon fabrication technology looks like it will hit an atomic limit in the near future. 

    No amount of revisionism will dull the shine on what Apple has accomplished.
  • Reply 16 of 19
mpantone Posts: 2,274 member
    blastdoor said:
    mpantone said:
This is such a strange statement from AMD; Apple was using integrated graphics in their early Intel-powered notebooks.

    Today's Apple Silicon is a descendant of the A-series SoCs so it's not like Apple had the sudden epiphany of integrated graphics when they launched the M-series silicon.

    This AMD guy is trying to take credit for something that wasn't AMD's doing. It's worth pointing out that Apple also used discrete Nvidia GPUs years ago. His braggadocio is nonsense.

    So weird.
APU isn’t just integrated graphics. It’s GPU compute in the same memory space as the CPU. Apple first used OpenCL (ie, GPU compute) in 2009 on the Mac and released Metal in 2014. The first PowerVR GPU to support OpenCL was from 2011. 
    Integrated graphics on Intel CPUs use system memory, just like the CPU cores. Anyhow, this AMD guy is trying to take credit for something that wasn't pioneering. And how is AMD's APU business doing (other than videogame consoles)?

The fact of the matter is it's not an original concept by any means. Many teams were working on this simultaneously. Remember that SGI had a unified memory architecture (UMA) in their O2 workstation in the mid-1990s. It wasn't feasible to put this all in one package back then; it just took time, and then there were multiple versions from various vendors.

The silicon concept was not revolutionary. Apple was brilliant enough to put it in a smartphone, "the computer for the rest of us," and focus on the software. Joe Consumer doesn't really care whether or not all the transistors are in the same package, as the AMD guy mentions. Consumers care what it does.

Anyhow, in a couple of weeks Apple will announce another quarter of great results, and this AMD guy's comments will be a faint memory for all but a handful of industry pundits. We know he's playing buzzword bingo at CES, but c'mon now.
edited January 10
  • Reply 17 of 19
tht Posts: 5,767 member
Is my memory fooling me here? I thought Apple had a PowerPC system where the GPU was in the equivalent of the Northbridge chip? Was that GPU ATI or Nvidia? Apple designed? 

    I’m doubting the memory as it means Apple contracted out a memory controller + GPU + IO chip to a third party? Or ATI or Nvidia gave their design to Apple to put into this chip?

It was just an interesting architecture. Apple’s T2 was a kind of advanced “Southbridge” chip, a successor of sorts to this integrated chip, that was also eventually folded into the SoC on Apple silicon. Long history of consolidating functions into single chips, and gradually migrating into the SoC chip. 

    There was a time that you could buy L2 SRAM/cache memory as an after-market product!
  • Reply 18 of 19
corp1 Posts: 102 member
    Apple didn't invent ARM (though they invested in it), nor did they invent SoC/SiP designs, nor did they invent unified memory designs.
    Instead, they combined these things together in A-series processors and eventually the M1/MacBook Air, which was an absolute home run.
    Apple's claim to fame is innovation – figuring out how to apply technology effectively to create best-in-class devices.

edited January 11
  • Reply 19 of 19
I think this is great news. It's forcing AMD to be more competitive, it will help prevent Apple from resting on its laurels, Intel will need to keep pursuing features like lower power draw, and RISC-V developers will need to work harder. Regardless of whether you are Team Red or Team Blue, Team Fruit or Team Open Source, it benefits all of us consumers. 