AMD CEO says Apple's M1 chip is an opportunity to innovate, underscores ongoing graphics partnership

Posted in Current Mac Hardware · edited January 12
AMD CEO Dr. Lisa Su sat down with journalists in a Q&A session following a keynote address at CES 2021 on Tuesday, fielding a variety of questions including a request to comment on Apple's first foray into the desktop processor space.

Speaking to press via conference call, Su addressed a range of questions regarding AMD's upcoming plans, the x86 platform and new developments in a highly competitive semiconductor market.

Dr. Ian Cutress of AnandTech focused on the emergence of ARM processor designs. According to Cutress, ARM models are expected to greatly boost compute performance in the coming years and could begin to encroach on territory long held by x86 makers like AMD and Intel. ARM silicon has historically powered smartphones and specialized deployments like servers, but the chip designs are now starting to show up in mainstream consumer computers.

Apple, for example, introduced the M1 chip in its late-2020 13-inch MacBook Pro, MacBook Air and Mac mini models. The tech giant plans to transition its entire Mac lineup to custom ARM chips within two years. That represents an immediate loss of revenue for current CPU partner Intel, and it also creates headwinds for the wider x86 market.

Su was asked how the M1 will impact AMD's relationship with Apple.

"The M1 is more about how much processing and innovation there is in the market. This is an opportunity to innovate more, both in hardware and software, and it goes beyond the ISA," Su said. "From our standpoint, there is still innovation in the PC space - we have lots of choices and people can use the same processors in a lot of different environments. We expect to see more specialization as we go forward over the next couple of years, and it enables more differentiation. But Apple continues to work with us as their graphics partner, and we work with them."

Apple relies on AMD's Radeon graphics cards to power high-end devices like MacBook Pro, iMac and Mac Pro, but that could change with a shift to in-house solutions. M1 Macs integrate Tile Based Deferred Rendering (TBDR) graphics cores on a system-on-chip design similar to the A-series processors used in iPhone and iPad.

While Apple is content to stick with integrated graphics for the initial wave of M1 Macs, it is possible that the company is working toward a dedicated GPU to better serve high-performance machines.

Apple's transition to ARM appears to be applying pressure to industry incumbents. On Monday, Intel detailed its forthcoming Alder Lake chip series, which seemingly takes a page out of the Apple Silicon strategy book by stretching use cases from mobile to desktop.

Comments

  • Reply 1 of 23
    While Alder Lake is similar in concept to the M1, I don't think it's fair to suggest that Intel's Alder Lake design is a knee-jerk reaction to (or plagiarism of) the M1. Silicon design takes years, from conceptual design to manufacturing to baking in support in the operating system. 

    Alder Lake is an iteration on the hybrid Lakefield design, released before the M1 even made it to market, which had one big Sunny Cove core and smaller Atom cores. One could argue that Lakefield was just a test run to get working silicon out there, and to enable Intel to work with Microsoft (and possibly Linux) on updating the OS kernel scheduler to efficiently schedule processes and threads on a hybrid x86 CPU. Alder Lake was conceptualized long before it entered high-volume manufacturing in 2021. 

    Of course, with Apple, the M1 is indeed amazing, and the competitive advantage Apple has over Intel is that its vertical integration allows it to design and co-optimize the kernel of the OS in conjunction with the silicon design. Also, TSMC is crushing it right now in terms of manufacturing on a leading-edge node, whereas Intel is only now starting to ramp 10 nm. AMD doesn't seem to think hybrid x86 is all that relevant or necessary right now, so we will have to see how Alder Lake performs. It seems impressive; it will have IPC improvements above Tiger Lake, which in itself isn't too shabby. Very interesting times ahead for the PC/Mac space. 

    Also, good to know that AMD is still working with Apple. Now all they need to do is release the drivers for Big Navi in Big Sur... 
    edited January 13
  • Reply 2 of 23
    22july2013 Posts: 2,521 member
    Intel's new offering is fine and good, but it also takes a lot of work from the OS and the apps to make this work properly. It's not even clear yet that the OS developers for Intel CPUs (e.g., Windows, Linux, [potentially even an upcoming Apple OS for Intel], etc.) care to use this feature. But if they do, it will take time to implement.

    To work best it also takes a little cooperation from application developers, because there are hooks in the OS's APIs which allow apps to indicate whether they prefer to run on the high-performance or high-efficiency cores. (I remember seeing a description of this API for macOS in a WWDC video, but I couldn't find a link for it to include here.)
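
    The kind of core-preference hook described above does exist: on macOS it takes the form of quality-of-service classes (the Dispatch/pthread QoS API covered at WWDC). As a rough sketch of the same idea on Linux, where a process can pin itself to chosen cores outright rather than hint at a QoS class, something like the following works (the performance-core numbering is a hypothetical assumption for illustration, not any real chip's topology):

    ```python
    import os

    # Hypothetical split: assume cores 0-3 are the high-performance cores
    # on this machine. Real P-core/E-core numbering varies by chip; this
    # set is an illustration only.
    PERF_CORES = {0, 1, 2, 3}

    def prefer_performance_cores():
        """Keep this process on the (assumed) fast cores where possible.

        Linux exposes a hard pin via sched_setaffinity(); macOS instead
        offers a softer hint (QoS classes) that the kernel may override.
        """
        available = os.sched_getaffinity(0)   # cores this process may use
        wanted = PERF_CORES & available       # never request absent cores
        if wanted:
            os.sched_setaffinity(0, wanted)
        return os.sched_getaffinity(0)

    if __name__ == "__main__":
        print(sorted(prefer_performance_cores()))
    ```

    macOS deliberately offers only the softer QoS-style hint, since the kernel reserves the right to migrate work between core types as thermal and power conditions change.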

    As far as I know, so far only Apple provides an OS for M1-based Macs. And it obviously includes all the OS features, including the API hooks, to make this work for the two types of cores in M1 chips. But if another OS comes out for M1-based Macs, there's no guarantee that the new OS will provide the same features as macOS in terms of either the OS or the apps correctly choosing between the high-efficiency and high-performance cores.

    [edited to remove political commentary: RadarTheKat, moderator]
    edited January 13
  • Reply 3 of 23
    While Alder Lake is similar in concept to the M1, I don't think it's fair to suggest that Intel's Alder Lake design is a knee-jerk reaction to (or plagiarism of) the M1. Silicon design takes years, from conceptual design to manufacturing to baking in support in the operating system. 

    Alder Lake is an iteration on the hybrid Lakefield design, released before the M1 even made it to market, which had one big Sunny Cove core and smaller Atom cores. One could argue that Lakefield was just a test run to get working silicon out there, and to enable Intel to work with Microsoft (and possibly Linux) on updating the OS kernel scheduler to efficiently schedule processes and threads on a hybrid x86 CPU. Alder Lake was conceptualized long before it entered high-volume manufacturing in 2021. 

    Of course, with Apple, the M1 is indeed amazing, and the competitive advantage Apple has over Intel is that its vertical integration allows it to design and co-optimize the kernel of the OS in conjunction with the silicon design. Also, TSMC is crushing it right now in terms of manufacturing on a leading-edge node, whereas Intel is only now starting to ramp 10 nm. AMD doesn't seem to think hybrid x86 is all that relevant or necessary right now, so we will have to see how Alder Lake performs. It seems impressive; it will have IPC improvements above Tiger Lake, which in itself isn't too shabby. Very interesting times ahead for the PC/Mac space. 

    Also, good to know that AMD is still working with Apple. Now all they need to do is release the drivers for Big Navi in Big Sur... 
    You're right, it's not a knee-jerk reaction to the M1. It's a long-term reaction to the A-series (M1) chips that have, over the past decade, caught up with and now surpassed PC chips. They all saw it coming, yet they are behind, and it's the x86 instruction set that is crippling them.
  • Reply 4 of 23
    You're right, it's not a knee-jerk reaction to the M1. It's a long-term reaction to the A-series (M1) chips that have, over the past decade, caught up with and now surpassed PC chips. They all saw it coming, yet they are behind, and it's the x86 instruction set that is crippling them.
    If and when x86 chips are produced on a smaller manufacturing node (e.g., 3 nm) we'll see whether it's the x86 ISA holding the platform back or whether a newer manufacturing node helps increase efficiency and performance per watt. Either way, ARM's efficiency is not in dispute. It's very impressive what Apple has done. 
    edited January 13
  • Reply 5 of 23
    Sigh. Even Ars Technica claims that Intel's Alder Lake is a response to, was influenced by, or is a ripoff of the M1. Shows you how the media just puts a pro-Apple spin on everything. In reality Intel is merely adopting the big.LITTLE architecture that has been standard in ARM/RISC designs of more than 4 cores for years. For example, the Qualcomm 808 used a big.LITTLE design way back in 2015. Back then, the Apple A9 was still a dual-core chip with two identical high-performance cores.

    Also, as someone above truthfully stated, wait until we get a 10nm octacore chip on Alder Lake architecture, which should occur in 2022. That is when we will FINALLY be able to get an apples to apples comparison (pun not intended) on performance and power.

    The Alder Lake CPUs will be primarily used in 2-in-1 ultraportables that will compete with the MacBook Air (except they will have touchscreens, 2-in-1 designs and some will even have 5G) and in gaming laptops. So if you want a 2-in-1 or to play Steam games, you aren't going to buy an M1 MacBook Air no matter how much faster it runs or how little power it uses, because that M1 MacBook Air won't have the features, functionality or ability to run the software that motivates people to buy the Windows and Chromebook competition (Chromebooks will gain the ability to run Steam this year). When you add the fact that the Alder Lake competition will also be several hundred dollars cheaper than an entry-level MBA, that's one more reason why the M1 won't change Apple's 7%-8% market share. 
  • Reply 6 of 23
    Light years behind ARM and decades behind Apple's designs? Sure, buddy. For the millionth time, the problem is not Intel's ability to design a 10nm, 7nm or 5nm chip but their inability to manufacture one. Intel could sign a contract to have Samsung manufacture their 5nm chips tomorrow, and those chips would be available to purchase in 6 months. Intel won't do this because their chips still have market share ranging from 50% (desktop) to 75% (server/cloud). Also, Intel wants consumers to buy the 14nm and 10nm chips that they still have in inventory and are still manufacturing, meaning that Intel won't be in any real rush to get to 7nm and 5nm until about 2023 anyway. Otherwise they will lose more money on their current chips than they will make on the new ones.

    Bottom line: more Chromebooks with Intel chips are going to sell in 2021 than Macs with M1 (or M1X or M2) chips. Apple isn't Intel's competition, and those of you who insist on believing otherwise are just setting yourselves up for disappointment. My guess is that in 2 years all of these columns on the coming Apple dominance are going to just quietly disappear, just like the ones from 2012 on how iPads were going to crush Windows, from 2014 on how Android, Google and Samsung were on their last legs, and from 2016 on how the iPad had crushed console gaming and Apple was going to buy what was left of Nintendo in order to exclusively feature them on iPads and Apple TV.
    edited January 13
  • Reply 7 of 23
    Tip for the reporters: You should ask the same question of NVIDIA. The answer may surprise you.
  • Reply 8 of 23
    It goes without saying that Intel has stumbled. They are in a less-than-optimal place because they have not kept up. You can appreciate them, but it's a tad silly to pretend they are leading anything at this point; they even admit that. In 2 years everything will be different. In 2 years, if Intel has only caught up to what's happening now, they'll be further behind.


    cloudguy said:
    Light years behind ARM and decades behind Apple's designs? Sure, buddy. For the millionth time, the problem is not Intel's ability to design a 10nm, 7nm or 5nm chip but their inability to manufacture one. Intel could sign a contract to have Samsung manufacture their 5nm chips tomorrow, and those chips would be available to purchase in 6 months. Intel won't do this because their chips still have market share ranging from 50% (desktop) to 75% (server/cloud). Also, Intel wants consumers to buy the 14nm and 10nm chips that they still have in inventory and are still manufacturing, meaning that Intel won't be in any real rush to get to 7nm and 5nm until about 2023 anyway. Otherwise they will lose more money on their current chips than they will make on the new ones.

    Bottom line: more Chromebooks with Intel chips are going to sell in 2021 than Macs with M1 (or M1X or M2) chips. Apple isn't Intel's competition, and those of you who insist on believing otherwise are just setting yourselves up for disappointment. My guess is that in 2 years all of these columns on the coming Apple dominance are going to just quietly disappear, just like the ones from 2012 on how iPads were going to crush Windows, from 2014 on how Android, Google and Samsung were on their last legs, and from 2016 on how the iPad had crushed console gaming and Apple was going to buy what was left of Nintendo in order to exclusively feature them on iPads and Apple TV.

  • Reply 9 of 23
    elijahg Posts: 2,332 member
    cloudguy said:
    Bottom line: more Chromebooks with Intel chips are going to sell in 2021 than Macs with M1 (or M1X or M2) chips. Apple isn't Intel's competition, and those of you who insist on believing otherwise are just setting yourselves up for disappointment. My guess is that in 2 years all of these columns on the coming Apple dominance are going to just quietly disappear, just like the ones from 2012 on how iPads were going to crush Windows, from 2014 on how Android, Google and Samsung were on their last legs, and from 2016 on how the iPad had crushed console gaming and Apple was going to buy what was left of Nintendo in order to exclusively feature them on iPads and Apple TV.
    It's almost like you've been reading DED's ridiculous articles. They seem to have pretty much disappeared nowadays.

    IMO Apple's share will actually fall eventually, as switchers like having Windows availability as a backup. If Windows support for ARM improves, then perhaps things will be different. A fast CPU is great, but no good if you can't run the software you need on it, which is the exact issue Apple had in the mid-to-late 90s with Classic Mac OS. The CPUs were markedly quicker than Intel's offerings, but the software support was so crap that few outside of a hardcore base used Macs. Intel had a similar problem when trying to push MS and manufacturers to Itanium in the mid-2000s, which is why they were late to the party with x64. The world needs to bite the bullet and switch architectures; IPv6 is a similar switch - it's happening, slowly - and the same would be true here if there were a viable non-x86 architecture running Windows at a reasonable speed.
  • Reply 10 of 23
    elijahg Posts: 2,332 member

    Intel's new offering is fine and good, but it also takes a lot of work from the OS and the apps to make this work properly. It's not even clear yet that the OS developers for Intel CPUs (e.g., Windows, Linux, [potentially even an upcoming Apple OS for Intel], etc.) care to use this feature. But if they do, it will take time to implement.

    To work best it also takes a little cooperation from application developers, because there are hooks in the OS's APIs which allow apps to indicate whether they prefer to run on the high-performance or high-efficiency cores. (I remember seeing a description of this API for macOS in a WWDC video, but I couldn't find a link for it to include here.)

    As far as I know, so far only Apple provides an OS for M1-based Macs. And it obviously includes all the OS features, including the API hooks, to make this work for the two types of cores in M1 chips. But if another OS comes out for M1-based Macs, there's no guarantee that the new OS will provide the same features as macOS in terms of either the OS or the apps correctly choosing between the high-efficiency and high-performance cores.

    [edited to remove political commentary: RadarTheKat, moderator]
    The commonality of CPUs with low- and high-power cores will push OSes to support it natively, with little support from developers required. Much like the dual-GPU MacBooks, which auto-switched depending on GPU load.
  • Reply 11 of 23
    mcdave Posts: 1,760 member
    While Alder Lake is similar in concept to M1, I don’t think it’s fair to suggest that Intel Alder Lake design is a knee jerk reaction to (or plagiarism of M1). 
    Indeed, it’s more likely a response to the A-Series SoCs.
  • Reply 12 of 23
    mcdave Posts: 1,760 member
    cloudguy said:
    Sigh. Even Ars Technica claims that Intel's Alder Lake is a response to, was influenced by, or is a ripoff of the M1. Shows you how the media just puts a pro-Apple spin on everything. In reality Intel is merely adopting the big.LITTLE architecture that has been standard in ARM/RISC designs of more than 4 cores for years. For example, the Qualcomm 808 used a big.LITTLE design way back in 2015. Back then, the Apple A9 was still a dual-core chip with two identical high-performance cores.

    Also, as someone above truthfully stated, wait until we get a 10nm octacore chip on Alder Lake architecture, which should occur in 2022. That is when we will FINALLY be able to get an apples to apples comparison (pun not intended) on performance and power.
    I think you need to look back at the performance of the 808 and ask yourself if Intel really cared. My thinking is that they did not. Only Apple ever indicated the ARM ISA could be a threat.

    A quick look at the TDPs for Alder Lake shows Apple has little to worry about as these are usually unrealistic/fraudulent power consumption figures bearing little resemblance to the actual load. Once you can acknowledge this it’s easy to see why Intel hasn’t simply outsourced to a 3rd party fab; because the “Apple to Apples” node comparison won’t work well for them.

    Such a comparison is irrelevant anyway as Apple’s key advantage is delegating performance critical tasks to dedicated silicon such as 4K HEVC/ProRes video workflows on their entry-level products.

  • Reply 13 of 23
    mcdave Posts: 1,760 member
    elijahg said:
    IMO Apple's share will actually fall eventually, as switchers like having Windows availability as a backup. If Windows support for ARM improves, then perhaps things will be different.
    I’m unconvinced current Mac sales are driven by anything to do with Windows. Initially it may have served as a psychological safety net but eventually (as legacy apps were migrated to the web) I think most people realised what Mac users have known for a while - almost nobody needs Windows.  I’m unsure the Parallels/Fusion user base was ever that comparable to the macOS install base.
    Apparently, Apple thinks so too. 
  • Reply 14 of 23
    mcdave Posts: 1,760 member
    I’m intrigued by the GPU comment. There’s nothing in software to suggest any future for AMD GPUs on the Mac except on the Mac Pro or a last gasp with the 6000 series on the final Intel 16” MBPs.

    A GPU core-bump for the M1 should match any AMD mobile offering, and I think Apple are more than capable of building high-performance dGPUs to complement their iGPUs for round-trippable workloads.
  • Reply 15 of 23
    wizard69 Posts: 13,377 member
    cloudguy said:
    Sigh. Even Ars Technica claims that Intel's Alder Lake is a response to, was influenced by, or is a ripoff of the M1. Shows you how the media just puts a pro-Apple spin on everything. In reality Intel is merely adopting the big.LITTLE architecture that has been standard in ARM/RISC designs of more than 4 cores for years. For example, the Qualcomm 808 used a big.LITTLE design way back in 2015. Back then, the Apple A9 was still a dual-core chip with two identical high-performance cores.

    Also, as someone above truthfully stated, wait until we get a 10nm octacore chip on Alder Lake architecture, which should occur in 2022. That is when we will FINALLY be able to get an apples to apples comparison (pun not intended) on performance and power.

    The Alder Lake CPUs will be primarily used in 2-in-1 ultraportables that will compete with the MacBook Air (except they will have touchscreens, 2-in-1 designs and some will even have 5G) and in gaming laptops. So if you want a 2-in-1 or to play Steam games, you aren't going to buy an M1 MacBook Air no matter how much faster it runs or how little power it uses, because that M1 MacBook Air won't have the features, functionality or ability to run the software that motivates people to buy the Windows and Chromebook competition (Chromebooks will gain the ability to run Steam this year). When you add the fact that the Alder Lake competition will also be several hundred dollars cheaper than an entry-level MBA, that's one more reason why the M1 won't change Apple's 7%-8% market share. 
    Thank you for saying this! big.LITTLE was never an Apple thing.
  • Reply 16 of 23
    I'm simply not sure whether AI can report on x86 topics; it's really hard to read every time, for the reasons listed in the comments above and more.

    There is more to x86 than being some copycat. In fact, the A/M-series cores have been shown to closely follow, on a microarchitectural level, principles laid out by x86 cores in the past. I just believe it would not be convenient to write about it.
    edited January 14
  • Reply 17 of 23
    elijahg Posts: 2,332 member
    mcdave said:
    elijahg said:
    IMO Apple's share will actually fall eventually, as switchers like having Windows availability as a backup. If Windows support for ARM improves, then perhaps things will be different.
    I’m unconvinced current Mac sales are driven by anything to do with Windows. Initially it may have served as a psychological safety net but eventually (as legacy apps were migrated to the web) I think most people realised what Mac users have known for a while - almost nobody needs Windows.  I’m unsure the Parallels/Fusion user base was ever that comparable to the macOS install base.
    Apparently, Apple thinks so too. 
    I know a number of people who use Macs with Windows, myself included. I used to run a network of 100 Macs, and we used to dual-boot or use Fusion. We wouldn't have bought Macs if we couldn't do that. I'm quite disheartened to personally know more people switching away from Macs nowadays than to them. 

    Apple doesn't give two shits if it snubs a proportion of their user base, nor how many people it affects. With the Mac being barely a blip on their profits, they don't really have to care. It tends to be the power users they snub too, the same ones that have traditionally been loyal to Apple and followed them through their dark years in the 90s. They do it all the time (iMovie 7, Final Cut X, Shake, Aperture, Mac Pro, twice, iMac Pro, VR, iPhone headphone jack, to name a few).

    Steve’s Apple was about making the best product for the users (including compatibility), Cook’s Apple is about making as much money as possible for his investors. That and using Apple as a political platform are his only concerns.
    edited January 14
  • Reply 18 of 23
    cloudguy said:
    Sigh. Even Ars Technica claims that Intel's Alder Lake is a response to, was influenced by, or is a ripoff of the M1. Shows you how the media just puts a pro-Apple spin on everything. In reality Intel is merely adopting the big.LITTLE architecture that has been standard in ARM/RISC designs of more than 4 cores for years. For example, the Qualcomm 808 used a big.LITTLE design way back in 2015. Back then, the Apple A9 was still a dual-core chip with two identical high-performance cores.
    True. The big.LITTLE concept wasn't invented by Apple. It wasn't invented by Qualcomm either; ARM pioneered this approach 10 years ago. What companies like Apple have done is intelligently add the logic into the OS to decide when and how to use it. That doesn't come automatically with the hardware; Windows needs to learn how to deal with this properly, etc. Apple didn't get it right the first time out of the gate either.

    Where I would give Apple credit is for pushing this approach into the PC market first. To date, the prevailing thought in the PC market was that homogeneous multiprocessing was the only way to go for PCs. Apple has demonstrated how much their approach improves overall battery life. Welcome to 2011, Intel.
    Also, as someone above truthfully stated, wait until we get a 10nm octacore chip on Alder Lake architecture, which should occur in 2022. That is when we will FINALLY be able to get an apples to apples comparison (pun not intended) on performance and power.
    If you think 2022 Alder Lake Intel chips are going to be competitive with 2022 Apple Silicon, then I have a bridge you might be interested in buying.
    elijahg said:
    IMO Apple's share will actually fall eventually, as switchers like having Windows availability as a backup. If Windows support for ARM improves, then perhaps things will be different. A fast CPU is great, but no good if you can't run software you need on it, which is the exact issue Apple had in the mid to late 90's with Classic Mac OS. 
    I'd take that bet. Dual-booting into Windows is a novelty that lots of people talk about but few people actually do. People will stay with the Mac, and its market share may even increase, provided there are advantages for doing so. There has always been an OS advantage, which is what attracts Mac users in the first place. Now the hardware advantage, particularly for laptops (which sell the most), is so much greater with Apple Silicon that Apple's share will only grow, not decline as you suggest.

    mcdave said:
    I’m unconvinced current Mac sales are driven by anything to do with Windows. Initially it may have served as a psychological safety net but eventually (as legacy apps were migrated to the web) I think most people realised what Mac users have known for a while - almost nobody needs Windows.  I’m unsure the Parallels/Fusion user base was ever that comparable to the macOS install base.
    Apparently, Apple thinks so too. 
    100% agreed.

    mcdave said:
    I’m intrigued by the GPU comment. There’s nothing in software to suggest any future for AMD GPUs on the Mac except on the Mac Pro or a last gasp with the 6000 series on the final Intel 16” MBPs.

    A GPU core-bump for the M1 should match any AMD mobile offering, and I think Apple are more than capable of building high-performance dGPUs to complement their iGPUs for round-trippable workloads.
    My initial thought with Apple's announcement of the move to its own silicon was that they'd still use AMD for their top-end Pro line. After thinking about it some more, I've come full circle and don't expect they will rely on AMD at all going forward. GPUs are inherently parallel by design, and when you rewatch Apple's videos, including WWDC, etc., you see that Apple is very interested in demonstrating how scalable their technology is. No, it seems to me that Apple wants to take all of this in-house. This way, they can maximize their optimization for one brand of video cards, etc. Also notice how there is no provision for eGPUs with the new Apple Silicon machines. I think this very clearly signals Apple's intentions. 
  • Reply 19 of 23
    crowley Posts: 8,755 member
    techconc said:
    mcdave said:
    I’m intrigued by the GPU comment. There’s nothing in software to suggest any future for AMD GPUs on the Mac except on the Mac Pro or a last gasp with the 6000 series on the final Intel 16” MBPs.

    A GPU core-bump for the M1 should match any AMD mobile offering, and I think Apple are more than capable of building high-performance dGPUs to complement their iGPUs for round-trippable workloads.
    My initial thought with Apple's announcement of the move to its own silicon was that they'd still use AMD for their top-end Pro line. After thinking about it some more, I've come full circle and don't expect they will rely on AMD at all going forward. GPUs are inherently parallel by design, and when you rewatch Apple's videos, including WWDC, etc., you see that Apple is very interested in demonstrating how scalable their technology is. No, it seems to me that Apple wants to take all of this in-house. This way, they can maximize their optimization for one brand of video cards, etc. Also notice how there is no provision for eGPUs with the new Apple Silicon machines. I think this very clearly signals Apple's intentions. 
    I'm not thrilled about it, but I think you're probably right.  I wonder if the development of the MPX module and the Afterburner card were partially motivated by Apple wanting to dip their toe in and test their prowess in the high end PCIe video and graphics hardware space.

    I'm not thrilled about it as I really like my current setup of a notebook for mobility, and an eGPU for home working/playing on the same machine. Though they may well be very capable, I can't imagine the GPUs in Apple's mobile Macs are going to be able to match what can be done with a full-size graphics card in an enclosure. Plus, upgradeability is nice too. Hopefully I'll be proved wrong.
  • Reply 20 of 23
    crowley said:
    I'm not thrilled about it, but I think you're probably right.  I wonder if the development of the MPX module and the Afterburner card were partially motivated by Apple wanting to dip their toe in and test their prowess in the high end PCIe video and graphics hardware space.

    I'm not thrilled about it as I really like my current setup of a notebook for mobility, and an eGPU for home working/playing on the same machine. Though they may well be very capable, I can't imagine the GPUs in Apple's mobile Macs are going to be able to match what can be done with a full-size graphics card in an enclosure. Plus, upgradeability is nice too. Hopefully I'll be proved wrong.
    Yeah, the "Pro" solution certainly has a lot of questions to be answered. There is a lot you can do with an SoC, but there are also limits to that as well.  I don't think their Mac Pro replacement will just have a bigger SoC.  They are going to need some level of modularity if they're going to address that market.  Even from a memory perspective, you're not going to get 1.5 TB on a SoC either.  At some point, they're going to have to show us an architecture that allows for modular GPU and RAM.  As for the afterburner, they could probably just bake in some form of ProRES hardware encoder/decoder into their Pro level SoCs.  That's effectively all they are used for anyway.

    The bottom line is that the Mac Pro already has an existing bar that is set. If Apple is going to address this market at all, it's going to have to rise above that bar. Still, I'm optimistic about what they will bring to market, if a bit curious as to how they are going to do it.