AMD launches RX 5000-series graphics cards with 7nm Navi GPUs

Posted in General Discussion
AMD has teased its next generation of graphics cards, the Navi-based RX 5000 series, promising performance and power-efficiency improvements over its Graphics Core Next offerings, with the first card in the range, the RX 5700, set to arrive this July.




Announced during AMD's Computex keynote on Monday, the RX 5000 series uses a new architecture called Radeon DNA (RDNA), which will sit alongside the more established Graphics Core Next architecture. The architecture change debuts in a new GPU called Navi, which is said to offer considerable improvements over its predecessors.

The compute units within Navi have been refined from the versions used for Vega, to make each as efficient as possible. In theory, a Navi compute unit will be 25% faster than a Vega equivalent per clock, with a 50% improvement per watt of power used, in part due to AMD's switch to using TSMC's 7-nanometer production process.

At the same time, AMD has performed more refinement of its graphics pipeline to specifically cope with high clock speeds, along with introducing a multi-level cache to minimize bottlenecks.

To accompany the new architecture, AMD is also adding support for GDDR6 memory to the platform, further improving the memory bandwidth from that offered by GDDR5-based cards. Lastly, the GPU supports PCIe 4.0, offering the promise of increased bandwidth on systems that support that connectivity.

AMD's first card using the Navi GPU will be the RX 5700, launching in July. Details about the card were not provided, but more information will apparently be offered during E3 on June 10.

While AMD is one of the two main companies worth watching in the graphics card industry, it is largely the only one of interest for Mac users. Modern Nvidia GPUs cannot be used on macOS without considerable effort, as Apple has declined to approve Nvidia's drivers, effectively making AMD's cards the default option.

Comments

  • Reply 1 of 30
    macronin Posts: 1,174 member
    Hopefully we get some advance notice about Navi at WWDC, while Tim is previewing the modular Mac Pro...!
  • Reply 2 of 30
    elijahg Posts: 1,007 member
    The chance of Apple using them in anything in the near future? Nil. They use the Pro Vega 48 (as an option) in the top-end iMac, whilst offering the 56 and 64 in the iMac Pro, essentially the same chips but progressively more locked down. Probably an artificial limitation to differentiate the iMac Pro. I bought a 2019 27" iMac recently, and I would much rather have had an Nvidia GPU than the gimped Vega 48 I have now.

    It'll also be interesting to see if these new AMD chips are anywhere near as fast as Nvidia's current chips, which are much faster and run much cooler than AMD's, and have been for a number of years. Apple's ridiculous ongoing spats with the only two viable GPU manufacturers are inane and childish; ever since about 2003, when ATI accidentally leaked details of a new PowerBook, Apple has been back and forth refusing to deal with one or the other. I also have no idea why Apple is refusing to sign Nvidia's drivers, but it's a pretty low blow, especially since Nvidia supports cards going back 7 or 8 years and often releases Mac versions of its PCIe cards, when AMD doesn't.

    Yet again Apple's political position with another company is harming its customers.
    edited May 27
  • Reply 3 of 30
    kruegdude Posts: 340 member
    It would be interesting to really know what’s keeping the NVIDIA drivers from being approved. I’ve read all the linked articles, and the articles linked from those, and it’s a lot of inference: sources in engineering say they don’t know who in the upper levels of Apple management made the decision to exclude the drivers, or why, plus some historical information about the problems Apple has had with its use of NVIDIA chips. There’s also a vague statement from NVIDIA saying that Apple is not approving their drivers, but that doesn’t say anything.
  • Reply 4 of 30
    Mike Wuerthele Posts: 4,894 administrator
    elijahg said:
    The chance of Apple using them in anything in the near future? Nil. They use the Pro Vega 48 (as an option) in the top-end iMac, whilst offering the 56 and 64 in the iMac Pro, essentially the same chips but progressively more locked down. Probably an artificial limitation to differentiate the iMac Pro. I bought a 2019 27" iMac recently, and I would much rather have had an Nvidia GPU than the gimped Vega 48 I have now.

    It'll also be interesting to see if these new AMD chips are anywhere near as fast as Nvidia's current chips, which are much faster and run much cooler than AMD's, and have been for a number of years. Apple's ridiculous ongoing spats with the only two viable GPU manufacturers are inane and childish; ever since about 2003, when ATI accidentally leaked details of a new PowerBook, Apple has been back and forth refusing to deal with one or the other. I also have no idea why Apple is refusing to sign Nvidia's drivers, but it's a pretty low blow, especially since Nvidia supports cards going back 7 or 8 years and often releases Mac versions of its PCIe cards, when AMD doesn't.

    Yet again Apple's political position with another company is harming its customers.
    FWIW, macOS 10.14.5 has native Radeon VII drivers.
  • Reply 5 of 30
    ericthehalfbee Posts: 4,097 member
    macronin said:
    Hopefully we get some advance notice about Navi at WWDC, while Tim is previewing the modular Mac Pro...!

    The timing of this seems perfect. If they’re launching in July then I’m sure Apple could have received samples by now for testing.

    I think it would be good for AMD to appear at WWDC talking about some new pro version of Navi in the new Mac Pro.
  • Reply 6 of 30
    MplsP Posts: 1,674 member
    Interesting that AMD is using a 7nm process while Intel is struggling to get a 10nm process for its processors. 
  • Reply 7 of 30
    The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD." It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
  • Reply 8 of 30
    Mike Wuerthele Posts: 4,894 administrator
    The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD." It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
    On what application?

    I agree, in 2021 or so, this may be a big deal. It isn't today.
  • Reply 9 of 30
    macronin Posts: 1,174 member
    MplsP said:
    Interesting that AMD is using a 7nm process while Intel is struggling to get a 10nm process for its processors. 
    All the more reason for Apple to go AMD (Threadripper 3) for the new modular Mac Pro...

    As for the whole Thunderbolt 3 thing, both of the new X570 AM4 motherboards listed on the ASRock website are tagged as "Thunderbolt 3" ready, and ASRock offers a TB3 AIC...

    I really think Apple has been waiting on TB3 support for the AMD platform(s) & Threadripper 3 for the modular Mac Pro...

    Another week & we may know...! ;^p
  • Reply 10 of 30
    elijahg Posts: 1,007 member
    elijahg said:
    The chance of Apple using them in anything in the near future? Nil. They use the Pro Vega 48 (as an option) in the top-end iMac, whilst offering the 56 and 64 in the iMac Pro, essentially the same chips but progressively more locked down. Probably an artificial limitation to differentiate the iMac Pro. I bought a 2019 27" iMac recently, and I would much rather have had an Nvidia GPU than the gimped Vega 48 I have now.

    It'll also be interesting to see if these new AMD chips are anywhere near as fast as Nvidia's current chips, which are much faster and run much cooler than AMD's, and have been for a number of years. Apple's ridiculous ongoing spats with the only two viable GPU manufacturers are inane and childish; ever since about 2003, when ATI accidentally leaked details of a new PowerBook, Apple has been back and forth refusing to deal with one or the other. I also have no idea why Apple is refusing to sign Nvidia's drivers, but it's a pretty low blow, especially since Nvidia supports cards going back 7 or 8 years and often releases Mac versions of its PCIe cards, when AMD doesn't.

    Yet again Apple's political position with another company is harming its customers.
    FWIW, macOS 10.14.5 has native Radeon VII drivers.
    Interesting, maybe it'll be sooner rather than later then. Perhaps in the fabled Mac Pro. Still have my doubts about AMD vs Nvidia though.

    MplsP said:
    Interesting that AMD is using a 7nm process while Intel is struggling to get a 10nm process for its processors. 
    CPUs are much, much more difficult to die-shrink than GPUs. GPUs are relatively simple compared to CPUs, so optimising them for a die shrink is far easier.
  • Reply 11 of 30
    elijahg Posts: 1,007 member

    macronin said:
    MplsP said:
    Interesting that AMD is using a 7nm process while Intel is struggling to get a 10nm process for its processors. 
    All the more reason for Apple to go AMD (Threadripper 3) for the new modular Mac Pro...

    As for the whole Thunderbolt 3 thing, both of the new X570 AM4 motherboards listed on the ASRock website are tagged as "Thunderbolt 3" ready, and ASRock offers a TB3 AIC...

    I really think Apple has been waiting on TB3 support for the AMD platform(s) & Threadripper 3 for the modular Mac Pro...

    Another week & we may know...! ;^p
    I think a switch to AMD CPUs would be very unlikely at this stage. I'm not sure if AMD could provide the numbers that Apple needs, and even if it could, Apple's apparent path toward ARM CPUs would make a switch to AMD now an awkward step and a waste of time for developers to optimise for. That said, 95% of things will just run on AMD CPUs when coming from Intel's x86.
  • Reply 12 of 30
    elijahg Posts: 1,007 member

    The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD." It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
    On what application?

    I agree, in 2021 or so, this may be a big deal. It isn't today.
    It's not just ray tracing; for general-purpose graphics, Nvidia is dominating. The top 30 benchmarks on videocardbenchmark.net have AMD appearing just 5 times, and just once in the top 20. Even then, the Radeon VII is still miles behind the GeForce RTX 2080 Ti.
  • Reply 13 of 30
    Mike Wuerthele Posts: 4,894 administrator
    elijahg said:

    The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD." It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
    On what application?

    I agree, in 2021 or so, this may be a big deal. It isn't today.
    It's not just ray tracing; for general-purpose graphics, Nvidia is dominating. The top 30 benchmarks on videocardbenchmark.net have AMD appearing just 5 times, and just once in the top 20. Even then, the Radeon VII is still miles behind the GeForce RTX 2080 Ti.
    I'm on record saying that the lack of Nvidia is a problem.

    I'm aware of the general-purpose results, but this varies a great deal with workload -- and it's also not what I was talking about. Ray tracing has loads of potential, but it just isn't a thing right now, and may not be any time soon.
  • Reply 14 of 30
    elijahg Posts: 1,007 member
    elijahg said:

    The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD." It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
    On what application?

    I agree, in 2021 or so, this may be a big deal. It isn't today.
    It's not just ray tracing; for general-purpose graphics, Nvidia is dominating. The top 30 benchmarks on videocardbenchmark.net have AMD appearing just 5 times, and just once in the top 20. Even then, the Radeon VII is still miles behind the GeForce RTX 2080 Ti.
    I'm on record saying that the lack of Nvidia is a problem.

    I'm aware of the general-purpose results, but this varies a great deal with workload -- and it's also not what I was talking about. Ray tracing has loads of potential, but it just isn't a thing right now, and may not be any time soon.
    I didn't mean to insinuate you weren't, and that is absolutely true. AMD's OpenCL support and speed, as I'm sure you're aware, have historically been very good, which is partly why Apple used two AMD cards in the Trashcan. Seems it was a bit misjudged though; despite it appearing at the time that we were on the cusp of a compute boom, it turned out that GPU compute is really hard to utilise well in general-purpose computing. Of course, the CUDA vs OpenCL incompatibility didn't help either.

    The Radeon Pro cards were really built for compute workloads, so they aren't great at more general-purpose workloads, as shown above. Unfortunately, as AMD also seems to have misjudged the market, the Radeon Pro cards now have big chunks of silicon dedicated to compute processing, which bogs down general-purpose performance. Perhaps this is partly why Nvidia refuses to support OpenCL, as the hardware support slows the rest of the GPU, whereas Nvidia's CUDA doesn't. Navi may resolve this, possibly at the expense of compute performance despite AMD's claims; only real-life results will show.

    But yes, I don't think hardware ray tracing is a particularly good metric to compare by right now.
  • Reply 15 of 30
    esummers Posts: 910 member
    elijahg said:
    The chance of Apple using them in anything in the near future? Nil. They use the Pro Vega 48 (as an option) in the top-end iMac, whilst offering the 56 and 64 in the iMac Pro, essentially the same chips but progressively more locked down. Probably an artificial limitation to differentiate the iMac Pro. I bought a 2019 27" iMac recently, and I would much rather have had an Nvidia GPU than the gimped Vega 48 I have now.

    It'll also be interesting to see if these new AMD chips are anywhere near as fast as Nvidia's current chips, which are much faster and run much cooler than AMD's, and have been for a number of years. Apple's ridiculous ongoing spats with the only two viable GPU manufacturers are inane and childish; ever since about 2003, when ATI accidentally leaked details of a new PowerBook, Apple has been back and forth refusing to deal with one or the other. I also have no idea why Apple is refusing to sign Nvidia's drivers, but it's a pretty low blow, especially since Nvidia supports cards going back 7 or 8 years and often releases Mac versions of its PCIe cards, when AMD doesn't.

    Yet again Apple's political position with another company is harming its customers.
    AMD chips are better optimized for compute, so they may be better for some workflows. The latest AMD Vega can go head-to-head in compute tasks with an Nvidia Tesla (the Nvidia chip aimed at compute). This could be better for some creative apps like Premiere, Photoshop, or FCP X. This is in part because AMD decided to create one chip for both the datacenter and gaming: gaming performance and realtime graphics are sacrificed somewhat for better compute. I think AMD may also be a closer fit to Apple's chips, making them a better Metal API target.

    It is unfortunate that Nvidia has been shut out, though. Its chips are better for some workflows, such as virtual reality or gaming. I suspect that if AMD's future 7nm Vega 2 is able to compete head-to-head with Nvidia's realtime performance, then AMD is actually way ahead of Nvidia, since AMD just doesn't create a chip with the same target audience. You would think they would try to make their Navi architecture more suited for realtime graphics and less compute-oriented, but that doesn't seem to be the case. AMD might not have the resources to focus on chip designs that are radically different from each other, and it needs its cards to do well with compute since it makes so much money in the datacenter. Surprisingly, they still manage to win the gaming console contracts. Maybe their compute-oriented design works better when integrated with a CPU on the same chip.
    edited May 27
  • Reply 16 of 30
    macxpress Posts: 4,979 member
    Does NVIDIA even make cards/drivers that support Metal? If not, that's a major issue, as Apple is moving more toward its own Metal graphics engine. In fact, it's required nowadays as of Mojave. I thought I remembered reading somewhere that NVIDIA doesn't support Metal at the moment.
  • Reply 17 of 30
    esummers Posts: 910 member

    elijahg said:
    I didn't mean to insinuate you weren't, and that is absolutely true. AMD's OpenCL support and speed, as I'm sure you're aware, have historically been very good, which is partly why Apple used two AMD cards in the Trashcan. Seems it was a bit misjudged though; despite it appearing at the time that we were on the cusp of a compute boom, it turned out that GPU compute is really hard to utilise well in general-purpose computing. Of course, the CUDA vs OpenCL incompatibility didn't help either.

    The Radeon Pro cards were really built for compute workloads, so they aren't great at more general-purpose workloads, as shown above. Unfortunately, as AMD also seems to have misjudged the market, the Radeon Pro cards now have big chunks of silicon dedicated to compute processing, which bogs down general-purpose performance. Perhaps this is partly why Nvidia refuses to support OpenCL, as the hardware support slows the rest of the GPU, whereas Nvidia's CUDA doesn't. Navi may resolve this, possibly at the expense of compute performance despite AMD's claims; only real-life results will show.

    But yes, I don't think hardware ray tracing is a particularly good metric to compare by right now.
    I don't disagree, but it should be noted that OpenCL is on the way out. It all comes down to Metal compute performance now. AMD does a good job at this, just as it did with OpenCL. Arguably this is better for most Apple customers, since most creative apps are compute-heavy. AMD also functions well enough for most games and 3D applications. However, some 3D modelers may prefer higher realtime graphics performance to compute. Unfortunately, most of them have already switched to PC to take advantage of Nvidia and more frequent GPU upgrades, since upgradability is currently limited on Macs. Hopefully Apple will be able to win some of these users back with the new Mac Pro.
    edited May 27
  • Reply 18 of 30
    esummers Posts: 910 member
    macronin said:
    MplsP said:
    Interesting that AMD is using a 7nm process while Intel is struggling to get a 10nm process for its processors. 
    All the more reason for Apple to go AMD (Threadripper 3) for the new modular Mac Pro...

    As for the whole Thunderbolt 3 thing, both of the new X570 AM4 motherboards listed on the ASRock website are tagged as "Thunderbolt 3" ready, and ASRock offers a TB3 AIC...

    I really think Apple has been waiting on TB3 support for the AMD platform(s) & Threadripper 3 for the modular Mac Pro...

    Another week & we may know...! ;^p
    It would certainly be interesting. Technically, Thunderbolt recently became part of the new USB spec, so anyone can implement it now. I didn't realize AMD boards were already picking up support. Although this year would be too early, I've wondered if Apple might try a Threadripper-style design with ARM chips: take multiple 8-core ARM chips attached to an I/O chip (T series?) that handles memory and storage.
    edited May 27
  • Reply 19 of 30
    esummers Posts: 910 member

    elijahg said:
    elijahg said:
    The chance of Apple using them in anything in the near future? Nil. They use the Pro Vega 48 (as an option) in the top-end iMac, whilst offering the 56 and 64 in the iMac Pro, essentially the same chips but progressively more locked down. Probably an artificial limitation to differentiate the iMac Pro. I bought a 2019 27" iMac recently, and I would much rather have had an Nvidia GPU than the gimped Vega 48 I have now.

    It'll also be interesting to see if these new AMD chips are anywhere near as fast as Nvidia's current chips, which are much faster and run much cooler than AMD's, and have been for a number of years. Apple's ridiculous ongoing spats with the only two viable GPU manufacturers are inane and childish; ever since about 2003, when ATI accidentally leaked details of a new PowerBook, Apple has been back and forth refusing to deal with one or the other. I also have no idea why Apple is refusing to sign Nvidia's drivers, but it's a pretty low blow, especially since Nvidia supports cards going back 7 or 8 years and often releases Mac versions of its PCIe cards, when AMD doesn't.

    Yet again Apple's political position with another company is harming its customers.
    FWIW, macOS 10.14.5 has native Radeon VII drivers.
    Interesting, maybe it'll be sooner rather than later then. Perhaps in the fabled Mac Pro. Still have my doubts about AMD vs Nvidia though.

    MplsP said:
    Interesting that AMD is using a 7nm process while Intel is struggling to get a 10nm process for its processors. 
    CPUs are much, much more difficult to die-shrink than GPUs. GPUs are relatively simple compared to CPUs, so optimising them for a die shrink is far easier.

    AMD will be launching its 7nm Ryzen and Threadripper chips soon. However, that has more to do with its fab partners than anything AMD is doing on its own.
  • Reply 20 of 30
    esummers Posts: 910 member
    macxpress said:
    Does NVIDIA even make cards/drivers that support Metal? If not, that's a major issue, as Apple is moving more toward its own Metal graphics engine. In fact, it's required nowadays as of Mojave. I thought I remembered reading somewhere that NVIDIA doesn't support Metal at the moment.
    Nvidia has beta drivers. However, they are not very stable, and I don't think they have been updated in more than a year. I'm not exactly sure why they exist; they may have been hoping to go after the eGPU market, or trying to win back Apple's business.