AMD launches RX 5000-series graphics cards with 7nm Navi GPUs


Comments

  • Reply 21 of 30
Mike Wuerthele Posts: 6,553 administrator
    esummers said:
    macxpress said:
    Does NVIDIA even make cards/drivers that support Metal? If not then thats a major issue as Apple is going more toward their own Metal graphics engine. In fact, it's required nowadays as of Mojave. I thought I remember reading somewhere that NVIDIA doesn't support Metal at the moment. 
Nvidia has beta drivers.  However, they are not very stable, and I don't think they have been updated in more than a year.  I'm not exactly sure why they exist.  They may have been hoping to go after the eGPU market, or may have been trying to win back Apple's business.
    There are zero Nvidia drivers for Mojave.
  • Reply 22 of 30
The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD." It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
    On what application?

    I agree, in 2021 or so, this may be a big deal. It isn't today.
Nvidia in 3D is a must; GPU rendering with Nvidia has much broader support. For example, Arnold (a rendering engine) only supports GPU rendering on the latest Nvidia cards. I'm anxiously waiting to see what the new Mac Pro will support; it'll be ridiculous if only AMD cards are supported. If they do go this route I'll unfortunately have to build a Windows box, ugh, that would be sadness. 
  • Reply 23 of 30
wizard69 Posts: 13,377 member
    macronin said:
    MplsP said:
    Interesting that AMD is using a 7nm process while Intel is struggling to get a 10nm process for its processors. 
    All the more reason for Apple to go AMD (Threadripper 3) for the new modular Mac Pro...

    As for the whole Thunderbolt 3 thing, both of the new X570 AM4 motherboards listed on the ASRock website are tagged as "Thunderbolt 3" ready, and ASRock offers a TB3 AIC...

    I really think Apple has been waiting on TB3 support for the AMD platform(s) & Threadripper 3 for the modular Mac Pro...

    Another week & we may know...! ;^p
    A beautiful dream!   AMD in the last couple of years has made significant advances while Intel has gone backwards.  The playing field has been leveled for the most part.  

For one thing, I’d love to see more Zen 2-based hardware from Apple.  These are excellent processors for a number of Apple platforms.   With AMD’s range and pricing, Apple could lower prices and raise performance on a number of platforms.  
  • Reply 24 of 30
wizard69 Posts: 13,377 member
The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD." It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
    On what application?

    I agree, in 2021 or so, this may be a big deal. It isn't today.
Nvidia in 3D is a must; GPU rendering with Nvidia has much broader support. For example, Arnold (a rendering engine) only supports GPU rendering on the latest Nvidia cards. I'm anxiously waiting to see what the new Mac Pro will support; it'll be ridiculous if only AMD cards are supported. If they do go this route I'll unfortunately have to build a Windows box, ugh, that would be sadness. 
Sometimes software is the driving factor in a PC purchase.  However, I really don’t see Nvidia as a must here.  

Beyond that, why would you dismiss an AMD GPU before it even launches?     It really looks like AMD has done wonders with Navi.  This should result in across-the-board improvements on Apple hardware.   That is, if Apple ever uses this release.   Therein lies the big problem: Apple is simply too cheap with respect to chip selection on most of its platforms.  
  • Reply 25 of 30
macronin Posts: 1,174 member
    wizard69 said:
    macronin said:
    MplsP said:
    Interesting that AMD is using a 7nm process while Intel is struggling to get a 10nm process for its processors. 
    All the more reason for Apple to go AMD (Threadripper 3) for the new modular Mac Pro...

    As for the whole Thunderbolt 3 thing, both of the new X570 AM4 motherboards listed on the ASRock website are tagged as "Thunderbolt 3" ready, and ASRock offers a TB3 AIC...

    I really think Apple has been waiting on TB3 support for the AMD platform(s) & Threadripper 3 for the modular Mac Pro...

    Another week & we may know...! ;^p
    A beautiful dream!   AMD in the last couple of years has made significant advances while Intel has gone backwards.  The playing field has been leveled for the most part.  

For one thing, I’d love to see more Zen 2-based hardware from Apple.  These are excellent processors for a number of Apple platforms.   With AMD’s range and pricing, Apple could lower prices and raise performance on a number of platforms.  
    An AM4 socket-based Mac would be great, use the ASRock TB3 ITX board & you have the xMac...!
  • Reply 26 of 30
madan Posts: 103 member
    kruegdude said:
It would be interesting to really know what’s keeping the NVIDIA drivers from being approved. I’ve read all the linked articles, and the articles linked from those, and it’s a lot of inference based on engineering sources who say they don’t know who at the upper levels of Apple management made the decision to exclude the drivers, or why, plus some historical information about the problems Apple has had with its use of NVIDIA chips.  There’s also a vague statement from NVIDIA saying that Apple is not approving their drivers, but that doesn’t say anything. 
This is a myth.  If you frequent Nvidia's developer exchange forums, individuals have contacted both Apple and Nvidia concerning the drivers.  It turns out that Apple hasn't denied any drivers; Nvidia simply hasn't submitted them.  Only a small number of individuals are using eGPUs and, therefore, leveraging Nvidia cards with their Apple computers.  Even fewer are using hackintoshed systems.  The truth is that Nvidia doesn't give a damn about those users.

There is also an inherent animosity between Apple and Nvidia stemming from Jobs' decision to move away from Nvidia after the 8-series mobile GPU fiasco in the MacBook Pros.  Secondly, Apple has pretty much stated its intention to completely deprecate OpenCL and avoid other compute technologies like CUDA.  It's also ditching OpenGL completely, and it was sluggish to provide APIs that could allow Metal to interoperate with AMD's Mantle (not that it matters now, since Mantle is dead according to AMD themselves).  

What this all means is that for pure compute, as well as for graphics performance APIs, Apple is "eating its own dogfood".  It has gone whole hog into its own proprietary compute and Metal 2 graphics implementations, leaving CUDA out in the cold.  This is obviously a burr in Nvidia's side, worsening an already poisoned relationship.

However, the point stands.  If you want to blame someone for the lack of Nvidia drivers, blame Nvidia.  They haven't really submitted any to Apple that can support Turing.  This is eminently provable, since Kepler and Pascal cards were supported under High Sierra, but after the move to Mojave a bunch of cards stopped working properly and started crashing in productivity apps.


  • Reply 27 of 30
madan Posts: 103 member
    There is a second myth that AMD is doing terribly in performance.

    Some important tidbits.
A. Depending on the metric or tool, Nvidia may drastically outperform AMD.  However, you could just as easily select a tool like Luxmark, where an AMD Radeon 7 with Catalyst 19.41 obliterates a DOUBLE-PRICE 2080 Ti.

The truth is, on average, for most tasks, a Vega 56 with the newest drivers and a mild undervolt/overclock will match a reference 1080.  A Vega 64 with NO modifications is about 5% faster than a 1080, on average.  A Radeon 7 is about 5-7% slower than a 2080, for about 50-100 USD less.  However, the V7 has only had 2 full driver revisions, vs Turing's 7.  It's obvious that AMD has a lot of room to grow, especially for a GPU that has 60 CUs, 64 ROPs, an absolutely ridiculous 1 TB/s of bandwidth, and 16 GB of HBM2.

People need to stop pushing the propaganda that Nvidia is outperforming AMD on GPUs across the board.  The fact is, this is a lie.  Nvidia cards may, in games, on average, be 5-10% more powerful, and they may sip less power and be slightly quieter (10 dB), but they also, on average, cost about 5-10% more.  And the fact that, even today, GCN is better at compute, due to generalized CUs vs special-purpose Turing cores, means that Vega and even Polaris will eventually outperform similar Nvidia cards with drivers and time.
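The price-to-performance argument above can be sketched numerically. This is a minimal illustration, not a benchmark: the relative-performance and price figures are the approximate ones claimed in the comment (Radeon 7 roughly 6% slower than an RTX 2080, for roughly $100 less), normalized so the 2080 = 1.0.

```python
def perf_per_dollar(relative_perf, price_usd):
    """Relative performance points per dollar spent."""
    return relative_perf / price_usd

# Illustrative figures from the comment: Radeon 7 ~6% slower
# than an RTX 2080, at roughly $100 less.
radeon_7 = perf_per_dollar(0.94, 599)
rtx_2080 = perf_per_dollar(1.00, 699)

# Despite lower absolute performance, the cheaper card comes out
# ahead on performance per dollar.
assert radeon_7 > rtx_2080
```

The point of the sketch is only that a ~6% performance deficit can be outweighed by a ~14% price discount when you care about throughput per dollar rather than peak throughput.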

We have seen this over and over again.  The 6970? It started slower than the 780.  Now it practically trades blows with the 780 Ti.  The 7970?  The Radeon Fury X?
The Radeon 580? 5% slower than the 1060 at launch; now almost 10% faster in most games.
The Radeon 570?  Now faster than a 1650, despite the latter's "newer architecture", and definitively faster than a 1060 3GB, despite costing 25% *less at launch*.

The Vega 64 started 5-10% slower than a 1080, and now, without even an undervolt, a Vega 64 is on average about 5% faster in most titles.  And it absolutely destroys the 1080 on 80% of compute tasks, even with the 1080 having the benefit of CUDA vs OpenCL.

B.  For production, development and, yes, even gaming, sinking 800 dollars into a card with 8GB of GDDR6 is just dumb.  We've had this stupid circle jerk about graphics memory for YEARS now.  The result is that, while it usually doesn't matter the year a card comes out, it always, invariably, matters eventually.  Cards with 2-3 GB of RAM are useless today, even if their core architecture is comparable to midrange cards.  Their performance most certainly is *not*.  

Pointing to one synthetic benchmark is equally dumb.  There are already AMD fanboy YouTube channels getting 10-15% additional performance out of V7s, and that's with only second-generation drivers.  Navi this year is going to drop a Vega 48-type card, with better bandwidth and thermals, at a 1.25%+ IPC increase.  That puts it squarely in the Vega 64 category.  That type of card would be only 5% slower than a 2070.  If they can drop that card at 350-400 USD, Nvidia is dead in the water.

    And yes, I'd take a Radeon V7 over a 2080 any day of the year.  I know...I have one.

  • Reply 28 of 30
xsmi Posts: 138 member
The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD." It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
    On what application?

    I agree, in 2021 or so, this may be a big deal. It isn't today.
Games, Mike. Don't you know that the ONLY thing GPUs are good for is games? The things that AMD cards are better at (they're compute beasts) don't matter.
  • Reply 29 of 30
KITA Posts: 382 member
    xsmi said:
The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD." It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
    On what application?

    I agree, in 2021 or so, this may be a big deal. It isn't today.
Games, Mike. Don't you know that the ONLY thing GPUs are good for is games? The things that AMD cards are better at (they're compute beasts) don't matter.
    NVIDIA:

    The first software providers debuting acceleration with NVIDIA RTX technology in their 2019 releases include:

    • Adobe Dimension & Substance Designer
    • Autodesk Arnold & VRED
    • Chaos Group V-Ray
• Dassault Systèmes CATIA Live Rendering & SOLIDWORKS Visualize 2019
    • Daz 3D Daz Studio
    • Enscape Enscape3D
    • Epic Games Unreal Engine 4.22
    • ESI Group IC.IDO 13.0
    • Foundry Modo
    • Isotropix Clarisse 4.0
    • Luxion KeyShot 9
    • OTOY Octane 2019.2
• Pixar RenderMan XPU
    • Redshift Renderer 3.0
    • Siemens NX Ray Traced Studio
    • Unity Technologies Unity (2020)
  • Reply 30 of 30
madan Posts: 103 member
    KITA said:
    xsmi said:
The new Mac Pro announcement at WWDC this year could be a train wreck if Apple says "You can use any GPU you want, as long as it is made by AMD." It is hard to ignore the 5x speed increase that comes from hardware ray tracing on NVIDIA RTX GPUs.
    On what application?

    I agree, in 2021 or so, this may be a big deal. It isn't today.
Games, Mike. Don't you know that the ONLY thing GPUs are good for is games? The things that AMD cards are better at (they're compute beasts) don't matter.
    NVIDIA:

    The first software providers debuting acceleration with NVIDIA RTX technology in their 2019 releases include:

    • Adobe Dimension & Substance Designer
    • Autodesk Arnold & VRED
    • Chaos Group V-Ray
• Dassault Systèmes CATIA Live Rendering & SOLIDWORKS Visualize 2019
    • Daz 3D Daz Studio
    • Enscape Enscape3D
    • Epic Games Unreal Engine 4.22
    • ESI Group IC.IDO 13.0
    • Foundry Modo
    • Isotropix Clarisse 4.0
    • Luxion KeyShot 9
    • OTOY Octane 2019.2
• Pixar RenderMan XPU
    • Redshift Renderer 3.0
    • Siemens NX Ray Traced Studio
    • Unity Technologies Unity (2020)
    And yet Nvidia can't be bothered to release drivers for Mojave.

    So the point stands.  Nvidia doesn't care about Mac users.

