AMD debuts $479 Radeon RX 6700 XT aimed at 1440p workflows

As expected, AMD has announced the Radeon RX 6700 XT, a more affordable $479 GPU that could eventually be used in the Mac Pro or Thunderbolt eGPU solutions.

Credit: AMD

The graphics card sports 12GB of VRAM and 40 compute units, and is aimed at delivering maximum settings for 1440p PC gaming. AMD says it can provide "up to 2X higher performance in select titles" compared to the installed base of older-generation graphics cards.

AMD says users can expect 212 frames per second in "Overwatch," 272 fps in "League of Legends," and 360 fps in "Rainbow Six Siege." At 1440p, the company says the RX 6700 XT should be fast enough to support ray tracing. It requires 8-pin and 6-pin power connectors, which both the Mac Pro and most eGPU enclosures can supply.

The RX 6700 XT features the same two-fan design as the RX 6800 and RX 6800 XT. It'll also be available from third-party providers, likely in three-fan setups.

Overall, it's a mid-range entry in AMD's RDNA 2 graphics card lineup. It competes with Nvidia cards like the RTX 3060 Ti and RTX 3070, which launched at $399 and $499, respectively. AMD's own specifications suggest that it pulls ahead of both of those cards in some games.

The Radeon RX 6700 XT will launch on March 18 from AMD's board partners, 40 different system builders, and the company's own website. Prebuilt systems, like the HP Omen 25L and 30L, will launch later in the spring.

Given that Nvidia cards are no longer supported in macOS, the card could become a future upgrade option for the Mac Pro or for Mac users with Thunderbolt 3 eGPUs. macOS drivers for the RX 6000 series don't yet exist, but based on historical support trends, they are expected to arrive shortly.

Comments

  • Reply 1 of 4
    keithw Posts: 53 member
    Do you expect the non-support of eGPUs with M1 Macs (or their successors) to continue, or are they just taking a while to re-write the drivers so they work on the M1 architecture?

  • Reply 2 of 4
    Yay!  Another product which, due to silicon shortages and scalpers, will sell out within seconds and thereafter won't be available at anywhere near the MSRP.
  • Reply 3 of 4
    cloudguy Posts: 323 member
    keithw said:
    Do you expect the non-support of eGPUs with M1 Macs (or their successors) to continue, or are they just taking a while to re-write the drivers so they work on the M1 architecture?

    No. Or to put it another way: the lucrative discrete GPU business - well, not really THAT lucrative, as the only players are Nvidia and AMD and it is a secondary revenue source for both - will be the equivalent of the iPod 10 years from now, and a shell of what it now is within 5. 

    The only reason why discrete GPUs were ever needed in the first place was because adding that graphics capability to the CPU used too much power and generated too much heat. It would have made devices too big and expensive for the 99% of the population that didn't need that level of graphics anyway. But now that integrated circuits can be made at 5nm - and TSMC plans to be at 2nm by 2024 - you'll be able to put the equivalent of an Intel Core i9 and an AMD Radeon RX 6800 in the same SOC and will barely need a fan. 

    Intel and AMD know this and are planning for it. Intel has branded its CPU/GPU combos as XPUs; AMD goes with APU. They do this to distinguish between their earlier Iris/HD integrated GPUs, which were significantly worse than discrete GPUs, and their modern (or soon to come) integrated GPUs like Xe, which performs within the ballpark of the Nvidia GeForce MX350 (which is meant more for Adobe Lightroom and Premiere Pro than for Monster Hunter World). Intel and AMD would benefit from this: they would get to charge more. It would particularly benefit AMD, because right now their best CPUs get paired with Nvidia GPUs. Nvidia, for its part, knows that its business model is going the way of the laptop DVD player in 5-10 years, so it is pivoting towards using its GPUs for edge, IoT, data center and AI hardware instead. 

    So long story short: no. By 2023 - and possibly before then - the integrated GPUs in Apple Silicon will be roughly on par with anything you can get in an eGPU anyway (especially since it isn't as if Apple Silicon Macs are going to be an AAA gaming platform with ray tracing ... the integrated GPUs will be for content creators, 8K video editors, 3D animators, CAD, architecture, etc.). And by 2026 the eGPU market won't exist beyond the Windows hobbyist market at all. 

    Samsung's next flagship mobile SOC - it will definitely power their Windows 10 Galaxy Book 2-in-1 and some of their Android tablets, and "may" be in the Galaxy Fold 3 - will have an AMD GPU made on Samsung's 5nm process with better graphics performance than some of the discrete GPUs that AMD sells for gaming laptops. AMD will probably sell those GPUs to Samsung and MediaTek to replace the PC discrete GPU market that they know will be gone soon. If Qualcomm is forced to ditch their own Adreno GPUs and adopt AMD's, it will be hilarious. Even worse: Nvidia had a shot at dominating this space, because back in 2014 they had the best mobile GPUs in the business (they were the first to get to 4K, and could both achieve legitimate 4K and use AI to upscale 1080p and 2K to 4K), but totally blew it. Now AMD is going to take their place.
  • Reply 4 of 4
    mocseg Posts: 86 member
    cloudguy said:
    keithw said:
    Do you expect the non-support of eGPUs with M1 Macs (or their successors) to continue, or are they just taking a while to re-write the drivers so they work on the M1 architecture?

    only players are Nvidia and AMD and it is a secondary revenue source for both -


    Nvidia's gaming revenue in Q4: $2.50 billion.
    It is not a secondary revenue source, but the primary one by far.

