Testing an AMD WX 9100 eGPU with the 15-inch i9 MacBook Pro

Posted in Current Mac Hardware, August 2018
Apple's eGPU support extends beyond consumer-grade cards. AppleInsider takes a look at the AMD Radeon Pro WX 9100 workstation graphics card, inside a Sonnet eGFX 650 Breakaway Box.





The WX 9100 is a high-end professional workstation card that supports ECC memory and is intended for large computing jobs, or a high volume of them, in a zero-fault environment.

To support this goal, the WX 9100 comes with 16GB of HBM2 memory and 4,096 stream processors, and is based on AMD's 14nm Vega architecture. For outputs, the card has six Mini DisplayPort 1.4 connectors supporting 10-bit output.




All of these features add up to a graphics card costing $1,500, which will seem high to the consumer market now that the Bitcoin-driven squeeze on GPU pricing is over. But for those needing that kind of computational heft, it's not a bad buy, even after adding the cost of an eGPU enclosure.

We'll be comparing the WX 9100 and Sonnet combo against the best graphics card available in our 2018 15-inch i9 MacBook Pro, the 4GB Radeon Pro 560X. We'll also be connecting the machine to the Blackmagic eGPU, which houses a Radeon Pro 580 graphics chip with 8GB of memory.



We used the set-eGPU script to force all applications to use the eGPU, but this shouldn't be required with macOS Mojave.
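If you're curious which GPUs macOS actually sees once the enclosure is attached, Metal exposes the device list directly. Here's a minimal Swift sketch of our own (it's an illustration, not part of the set-eGPU script); the isRemovable flag is what separates an eGPU from built-in graphics:

```swift
import Metal

// Minimal sketch: list every GPU macOS exposes and flag the eGPU.
// MTLDevice.isRemovable is true for GPUs in Thunderbolt enclosures.
for device in MTLCopyAllDevices() {
    let kind = device.isRemovable ? "eGPU"
             : device.isLowPower  ? "integrated"
             : "discrete"
    print("\(device.name) [\(kind)]")
}
```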

Starting off with Geekbench 4's OpenCL test, the WX 9100 scored more than double our MacBook's internal graphics, and about 19 percent higher than Blackmagic's unit.
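For context on what a compute benchmark like this exercises: an app picks a GPU and dispatches kernels to it. The Swift sketch below is a toy illustration of that flow under our assumptions, using a placeholder kernel of our own, not Geekbench's actual workload; it prefers a removable device when one is attached.

```swift
import Metal

// Sketch: run a toy compute kernel on the eGPU when one is attached.
// The kernel is an illustrative placeholder, not Geekbench's workload.
let source = """
#include <metal_stdlib>
using namespace metal;
kernel void scale(device float *data [[buffer(0)]],
                  uint id [[thread_position_in_grid]]) {
    data[id] *= 2.0;
}
"""

// Prefer a removable (eGPU) device, falling back to the default GPU.
let device = MTLCopyAllDevices().first { $0.isRemovable }
    ?? MTLCreateSystemDefaultDevice()!
let library = try! device.makeLibrary(source: source, options: nil)
let pipeline = try! device.makeComputePipelineState(
    function: library.makeFunction(name: "scale")!)

let values = [Float](repeating: 1.0, count: 1024)
let buffer = device.makeBuffer(bytes: values,
                               length: values.count * MemoryLayout<Float>.stride)!

let queue = device.makeCommandQueue()!
let commands = queue.makeCommandBuffer()!
let encoder = commands.makeComputeCommandEncoder()!
encoder.setComputePipelineState(pipeline)
encoder.setBuffer(buffer, offset: 0, index: 0)
encoder.dispatchThreadgroups(MTLSize(width: 16, height: 1, depth: 1),
                             threadsPerThreadgroup: MTLSize(width: 64, height: 1, depth: 1))
encoder.endEncoding()
commands.commit()
commands.waitUntilCompleted()
print("Ran on \(device.name)")
```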

Doing a quick run of Unigine's Heaven benchmark, which tests gaming graphics performance, the WX 9100 once again performed more than twice as well as the 560X, and about 42 percent faster than the Blackmagic eGPU.

                       MacBook Pro 560X   Blackmagic 580 eGPU   WX 9100 eGPU
GB4 OpenCL             52,499             110,423               131,102
Unigine Heaven FPS     21.1               36.3                  51.6
Unigine Heaven Score   532                915                   1,300

Moving on to video editing tasks, we first ran the BruceX benchmark for Final Cut Pro X, which primarily taxes the graphics card.

Running the Final Cut Pro X stabilization filter on a 20-second 4K video clip, the WX 9100 eGPU performed the task in just 7 seconds, compared to 13 seconds for both the internal 560X and Blackmagic's eGPU.




In DaVinci Resolve, the same task went from 28 seconds on the internal 560X to 17 seconds with Blackmagic's eGPU, and down to 14 seconds with the WX 9100.

                       MacBook Pro 560X   Blackmagic 580 eGPU   WX 9100 eGPU
BruceX - FCPX          0:46               0:30                  0:22
FCPX Stabilization     0:13               0:13                  0:07
Resolve Stabilization  0:28               0:17                  0:14

Next is something we do daily: rendering an edited project, a task that many are looking to improve with an eGPU. Starting with a 5-minute 4K project using standard H.264 footage with color correction and effects, we didn't see any improvement in Final Cut Pro X, with the Blackmagic eGPU actually slowing things down.

We suspect that the highest-end graphics option in the 2018 15-inch MacBook Pro is already fast enough at rendering this footage with effects that the GPU isn't the bottleneck. On top of that, there are bandwidth constraints, made worse by having to send data over Thunderbolt 3 not only to the eGPU to be rendered, but also back to our software.
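To put rough numbers on that, with back-of-envelope assumptions rather than measurements: Thunderbolt 3 carries roughly 32Gb/s of usable PCIe payload, about 4GB/s each direction, and shuttling uncompressed 4K frames both ways chews through a big slice of that budget:

```swift
import Foundation

// Back-of-envelope sketch of the Thunderbolt 3 round-trip cost.
// All figures are rough assumptions, not measurements.
let tb3UsableGBps = 32.0 / 8.0            // ~32Gb/s of PCIe payload, per direction
let frameBytes = 3840.0 * 2160.0 * 4.0    // one uncompressed 8-bit RGBA UHD frame
let fps = 60.0

let gbPerSecEachWay = frameBytes * fps / 1e9
print(String(format: "~%.1f GB/s per direction", gbPerSecEachWay))     // ~2.0 GB/s
print(String(format: "about %.0f%% of TB3's ~%.0f GB/s usable budget",
             100 * gbPerSecEachWay / tb3UsableGBps, tb3UsableGBps))    // about 50%
```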

                     MacBook Pro 560X   Blackmagic 580 eGPU   WX 9100 eGPU
H.264 4K - FCPX      3:44               4:06                  3:46
H.264 4K - Resolve   4:53               5:56                  8:24


As we move on to much tougher codecs like Canon Cinema RAW Light, where the internal graphics are a huge bottleneck, we see major improvements even with a short one-minute timeline. Not only did our render drop from nearly four times the length of the project itself to less than half that, but timeline playback went from an unusable 20 frames per second to 55 frames per second, a massive improvement.

In DaVinci Resolve, the render went from 20 minutes with the internal 560X, to 15 minutes with the Blackmagic eGPU, and just 8 minutes with the WX 9100.

                            MacBook Pro 560X   Blackmagic 580 eGPU   WX 9100 eGPU
Canon 4K 60 RAW - FCPX      3:42               2:17                  1:41
FCPX Playback FPS           20                 36                    55
Canon 4K 60 RAW - Resolve   20:02              15:16                 8:09
Resolve Playback FPS        27                 32                    45


Taking a look at a one-minute 4.5K RED RAW project with color correction and effects, we don't see much change in Final Cut Pro X because the CPU is the bottleneck. DaVinci Resolve leans on the graphics card more, so we see about 60 percent faster render times with the WX 9100.

Applying noise reduction is often necessary with RAW footage, and it usually makes anything short of the highest-end machines, like a well-specced iMac Pro, choke. With temporal noise reduction added in Resolve, our WX 9100 finished the job in well under half the time the 560X needed, and the timeline dropped very few frames, making it workable where the MacBook Pro wasn't.

                                MacBook Pro 560X   Blackmagic 580 eGPU   WX 9100 eGPU
4.5K RED RAW - FCPX             2:22               2:18                  2:05
4.5K RED RAW - Resolve          1:14               1:19                  0:46
Resolve - 4.5K Noise Reduction  5:46               Not tested            2:16


To finish off this first round of testing, we used Blender to test the 3D rendering capabilities of the Radeon Pro WX 9100 against the internal Radeon Pro 560X. We used the BMW test project (file 1225) available on Blender's website and rendered it on the GPU. Here, the WX 9100 completed the task in just over 5 minutes, versus close to 27 minutes on the MacBook Pro alone.

                               MacBook Pro 560X   WX 9100 eGPU
Blender BMW GPU Render (1225)  26:48              5:02


In conclusion, the WX 9100 is a very powerful card that can greatly speed up some professional tasks. On top of that, tasks that require double-precision GPU compute and ECC memory are now possible without purchasing a dedicated workstation capable of them.

We at AppleInsider are very excited about the possibilities that eGPUs offer, and are looking forward to even better support in macOS Mojave.

Where to buy

AMD's Radeon Pro WX 9100 graphics card is currently available from third-party sellers on Amazon for $1,449.99. B&H also has the card in stock for $1,499.99 with free expedited shipping, and no tax collected outside New York and New Jersey*.

Both Amazon and B&H also carry Sonnet's eGFX 650 Breakaway Box for $399.00 with free shipping.

Comments

  • Reply 1 of 23
This card is essentially the Radeon Vega 64 that shows up in the higher-end iMac Pro (there may be modest differences in clock speed, etc.). The iMac Pro version has 16GB of RAM, just like the WX 9100. Unfortunately, Vega 64s available separately have 8GB. Unless you really need the 16GB for a very RAM-hungry task, or you have an application that uses the WX-series workstation drivers, a standard $600 Vega 64 in the same eGPU box should perform similarly. I'm not sure the certified-driver issue even exists on Macs. On the PC side, the GPU manufacturers write more stable drivers that enable specific features in CAD software and a few other pro applications (not including Photoshop and friends) that work only with the "workstation" variants of cards. Many of the Radeon WX and NVidia Quadro cards are just the gaming GPU, sometimes actually downclocked a bit, sometimes with more RAM or some other minor variation, sold for two to three times the price with a minor firmware tweak that enables the workstation driver. At the high end, there are workstation cards that use different GPUs.

    Unless you know you need the certified workstation driver for some particular application, just use a Vega 64...
  • Reply 2 of 23
okssipin Posts: 1, unconfirmed member
    Apple should offer both AMD and nVidia graphics, and let people choose.
Their alliance with AMD is shortsighted, ignoring more choices with NVidia.

  • Reply 3 of 23
cpsro Posts: 3,198, member
    As a "scientist", I like to see tables clearly labeled, which these tables aren't. It would also be nice to know in each experiment the real-time duration of the video clip tested, so that the render time can be compared to real-time.
  • Reply 4 of 23
racerhomie3 Posts: 1,264, member
    okssipin said:
    Apple should offer both AMD and nVidia graphics, and let people choose.
Their alliance with AMD is shortsighted, ignoring more choices with NVidia.

Nvidia wants too much control. They want to be the Apple of the GPU world. Apple had worked with them previously, but now their visions collide.
  • Reply 5 of 23
    okssipin said:
    Apple should offer both AMD and nVidia graphics, and let people choose.
Their alliance with AMD is shortsighted, ignoring more choices with NVidia.

Apple doesn't like being forced to use proprietary software like Adobe's Flash or Nvidia's CUDA. Nor do they like being implicated in class-action lawsuits through the fault of their manufacturing partners: http://forum.notebookreview.com/threads/nvidia-reaches-settlement-in-class-action-suit-affecting-apple-dell-hp-laptops.521913/
Nvidia also likes to dictate higher margins for supplying their products. That's why they failed to keep both Xbox and PlayStation!
  • Reply 6 of 23
netrox Posts: 1,421, member
    0VERL0RD said:
    okssipin said:
    Apple should offer both AMD and nVidia graphics, and let people choose.
Their alliance with AMD is shortsighted, ignoring more choices with NVidia.

Apple doesn't like being forced to use proprietary software like Adobe's Flash or Nvidia's CUDA. Nor do they like being implicated in class-action lawsuits through the fault of their manufacturing partners: http://forum.notebookreview.com/threads/nvidia-reaches-settlement-in-class-action-suit-affecting-apple-dell-hp-laptops.521913/
Nvidia also likes to dictate higher margins for supplying their products. That's why they failed to keep both Xbox and PlayStation!
    And Metal is not proprietary?
  • Reply 7 of 23
nunzy Posts: 662, member
    I much prefer these external boxes connected with wires. Internal expansion cards are too clunky.
  • Reply 8 of 23
cgWerks Posts: 2,952, member
    0VERL0RD said:
Apple doesn't like being forced to use proprietary software like Adobe's Flash or Nvidia's CUDA. Nor do they like being implicated in class-action lawsuits through the fault of their manufacturing partners: http://forum.notebookreview.com/threads/nvidia-reaches-settlement-in-class-action-suit-affecting-apple-dell-hp-laptops.521913/
Nvidia also likes to dictate higher margins for supplying their products. That's why they failed to keep both Xbox and PlayStation!
    Well, that might explain why Apple doesn't buy from nVidia to build them into their machines. But, it doesn't really explain why they don't enable users to toss one in an eGPU and make it work without hacking the OS.
  • Reply 9 of 23
    cgWerks said:
    0VERL0RD said:
Apple doesn't like being forced to use proprietary software like Adobe's Flash or Nvidia's CUDA. Nor do they like being implicated in class-action lawsuits through the fault of their manufacturing partners: http://forum.notebookreview.com/threads/nvidia-reaches-settlement-in-class-action-suit-affecting-apple-dell-hp-laptops.521913/
Nvidia also likes to dictate higher margins for supplying their products. That's why they failed to keep both Xbox and PlayStation!
    Well, that might explain why Apple doesn't buy from nVidia to build them into their machines. But, it doesn't really explain why they don't enable users to toss one in an eGPU and make it work without hacking the OS.
Nvidia makes their chips compatible with Macs; that's not really something Apple can help.
  • Reply 10 of 23
fastasleep Posts: 6,417, member
    nunzy said:
    I much prefer these external boxes connected with wires. Internal expansion cards are too clunky.
    Yeah, especially considering there are zero Macs on the market that take internal graphics cards, it is "clunky" to get them in there.

    Are you a bot? You have to tell us if you are.
  • Reply 11 of 23
cgWerks Posts: 2,952, member
    cgWerks said:
    Well, that might explain why Apple doesn't buy from nVidia to build them into their machines. But, it doesn't really explain why they don't enable users to toss one in an eGPU and make it work without hacking the OS.
Nvidia makes their chips compatible with Macs; that's not really something Apple can help.
I guess what I mean is, why does Apple go out of their way to make it more difficult to make them work? Because they hold a grudge?
  • Reply 12 of 23
quadra 610 Posts: 6,757, member
    cgWerks said:
    cgWerks said:
    Well, that might explain why Apple doesn't buy from nVidia to build them into their machines. But, it doesn't really explain why they don't enable users to toss one in an eGPU and make it work without hacking the OS.
Nvidia makes their chips compatible with Macs; that's not really something Apple can help.
I guess what I mean is, why does Apple go out of their way to make it more difficult to make them work? Because they hold a grudge?


Grudges in business are rare and don't make a lot of practical sense.

The reason probably has to do with money, licensing, compatibility, etc., separately or in some combination.
  • Reply 13 of 23
macxpress Posts: 5,808, member
It also doesn't help that NVIDIA kept leaking information about new Macs. I believe this is what happened with the last Mac with an NVIDIA GPU. If you want to partner with someone, especially Apple, you have to keep your mouth shut until it's officially released.
  • Reply 14 of 23
tallest skil Posts: 43,388, member
    0VERL0RD said:
Nor do they like being implicated in class-action lawsuits through the fault of their manufacturing partners: http://forum.notebookreview.com/threads/nvidia-reaches-settlement-in-class-action-suit-affecting-apple-dell-hp-laptops.521913/
    We should have gotten new machines for that fuck-up, by the way.
  • Reply 15 of 23
mdriftmeyer Posts: 7,503, member
    Resolve isn't designed to utilize eGPU.
  • Reply 16 of 23
nVidia refuses to give Apple the information necessary to make Metal work well with their GPUs. They are trying very hard to push CUDA, which is exclusive to nVidia. That's why Apple led the way with OpenCL support (which is vendor-agnostic), and now Metal, with which they are able to support both Mac and iOS devices.

    nVidia wants CUDA to be the only GPU language in town, which is understandable.

Apple wants to be able to choose any technology they want to make their devices better, whether nVidia, AMD or internally developed GPUs.

    nVidia has managed their account with Apple poorly, often revealing Apple's confidential information, hiding flaws in their devices until it's too late to replace them, outright lying about schedule slips and refusing to correct security issues while simultaneously refusing to provide adequate information for Apple to fix those issues independently. Then there's appropriating Apple technology for their own devices/contending over patents and copyrights.

    nVidia makes great products; AMD isn't in the same league, honestly. So Apple's only play is to stick with AMD until their internal offerings surpass both while meeting their needs.

    Apple will always choose to control its own destiny rather than trust another company.

    Intel is an even worse partner, not that Apple is always completely innocent, but you would be amazed at the stuff these companies try to get away with... to the extent that Tim has had to call BK (former Intel CEO) and call foul.

    The Mac platform will move much faster once Apple has jettisoned their "partners" like Intel, Qualcomm and nVidia, but they aren't going to rock any of those boats until and unless they have proven internal development ready to deliver superior replacements. Steve might have gone thermonuclear over any number of situations, but Tim is Spock-like in deciding when "enough is enough."

  • Reply 17 of 23
Mike Wuerthele Posts: 6,861, administrator
    Resolve isn't designed to utilize eGPU.
Yes, it is. You just have to explicitly set it under High Sierra using set-eGPU.
  • Reply 18 of 23
Wibble69 Posts: 3, unconfirmed member
    Hey guys,

    Great review, but could we, you know, kinda keep it in the real world with something that gives gamers an idea of performance boost? 

    I know that many people are horrified that I would spend this sort of money on a mac gaming laptop, but I live in an environment where a laptop is my only realistic computer - when I am at home, I want a gaming rig, on the road, I need something that I can render like a pro. 

In Civ 6, can I play the game at max resolution? Does a decent graphics card take the pressure off the internal graphics card enough that the i9 is able to run at a faster speed for longer? 


    Just sayin...
  • Reply 19 of 23
Mike Wuerthele Posts: 6,861, administrator
    Wibble69 said:
    Hey guys,

    Great review, but could we, you know, kinda keep it in the real world with something that gives gamers an idea of performance boost? 

    I know that many people are horrified that I would spend this sort of money on a mac gaming laptop, but I live in an environment where a laptop is my only realistic computer - when I am at home, I want a gaming rig, on the road, I need something that I can render like a pro. 

In Civ 6, can I play the game at max resolution? Does a decent graphics card take the pressure off the internal graphics card enough that the i9 is able to run at a faster speed for longer? 


    Just sayin...
    This is a $1500 workstation card, with its own market. We've done loads of other coverage on eGPUs.

  • Reply 20 of 23
Great addition to speed up internet surfing and watching streaming content!  16GB of HBM2 and 4,096 stream processors make the WX 9100 graphics card the right choice.