AMD Vega 56 and Vega 64 GPUs destined for iMac Pro detailed in Linux driver

Posted in Future Apple Hardware, edited June 2017
The GPUs that will be in the iMac Pro -- the Vega 56 and Vega 64 -- have been detailed by an AMD-provided driver update for Linux, with the cards able to utilize as much as twice as much data in each register as previous cards when 32 bits of precision aren't needed.




The data was collected from the flagship graphics chipset's drivers for Linux's Direct Rendering Manager, and collated by enthusiast site Wccftech and others.

The Vega Pro 56 has 56 compute units with 3584 stream processors, and 8GB of HBM2 RAM pushing 400GB/s of data. While single-precision (FP32) calculations are around 11 teraflops, roughly analogous to the Nvidia GTX 1080 Ti, half-precision 16-bit calculations (FP16), such as those used for image and graphics processing, ray tracing, artificial intelligence, and game rendering, hit a peak of 22 teraflops.
A configurable upgrade for the iMac Pro, the Vega Pro 64 has 64 compute units with 4096 stream processors, and 16GB of HBM2 RAM -- but no declared bandwidth as of yet. The gleaned information puts the Vega Pro 64's FP32 single-precision performance at 13 teraflops, with FP16 at 25 teraflops.
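
As a rough sanity check on those figures, peak throughput for this class of GPU is typically estimated as stream processors, times two operations per clock (a fused multiply-add), times clock speed, with Vega's packed math doubling the FP16 number. The short Python sketch below works through that arithmetic; the clock speeds in it are illustrative assumptions, not values taken from the leaked driver.

# Back-of-the-envelope peak throughput for the leaked Vega specs.
# The clock speeds are assumptions for illustration, NOT from the driver leak.

def peak_fp32_tflops(stream_processors: int, clock_ghz: float) -> float:
    """Each stream processor retires one FMA (2 floating-point ops) per clock."""
    return stream_processors * 2 * clock_ghz / 1000.0

for name, sps, clock_ghz in [("Vega Pro 56", 3584, 1.54),   # assumed clock
                             ("Vega Pro 64", 4096, 1.57)]:  # assumed clock
    fp32 = peak_fp32_tflops(sps, clock_ghz)
    fp16 = fp32 * 2  # packed FP16 math doubles throughput on Vega
    print(f"{name}: ~{fp32:.1f} TFLOPS FP32, ~{fp16:.1f} TFLOPS FP16")

Worked backwards from the reported teraflop figures, the implied clocks land around 1.5GHz to 1.6GHz, but those should not be read as confirmed specifications.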

Pricing on either chipset has not yet been announced, nor, obviously, can it be determined from the drivers.

Not just for the iMac Pro

A pair of Vega-powered PCI-E cards was also announced in conjunction with the Vega Pro 56 and Pro 64 chipsets. The Radeon RX Vega Frontier Edition is in essence a Radeon Vega Pro 64 on a PCI-E card, and the RX Vega gaming card has very few specifications known at this time. It is unclear if macOS will see drivers for either card.

Pre-orders for the RX Vega Frontier card have started at $1,200 for an air-cooled version or $1,800 for a water-cooled one, but how accurate these prices are is not yet known. The card is intended for workstations, serving a similar market segment to the iMac Pro, and is scheduled to ship on June 27, according to AMD CEO Lisa Su.




The official launch of the line, beyond marketing information or details gleaned from previously accurate driver updates, is expected at the SIGGRAPH trade show in July.

Explaining FP16 and FP32

The Nvidia GTX 1080 Ti has around 11 teraflops of FP32 performance, but it is short on FP16 cores, with only one per streaming multiprocessor -- and FP32 cores can't do FP16 calculations. Previous testing with eGPUs has listed cards' performance when doing FP32 calculations, for simplicity.

The advantages of FP16 for the consumer aren't quite clear yet, nor easily comparable to Nvidia cards. The Vega cards' new architecture doubles FP16 performance, giving calculations that rely on it a big boost over older generations of Radeon cards, as well as Nvidia ones.

Some modern Nvidia cards have FP16 performance equivalent to FP32, and some have notably poorer FP16 performance than FP32. The new Nvidia GTX 1080 Ti, often used by enthusiasts in builds attempting to match the iMac Pro, delivers less than a teraflop of FP16 performance.

The Nintendo Switch and PlayStation 4 Pro operating systems take full advantage of FP16 calculations to boost performance when possible, as does Metal when developers choose to implement it.

Geometry and heavy scientific calculations are better performed in FP32, or the even more precise FP64, for the higher precision allowed. Dynamic lighting and image editing will see a boost with FP16, as will gaming and machine learning tasks.
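
To make the precision tradeoff concrete, the sketch below uses NumPy's float16, which follows the same IEEE 754 binary16 format as Metal's half type and the FP16 math discussed above; it is a generic illustration of the number format, not a benchmark of any particular GPU.

import numpy as np

# FP16 (binary16) versus FP32 (binary32): same idea as half vs. float on the GPU.
half, single = np.float16, np.float32

# Half precision has an 11-bit significand, so whole numbers above 2048
# start collapsing onto their neighbors:
print(half(2049) == half(2048))       # True -- 2049 is not representable
print(half(2048) + half(1))           # 2048.0 -- the +1 is lost entirely

# Relative precision: roughly 3 decimal digits for FP16, about 7 for FP32.
print(np.finfo(half).eps)             # ~0.000977
print(np.finfo(single).eps)           # ~1.19e-07

# For pixel-scale values between 0 and 1 the FP16 error stays tiny,
# which is why imaging, lighting, and many ML workloads tolerate it well.
print(abs(float(half(0.123456)) - 0.123456))   # on the order of 1e-5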

Real-world tasks will be a mix of FP16 and FP32. Actual task-related benchmarking after iMac Pro release will ultimately tell the tale.

Grain of salt

Previous generations of cards, including the Nvidia 10-series and Radeon 500-series, have had specifications leak from vendor-supplied Linux drivers.

However, the possibility remains that the numbers aren't accurate. But, given the announced RX Vega Frontier cards, they seem fairly likely to be correct.

Comments

  • Reply 1 of 20
    pbrutto Posts: 30 member
    I really hope that, given the close ties of Apple with AMD, they are going to release at least 1 model of these (or, more importantly to me, the upcoming gaming version of these) with USB-C/TB3 port(s). I made the mistake of getting the LG UltraFine 4K and while it looks STUNNING, there is 0 compatibility with either my Windows machine or any graphics card save the as yet unreleased GTX1080TI X11G gaming card announced by MSI but not yet shipping.
    edited June 2017
  • Reply 2 of 20
    wizard69 Posts: 13,377 member
    This ought to highlight to people why Apple has stayed with AMD GPUs for so long.   AMD is simply a better choice for Apples initiatives in AR and ML.   

    Hopefully we will see a Mac Mini replacement with these cards.  Either that or a Mac Pro more focused on the desktop workstation market.  
  • Reply 3 of 20
    Mike Wuerthele Posts: 6,858 administrator
    wizard69 said:
    This ought to highlight to people why Apple has stayed with AMD GPUs for so long.   AMD is simply a better choice for Apples initiatives in AR and ML.   

    Hopefully we will see a Mac Mini replacement with these cards.  Either that or a Mac Pro more focused on the desktop workstation market.  
    Given the price of the PCI-E version, I don't see a mini replacement with this card anytime soon.
  • Reply 4 of 20
    smiffy31 Posts: 202 member
    wizard69 said:
    This ought to highlight to people why Apple has stayed with AMD GPUs for so long.   AMD is simply a better choice for Apples initiatives in AR and ML.   

    Hopefully we will see a Mac Mini replacement with these cards.  Either that or a Mac Pro more focused on the desktop workstation market.  
    Given the price of the PCI-E version, I don't see a mini replacement with this card anytime soon.
    A kaby lake upgrade to the mini would be quite major under the circumstances.
  • Reply 5 of 20
    macxpress Posts: 5,801 member
    wizard69 said:
    This ought to highlight to people why Apple has stayed with AMD GPUs for so long.   AMD is simply a better choice for Apples initiatives in AR and ML.   

    Hopefully we will see a Mac Mini replacement with these cards.  Either that or a Mac Pro more focused on the desktop workstation market.  
    Given the price of the PCI-E version, I don't see a mini replacement with this card anytime soon.
    The mini is meant to be a lower end consumer Mac anyways...I never see the Mac mini getting anything other than an Intel integrated graphics chip. I guess you never know, but at the price point for the Mac mini, I just don't see it. If it were say a $1200 Mac desktop maybe, but not for $500-700. 
  • Reply 6 of 20
    A MacMini with that sort of GPU? No. Apple would not do something like that. Apple will continue to milk the MacMini with lame components and maintain a high profit margin. Apple leaves me dumbfounded. The most powerful MacMinis can't do 4K video and yet dirt-cheap Android boxes can. Refurbished MacMinis sell out so quickly on the Apple store, so there must be some demand, yet Apple saw fit to make MacMinis not even user-upgradeable. Pathetic. Apple seem to be throwing away so many opportunities just so they can focus on the iPhone.
  • Reply 7 of 20
    Mike Wuerthele Posts: 6,858 administrator
    smiffy31 said:
    wizard69 said:
    This ought to highlight to people why Apple has stayed with AMD GPUs for so long.   AMD is simply a better choice for Apples initiatives in AR and ML.   

    Hopefully we will see a Mac Mini replacement with these cards.  Either that or a Mac Pro more focused on the desktop workstation market.  
    Given the price of the PCI-E version, I don't see a mini replacement with this card anytime soon.
    A kaby lake upgrade to the mini would be quite major under the circumstances.
    And probable at some point, given Apple's stance in April that the Mac mini is "important."
  • Reply 8 of 20
    williamlondon Posts: 1,324 member
    Apple seem to be throwing away so many opportunities just so they can focus on the iPhone.
    You've got it all figured out, haven't you, I'm surprised Tim's not knocking on your door right now to hand you the keys to the company. ;-)
  • Reply 9 of 20
    chasm Posts: 3,275 member
    People complaining about the cost of the iMac Pro can perhaps stand down now, given that the video card appears to make up nearly a third of the total cost of the base model. Yow.
  • Reply 10 of 20
    johnbear Posts: 160 member
    That's great news, plan to build a powerhouse hackintosh soon and can't wait for the cards to be available as well as software support from apple
    edited June 2017
  • Reply 11 of 20
    Mike Wuerthele Posts: 6,858 administrator
    johnbear said:
    That's great news, plan to build a powerhouse hackintosh soon and can't wait for the cards to be available as well as software support from apple
    You'll be interested in an imminent story, then. Stay tuned for a bit later this afternoon.
  • Reply 12 of 20
    mdriftmeyer Posts: 7,503 member
    wizard69 said:
    This ought to highlight to people why Apple has stayed with AMD GPUs for so long.   AMD is simply a better choice for Apples initiatives in AR and ML.   

    Hopefully we will see a Mac Mini replacement with these cards.  Either that or a Mac Pro more focused on the desktop workstation market.  

    It ties in with LLVM/Clang, AMD's ROCm/VR/AR, and Apple's own OpenCL stack/AR/VR, plus both companies' machine learning interests, and extends to much more on the server end with Naples.

    https://github.com/RadeonOpenCompute

    http://gpuopen.com/

    http://radeon.com/en-us/


    Add in an external GPU stack of Radeon Instinct GPGPUs for Vega and Apple becomes the leader in many fields necessary for a complete Workstation environment with Apple's platform.

    http://instinct.radeon.com/en-us/






  • Reply 13 of 20
    wizard69 Posts: 13,377 member
    macxpress said:
    wizard69 said:
    This ought to highlight to people why Apple has stayed with AMD GPUs for so long.   AMD is simply a better choice for Apples initiatives in AR and ML.   

    Hopefully we will see a Mac Mini replacement with these cards.  Either that or a Mac Pro more focused on the desktop workstation market.  
    Given the price of the PCI-E version, I don't see a mini replacement with this card anytime soon.
    The mini is meant to be a lower end consumer Mac anyways...I never see the Mac mini getting anything other than an Intel integrated graphics chip. I guess you never know, but at the price point for the Mac mini, I just don't see it. If it were say a $1200 Mac desktop maybe, but not for $500-700. 
    Actually the Mini gets sold to many professionals as a low cost workstation without the attached screen you get with a laptop. Many an app developer has gone the Mini route. I never really understood the Mini = consumer Mac point as it doesn't reflect where sales go.

    As for what I was talking about, that is the rumored Mini replacement: one variant must support a discrete video chip for Apple to really demonstrate that they are serious about AR, ML and other technologies demanding high performance compute. So yeah, I'm expecting that one variant of the Mini will cost on the order of $1200 to $1500, with a good portion of that cost going to cover the video chip. By the way, this may or may not mean that the GPU chip is integrated into the motherboard. The other possibility would be a cut down Mac Pro with one video card and a more desktop oriented CPU chip. A Mac Pro lite, if you will.

    I'm only thinking this way due to the heavy emphasis Apple has placed on AR, ML and other advanced technologies that really need the compute resources a discrete chip offers. From my standpoint it looks like Apple is all in with respect to Machine Learning, as that is being worked into a lot of Apple's technologies.
  • Reply 14 of 20
    wizard69 Posts: 13,377 member
    I agree with a lot of what you have said below.
    A MacMini with that sort of GPU? No. Apple would not do something like that. Apple will continue to milk the MacMini with lame components and maintain a high profit margin. Apple leaves me dumbfounded. The most powerful MacMinis can't do 4K video and yet dirt-cheap Android boxes can. Refurbished MacMinis sell out so quickly on the Apple store, so there must be some demand, yet Apple saw fit to make MacMinis not even user-upgradeable. Pathetic. Apple seem to be throwing away so many opportunities just so they can focus on the iPhone.
    The problem is the Mini will become more or less useless if it doesn't support Apple's AR and ML technologies in the future. While integrated GPUs get better every year, they are still a long way from being highly optimized for these technologies. As for Vega, I see that as a requirement in a top end Mini replacement; the lower end models can have anything from integrated GPUs to some midrange chip.

    As you note, the current Minis are pretty pathetic. I think Apple realizes this and is actually in a mad rush to overhaul the entire Mac desktop lineup. The fact is the Mini isn't really capable of supporting the very technologies they debuted at WWDC this year. Obviously that has to be addressed or they will look pretty stupid come time for a shipping High Sierra. In other words, the Mini will get a complete overhaul, possibly into a completely different looking machine, in time to support the new tech in High Sierra.

    As for Mini demand, contrary to popular opinion here, it is a popular machine for people with low end professional requirements. The Mini is extremely popular with app developers, for example, and I've seen it used as an embedded controller in some really high end instrumentation also. One thing the Mini has had going for it is that it is a very stable platform that has remained compact over the years. Of course, adding a discrete GPU capability will have a negative impact on its usefulness as a stable platform. I really think that people underestimate the good qualities that keep the Mini selling in a very, very down market. My problem with it is that I need more out of a desktop, but I don't need the hit to the pocket that a Mac Pro causes.
  • Reply 15 of 20
    wizard69 Posts: 13,377 member

    Apple seem to be throwing away so many opportunities just so they can focus on the iPhone.
    You've got it all figured out, haven't you, I'm surprised Tim's not knocking on your door right now to hand you the keys to the company. ;-)
    Tim Cook wouldn't be the first executive to screw up understanding customer needs. I live across the river from the old Kodak Park and can attest to just how badly a company's leadership can screw up by not understanding markets and customer needs. Right now Apple is living off the iPhone, but that can go as quickly as it came; it would be nice to see Apple produce computers that its customers actually want and have a need for. Oh, and maybe put a bit of effort into marketing those computers as complete systems; the Mac Pro shows that they have lost all concept of what a pro system is.
  • Reply 16 of 20
    wizard69 Posts: 13,377 member

    wizard69 said:
    This ought to highlight to people why Apple has stayed with AMD GPUs for so long.   AMD is simply a better choice for Apples initiatives in AR and ML.   

    Hopefully we will see a Mac Mini replacement with these cards.  Either that or a Mac Pro more focused on the desktop workstation market.  

    It ties in with LLVM/Clang, AMD's ROCm/VR/AR, and Apple's own OpenCL stack/AR/VR, plus both companies' machine learning interests, and extends to much more on the server end with Naples.

    https://github.com/RadeonOpenCompute

    http://gpuopen.com/

    http://radeon.com/en-us/


    Add in an external GPU stack of Radeon Instinct GPGPUs for Vega and Apple becomes the leader in many fields necessary for a complete Workstation environment with Apple's platform.

    http://instinct.radeon.com/en-us/






    Now the question, as we wait for Apple's new desktops to debut, is: does Apple get it? That is, do they understand what people want these days out of a desktop machine, a small workstation or a high performance workstation? My feeling here is that the Mac Pro would have been an ideal small workstation given a better pricing structure and a greater focus on what many pros outside of content creation need. In other words, the Mac Pro is a good machine targeted at the wrong people, and maybe just a bit overboard for the people that could put it to use.

    In any event, I'm trying to drive home the idea that Apple will need to be far more proactive when it comes to performance GPUs if they really want their new AR and ML technologies to take off. This is why I believe the Mac Mini's replacement will have support for external GPUs in at least one and most likely two variants. Either that, or the Mac Pro becomes Apple's midrange desktop and gets replaced with a real performance workstation. A Mac Pro chassis with a recent Intel APU-style chip and one good GPU card would be an awesome machine for many of us. Sell it for $1500 - $2000 and it will be a hot seller.

    In the end, the Mac Pro is an example of Apple being too focused on one subset of its customer base and then not even getting that right. They completely missed that many of us only need one GPU card, a high performance one at that. The Mac Pro really makes you question if they understand their customer base. Mind you, I'm one that thinks many concepts embodied in the Mac Pro really do represent the future of desktop computing. There is no reason for the massive chassis of the past, for one. Second, the way technology changes these days, it doesn't make sense to buy a machine with the thought of upgrading every part, as technology changes too much by the time new hardware makes an upgrade worthwhile. At best a RAM upgrade and maybe a storage upgrade can make sense, but these days people are fooling themselves by overvaluing complete machine upgradability.
  • Reply 17 of 20
    wizard69 said:
    Apple seem to be throwing away so many opportunities just so they can focus on the iPhone.
    You've got it all figured out, haven't you, I'm surprised Tim's not knocking on your door right now to hand you the keys to the company. ;-)
    Tim Cook wouldn't be the first executive to screw up understanding customer needs. I live across the river from the old Kodak Park and can attest to just how badly a company's leadership can screw up by not understanding markets and customer needs. Right now Apple is living off the iPhone, but that can go as quickly as it came; it would be nice to see Apple produce computers that its customers actually want and have a need for. Oh, and maybe put a bit of effort into marketing those computers as complete systems; the Mac Pro shows that they have lost all concept of what a pro system is.
    Yeah, Kodak really screwed up their position big time didn't they?! But, just because one big company did it doesn't mean Apple will, or is currently screwing up (to which some attest). In fact it seems just the opposite, who else is doing better? Micro$haft and their monopoly they couldn't extend into phones because they aren't a product company but rather a (copy/steal/acquire) monopoly-business company? Or Intel with their too-hot-to-touch high-wattage chips performing the same as some of Apple's which fit in a phone? How about Samesung and their, "we can't come up with an idea unless we read about it on an Apple rumors blog?" Of course there is always Google, "we love selling your privacy and as long as psychological manipulation is legal and people want information, we're gold!?"

    Seems Apple is trying to expand out of their iPhone dependence, I'm sure they know better than you and I the risk of being overly dependent on the one product, but they're not gods and sometimes expectations from bloggers far exceed the reality that tech progress can deliver.

    Of course none of this matters to some, for those with unreasonable expectations if Tim can't pull an actual rabbit out of the proverbial hat then he's a failure and it's because he's not listening to the best ideas, those which are contained within the minds of Joe and Cindy Nobody. The most interesting ideas from those two morons (Joe & Cindy) include 1) more machines in a non-growth or dying market (i.e. desktop PCs), or 2) a better (Pro) machine in an infinitesimally small market. It's a laughable assessment some insist will work but the armchair CEO position is just as easy to perform as armchair Quarterback, when the reality of these positions is something a bit more complex.
    edited June 2017
  • Reply 18 of 20

    Speaking of the iMac Pro, I thought it would only be the Edition Watch that I cannot afford! These iMac Pros are really some beasts!

    Apple makes the line between "want" and "need" very blurry. Sometimes the price is what separates the men from the boys.

    I'm crawling back to my play-pen now!

  • Reply 19 of 20
    dwalla Posts: 15 member
    iMac Pro will be great for 3D artists. For anyone else in the visual professional world, not so much. After Effects performs much, much better on high clock speeds than it does on mass multi-cores. We worked with BareFeats several months back when we discovered our iMacs were butchering our Mac Pros and new Mac Pros at render speeds.
  • Reply 20 of 20
    Marvin Posts: 15,310 moderator
    dwalla said:
    iMac Pro will be great for 3D artists. For anyone else in the visual professional world, not so much. After Effects performs much, much better on high clock speeds than it does on mass multi-cores. We worked with BareFeats several months back when we discovered our iMacs were butchering our Mac Pros and new Mac Pros at render speeds.
    It depends on how you set up the software. You can do multi-processing in After Effects that isn't enabled by default. Video apps process separate frames, so these can be done in parallel. The UI setup in AE isn't very stable, but you can do it yourself via the command-line. This also saves memory as you can quit AE and flush out all the caches. There's a page here about render times using multi-frame processing; they removed some options in newer versions:

    https://www.pugetsystems.com/labs/articles/Adobe-After-Effects-CC-2015-Multi-Core-Performance-714/

    To do this manually, use the binary called aerender that is inside the base AE folder. Example commands are here:

    https://helpx.adobe.com/after-effects/using/automated-rendering-network-rendering.html

    Say that you had a project called Animation.ae on the desktop and you had a comp inside it called Overlays. Assume you want to use Best render settings and an Output Module called PNG Sequence. The template names are just the names in the drop-downs. Normally you just do the whole sequence but this has to wait for each frame to finish one after the other. Instead, you split the render into multiple blocks of frames and do them in parallel. The -s and -e below are start and end frames.

    For a comparison, first run a sequence in a single pass in a single command-line, change the names and folder locations to match your own setup, you can drag files into the command-line and use tab for auto-complete:

    "/Applications/After Effects CC/aerender" -project ~/Desktop/Animation.ae -comp "Overlays" -RStemplate "Best Settings" -OMtemplate "PNG Sequence" -s 1 -e 100 -output ~/Desktop/animation[####].png

    Then try splitting the task into 4 blocks in 4 separate command-lines, just copy/paste into each window and adjust the numbers, it's best to do the changes in a text editor then copy/paste into the command-line:

    "/Applications/After Effects CC/aerender" -project ~/Desktop/Animation.ae -comp "Overlays" -RStemplate "Best Settings" -OMtemplate "PNG Sequence" -s 1 -e 24 -output ~/Desktop/animation[####].png
    "/Applications/After Effects CC/aerender" -project ~/Desktop/Animation.ae -comp "Overlays" -RStemplate "Best Settings" -OMtemplate "PNG Sequence" -s 25 -e 49 -output ~/Desktop/animation[####].png
    "/Applications/After Effects CC/aerender" -project ~/Desktop/Animation.ae -comp "Overlays" -RStemplate "Best Settings" -OMtemplate "PNG Sequence" -s 50 -e 74 -output ~/Desktop/animation[####].png
    "/Applications/After Effects CC/aerender" -project ~/Desktop/Animation.ae -comp "Overlays" -RStemplate "Best Settings" -OMtemplate "PNG Sequence" -s 75 -e 100 -output ~/Desktop/animation[####].png

    The latter 4 processes should be much faster than the single pass. You have to be mindful of memory usage, because each process allocates its own memory, and not all parts of a sequence are equally intensive, so choose the splits accordingly; you can experiment with which split gives the best results. Adobe should really have a UI for aerender to allow you to quit AE and spool up multiple processes for a project.
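
    If you don't want to juggle four Terminal windows, a small script can launch the frame blocks in parallel and wait for them all to finish. The sketch below is Python rather than anything Adobe ships, and it reuses the same placeholder project, comp, and template names from the examples above, so adjust the paths and names to your own install:

    import subprocess
    from pathlib import Path

    # Launch aerender on several frame blocks in parallel.
    # The aerender path, project, comp and template names are the placeholders
    # from the examples above -- change them to match your own setup.
    AERENDER = "/Applications/After Effects CC/aerender"
    PROJECT = str(Path.home() / "Desktop" / "Animation.ae")
    OUTPUT = str(Path.home() / "Desktop" / "animation[####].png")

    blocks = [(1, 24), (25, 49), (50, 74), (75, 100)]  # start/end frames per process

    procs = []
    for start, end in blocks:
        cmd = [AERENDER,
               "-project", PROJECT,
               "-comp", "Overlays",
               "-RStemplate", "Best Settings",
               "-OMtemplate", "PNG Sequence",
               "-s", str(start), "-e", str(end),
               "-output", OUTPUT]
        procs.append(subprocess.Popen(cmd))  # each block renders concurrently

    for p in procs:
        p.wait()  # wait for every block before declaring the render done
    print("All frame blocks rendered.")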
    edited June 2017