Thunderbolt 3 for External Radeon via AMD XConnect Tech

Posted in Future Apple Hardware
http://anandtech.com/show/10133/amd-xconnect-external-radeons

Thunderbolt 3 external graphics gives me hope of using these systems with Apple hardware.

Whether on a MacBook or a Mac Mini, I'd love to see a kext added to OS X to leverage this out of the box as a BTO option for both. Hell, I'd love to see it as a BTO across the board, from iMac to Mac Pro.

Comments

  • Reply 1 of 7
    wizard69 Posts: 13,377, member
    I may be the odd man out, but I really don't get the desire for these external GPUs. At least on Mac OS, it's just a waste of money. Cost-wise, you would be better off installing Windows on a purpose-built gaming desktop.


  • Reply 2 of 7
    hmurchison Posts: 12,449, member
    I see this as an essential area. Apple clearly wants to keep their computers small, and the best way to do that without heat issues is high-bandwidth external connections. Plus, with 4K and even 8K coming, we are going to need as much compute power as possible.

  • Reply 3 of 7
    nbhms Posts: 9, member
    It would also provide a way for Apple to hop on the VR bandwagon. As it stands, no currently shipping Apple hardware can drive the Oculus; even the Mac Pro doesn't meet its minimum spec. Supporting this would let Apple keep their current GPU chipsets as they are, while allowing customers who want to drive virtual-reality setups to buy an external box.

    Or Apple could just give us the midrange small tower model we've been asking for.
  • Reply 4 of 7
    Marvin Posts: 15,559, moderator
    nbhms said:
    It would also provide a way for Apple to hop on the VR bandwagon. As it stands, no currently shipping Apple hardware can drive the Oculus; even the Mac Pro doesn't meet its minimum spec. Supporting this would let Apple keep their current GPU chipsets as they are, while allowing customers who want to drive virtual-reality setups to buy an external box.

    Or Apple could just give us the midrange small tower model we've been asking for.
    Lower-performance hardware can drive the Oculus; the frame rate just wouldn't keep up in high-end games without lowering the graphics quality. Games need to run at around 90fps at a high resolution. Star Wars Battlefront is probably a good benchmark to use, due to its high-quality visuals, heavy action and expansive outdoor scenery.
    Typical mobile GPUs today come in at around a 950M, which gets above 90fps on low quality in Star Wars Battlefront. That's at a lower resolution, though: it would need to be about 3x faster for low quality at Oculus resolution, and another 3-4x for high/ultra:

    http://www.notebookcheck.net/NVIDIA-GeForce-GTX-950M.138026.0.html
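    Those multipliers compound; as a back-of-envelope check (the 3x and 3-4x factors are the estimates above, not measured benchmarks):

```python
# Rough multipliers from the estimates above: a 950M-class mobile GPU
# manages low quality at a lower resolution; Oculus resolution at low
# quality needs ~3x that, and high/ultra needs another 3-4x on top.
low_at_oculus = 3                      # vs. today's typical mobile GPU
ultra_at_oculus = (low_at_oculus * 3,  # lower bound: 9x
                   low_at_oculus * 4)  # upper bound: 12x
print(ultra_at_oculus)                 # (9, 12)
```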

    The requirement is lower for lower-quality games; a FIFA 16 VR would only need about 1.5x current laptops. Looking back at GPUs years ago, the 2010 laptop GPUs were around 5x faster than the 2005 models, and the 2015 models are similarly around 5x the 2010 models: ~25x in 10 years. The GPUs launching this year are aiming to double the performance of existing parts, so expect a 50-100% gain.
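    Taking that ~5x-per-five-years figure at face value, the compounding works out as:

```python
# Compounding the ~5x-per-5-years estimate above (estimate, not benchmark).
perf_2005 = 1.0
perf_2010 = perf_2005 * 5     # 2010 laptop GPUs ~5x the 2005 models
perf_2015 = perf_2010 * 5     # 2015 models ~5x the 2010 models
print(perf_2015 / perf_2005)  # 25.0 -- the ~25x in 10 years
```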

    There will be ways to optimize VR rendering too. Console manufacturers will have to do this, and they are in a better position to promote it. Right now, games have to do it the lazy way, rendering a separate view per eye, because the VR manufacturers don't make the games. But game or engine developers can optimize their rendering so that they don't have to compute shadows twice for points both eyes can see. That wouldn't necessarily halve the requirement, but it can help a lot. The engine would start drawing the left eye and, at a particular sample, check whether the other eye can see it; if so, it would do a second, faster computation and store that pixel in the right eye's framebuffer without recomputing everything.
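    A minimal sketch of that shading-reuse idea (all names here are made up for illustration; a real engine would do this on the GPU with reprojection):

```python
from dataclasses import dataclass

@dataclass(frozen=True)
class Sample:
    left_px: int     # pixel index in the left-eye framebuffer
    right_px: int    # where the point lands in the right eye, -1 if occluded
    base: float      # stand-in for the expensive shading inputs

def shade_full(s):           # stand-in for full lighting/shadow work
    return s.base * 2.0

def shade_cheap(s, color):   # cheap view-dependent fix-up for the right eye
    return color

def render_stereo(samples):
    left, right = {}, {}
    for s in samples:
        color = shade_full(s)                 # expensive work, done once
        left[s.left_px] = color
        if s.right_px >= 0:                   # point visible to both eyes:
            right[s.right_px] = shade_cheap(s, color)  # reuse, don't recompute
    return left, right

left, right = render_stereo([Sample(0, 0, 1.0), Sample(1, -1, 2.0)])
print(left)   # {0: 2.0, 1: 4.0}
print(right)  # {0: 2.0} -- the occluded sample was shaded for the left eye only
```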

    Interlacing is another way to improve frame rate on lower-end hardware: render the odd lines on one frame and the even lines on the next, doubling the perceived frame rate. These techniques can help lower the minimum requirement.
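    The interlacing scheme can be sketched as:

```python
# Interlacing sketch: each frame renders only every other scanline,
# halving per-frame pixel work while doubling the perceived frame rate.
def rows_to_render(frame_index, height):
    start = frame_index % 2   # even frames: rows 0,2,4...; odd frames: 1,3,5...
    return list(range(start, height, 2))

print(rows_to_render(0, 6))  # [0, 2, 4]
print(rows_to_render(1, 6))  # [1, 3, 5]
```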

    The 2016 higher-end 27" iMacs and Mac Pro should be well within the current VR requirements. The Oculus Rift only comes out at the end of this month (it has just been developer kits so far); this is pretty new technology.

    Most people buy low-end machines (laptops); this is true of the PC industry too, with average selling prices around $500. External GPUs will let ultraportable laptop owners get the high-end 3D experience without the expense, and they'll help with OpenCL apps. For higher-end machines like the iMac and Mac Pro, I don't think it'll be needed much. The VR hardware is expensive, so adoption will take a while.

    It's nice to see AMD adopting Thunderbolt; they resisted using it to begin with. It opens up the GPU market to far more people than PCIe desktops ever could, and it makes GPUs consumer-friendly: you don't have to buy a card to install, you can buy a finished box that you just plug in, people can trade and resell old boxes, and you could maybe even plug in multiple GPUs. It should be possible for Mac Pro owners to plug in up to 6 dedicated GPUs in their own enclosures, for a total of 8 GPUs. It would use a fair bit of electricity (over 2kW), but Bitcoin mining, scientific computing or VFX would get some use out of it.
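    The 2kW figure is plausible if you assume roughly 250-300W per high-end card (an assumed range, not a spec; actual draw varies by model):

```python
# 8 GPUs at an assumed 250-300 W each, before counting the host machine.
gpus = 8
low, high = gpus * 250, gpus * 300
print(low, high)  # 2000 2400 -- consistent with "over 2kW" once overhead is added
```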

    If the Mac Mini still had a quad i7 model, coupling it with a TB GPU would be the equivalent of a PC gamer box. I was hoping for this sort of thing back in 2011.


    AMD is going to have mobile GPU options, so the external boxes don't need to be big; they wouldn't need to be much larger than a power brick, and they can obviously be integrated with displays. Having sub-$500 options will help build interest. Putting the GPU in a display can also fix display tearing, and it allows higher bandwidth for high-resolution displays.
  • Reply 5 of 7
    wizard69 Posts: 13,377, member
    I suspect that the general lack of interest in this thread highlights the general lack of interest at large in external GPUs. That means that if they do come, they will be an expensive solution for what you get.


    As to ballooning performance needs, that just means computers will get smaller to keep up.
  • Reply 6 of 7
    cnocbui Posts: 3,613, member
    wizard69 said:
    I suspect that the general lack of interest in this thread highlights the general lack of interest at large in external GPUs. That means that if they do come, they will be an expensive solution for what you get.


    As to ballooning performance needs, that just means computers will get smaller to keep up.
    The lack of interest in this thread simply reflects the AI user base, who, if they had an interest in high-end graphics, wouldn't be using anything made by Apple and so likely wouldn't be here. The external GPU option is likely to be more expensive than an entire dedicated gaming PC, so it's unlikely to get much traction.
  • Reply 7 of 7
    Marvin Posts: 15,559, moderator
    cnocbui said:
    wizard69 said:
    I suspect that the general lack of interest in this thread highlights the general lack of interest at large in external GPUs. That means that if they do come, they will be an expensive solution for what you get.

    As to ballooning performance needs, that just means computers will get smaller to keep up.
    The lack of interest in this thread simply reflects the AI user base, who, if they had an interest in high-end graphics, wouldn't be using anything made by Apple and so likely wouldn't be here. The external GPU option is likely to be more expensive than an entire dedicated gaming PC, so it's unlikely to get much traction.
    There's also a general fatigue in the tech community with hearing about things you can't buy yet. As soon as Thunderbolt came out in 2011, it was clear that it could support external GPUs the way ExpressCard slots did, but it never got support from the manufacturers. Here we are 5 years later, and this is the first official support it's getting. Apple would still need to add plug-and-play support, and there's not much reason for them to: they sell the latest GPUs when they update their product line.

    While the cost could be comparable to a dedicated PC, that wouldn't matter for non-gaming uses, and some people will want to use just one computer for everything. There are Thunderbolt PCIe boxes selling for ~$200, so that's the premium on top of the GPU card. If a retailer sells the card and box together, they only need to make a margin on the whole unit.

    I could see display manufacturers using this. AMD could partner with LG, for example, to offer a UHD gaming display for laptop owners: they just plug in their laptops via Thunderbolt. It saves cable clutter, because the display's power supply would power the GPU; you'd just have the TB cable to the display and the power cable from the display. The laptop could either be powered by the display (as it wouldn't be using much power) or have its own power cable. It could make the display obsolete more quickly, but as long as it had a reasonably powerful GPU it would last 3 years or so, and switching out a display isn't a huge issue.

    As internal GPUs get more powerful, these external options become less appealing. If someone bought a low-end laptop (~$1,000) or Mac Mini (~$500-600), then a $400-600 external GPU for computing/gaming could be decent value for money versus a $2k laptop or iMac. People who already buy the higher-end laptops, iMac or Mac Pro wouldn't need it.
    edited March 2016