Thunderbolt 3 for External Radeon via AMD XConnect Tech
http://anandtech.com/show/10133/amd-xconnect-external-radeons
Thunderbolt 3 external graphics gives me hope of using these systems with Apple hardware.
Whether on a MacBook or a Mac Mini, I'd love to see a kext added to OS X to leverage this out of the box as a BTO option for both. Hell, I'd love to see it as a BTO option across the board, from the iMac to the Mac Pro.
Comments
Or Apple could just give us the midrange small tower model we've been asking for.
Typical mobile GPUs today come in around a GTX 950M, which gets above 90 FPS at low quality in Star Wars Battlefront. That's at a lower resolution, though: a GPU would need to be about 3x faster for low quality at Oculus resolution, and another 3-4x on top of that for high/ultra:
http://www.notebookcheck.net/NVIDIA-GeForce-GTX-950M.138026.0.html
The requirement is lower for less demanding games; Fifa 16 in VR would only need about 1.5x current laptops. Looking back at older hardware, the 2010 laptop GPUs were around 5x faster than the 2005 models, and the 2015 models are similarly around 5x the 2010 models: roughly 25x in 10 years. The GPUs launching this year are aiming for double the performance of existing GPUs, so expect a 50-100% jump.
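The arithmetic above can be sketched out; all the multipliers are the rough figures assumed in this comment, not measured benchmarks:

```python
# Rough VR performance-scaling arithmetic (the comment's own assumed
# multipliers, not benchmark data).

baseline = 1.0                          # ~GTX 950M-class mobile GPU today
low_quality_vr = 3 * baseline           # ~3x for low quality at Oculus resolution
high_quality_vr = low_quality_vr * 3.5  # another 3-4x for high/ultra (~10x total)

# Historical trend: ~5x every 5 years => ~25x per decade
per_5_years = 5
per_decade = per_5_years ** 2

print(f"High/ultra VR needs ~{high_quality_vr:.0f}x today's mobile GPUs")
print(f"Historical mobile GPU scaling: ~{per_decade}x per decade")
```

At ~25x per decade, even the ~10x gap to high/ultra VR closes within a few GPU generations, which is the point being made about minimum requirements falling.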
There will be ways to optimize VR rendering too, and console manufacturers will have to do this; they're in a better position to promote it. Right now games render a separate view per eye, the lazy way, because the VR manufacturers don't make the games. But game or engine developers can optimize their rendering so they don't compute shadows twice for points both eyes can see. That wouldn't necessarily halve the requirement, but it can help a lot. The engine would draw the left eye and, at a particular sample, check whether the other eye can see the same point; if so, it does a second, faster computation and stores that pixel in the right eye's framebuffer without recomputing everything.
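The idea can be sketched as a toy loop; everything here (the parallax test, the costs, the frame layout) is an illustrative assumption, not any real engine's API:

```python
# Toy sketch of sharing shading work between the two eyes. Shade the left
# eye fully; where the right eye sees the same point, do a cheap fix-up
# instead of recomputing lighting/shadows from scratch. All numbers and
# names are illustrative assumptions.

WIDTH = 64
scene_depth = [1.0 + (x % 8) for x in range(WIDTH)]  # toy depth buffer

def visible_to_right_eye(x, depth):
    # Toy parallax test: close surfaces shift more between the eyes and
    # can fall outside the right eye's frame.
    shift = int(8.0 / depth)
    return 0 <= x - shift < WIDTH

full_shades = 0   # expensive shades (lighting, shadows) computed
cheap_reuses = 0  # right-eye pixels filled by reusing the left eye's work

left_frame = []
right_frame = [None] * WIDTH
for x in range(WIDTH):
    left_frame.append(("full", x))      # stand-in for an expensive shade
    full_shades += 1
    if visible_to_right_eye(x, scene_depth[x]):
        right_frame[x] = ("reused", x)  # cheap second computation
        cheap_reuses += 1

for x in range(WIDTH):                  # shade right-eye-only pixels fully
    if right_frame[x] is None:
        right_frame[x] = ("full", x)
        full_shades += 1

print(f"full shades: {full_shades} (vs {2 * WIDTH} for naive per-eye rendering)")
```

Most pixels are visible to both eyes, so the expensive shade count stays close to one eye's worth rather than two.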
Interlacing is another way to improve framerate on lower-end hardware: render the odd lines on one frame and the even lines on the next, doubling the perceived framerate. These techniques can help lower the minimum requirement.
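A minimal sketch of that alternation, with row rendering reduced to a stand-in function:

```python
# Toy interlaced renderer: render odd rows on one frame, even rows on the
# next, reusing the previous frame's other half. Per-frame work is halved.

HEIGHT = 8

def render_row(y, frame_no):
    return f"pixels(y={y}, t={frame_no})"  # stand-in for real row rendering

def interlaced_frame(prev, frame_no):
    parity = frame_no % 2  # 0: even rows fresh, 1: odd rows fresh
    out = []
    for y in range(HEIGHT):
        if y % 2 == parity:
            out.append(render_row(y, frame_no))           # freshly rendered
        else:
            out.append(prev[y] if prev else render_row(y, frame_no))
    return out

frame0 = interlaced_frame(None, 0)
frame1 = interlaced_frame(frame0, 1)
# In frame1, even rows are carried over from frame0; odd rows are new.
```

The trade-off is the usual one for interlacing: combing artifacts on fast motion in exchange for half the fill work per frame.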
The 2016 higher-end 27" iMacs and Mac Pro should be well within the current VR requirements. The Oculus Rift only ships at the end of this month (it's been developer kits so far), so this is pretty new technology.
Most people buy low-end machines (laptops), this is true of the PC industry too with average selling prices of $500. The external GPUs will allow the ultraportable laptop owners to get the high-end 3D experience without the expense and they'll help with OpenCL apps. For higher-end machines like the iMac and Mac Pro, I don't think it'll be needed much. The VR hardware is expensive so adoption will take a while.
It's nice to see AMD adopting Thunderbolt; they resisted it to begin with. It opens up the GPU market to far more people than PCIe desktops ever could, and it's consumer-friendly: you don't have to buy a card and install it, you buy a finished box that you just plug in, people can trade and resell old boxes, and you could even plug in multiple GPUs. It should be possible for Mac Pro owners to plug in up to 6 dedicated GPUs in their own enclosures for a total of 8 GPUs. That would draw a fair bit of electricity (over 2kW), but Bitcoin mining, scientific computing or VFX would get some use out of it.
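The "over 2kW" figure is easy to sanity-check; the per-card and overhead wattages below are assumptions (real draw varies by card and load):

```python
# Back-of-envelope power budget for a Mac Pro plus 6 external boxes
# totalling 8 GPUs. Wattages are assumed round numbers, not specs.

gpus = 8
watts_per_gpu = 275   # assumed high-end card under load
host_overhead = 200   # assumed draw for the host machine itself

total_watts = gpus * watts_per_gpu + host_overhead
print(f"~{total_watts} W total")
```

That lands around 2.4 kW, consistent with the "over 2kW" estimate, and well past what a single household circuit comfortably supplies in many countries.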
If the Mac Mini still had a quad i7 model, coupling that with a TB GPU would be equivalent to a PC gamer box. I was hoping for this sort of thing in 2011:
AMD is going to have mobile GPU options, so the external boxes don't need to be so big. They wouldn't need to be much larger than a power brick, and they could obviously be integrated into displays. Having sub-$500 options will help drum up interest. Putting the GPU in the display can also fix display tearing, and it allows higher bandwidth for high-resolution displays.
As for ballooning performance needs, that just means computers can keep getting smaller while the external GPU keeps up.
While the total cost could be comparable to a dedicated PC, that wouldn't matter for non-gaming use, and some people will want one computer for everything. Thunderbolt PCIe boxes sell for ~$200, so that's the premium on top of the GPU card; if a retailer sells the card and box together, they only need to make a margin on the whole unit.
I could see display manufacturers using this. AMD could partner with LG, for example, to offer a UHD gaming display for laptop owners: they just plug in their laptops via Thunderbolt. It saves cable clutter because the display's power supply would power the GPU; you'd have just the TB cable to the display and the power cable from the display. The laptop could either be powered by the display (since it wouldn't be using much power) or have its own power cable. It could make the display obsolete more quickly, but as long as it had a reasonably powerful GPU it would last 3 years or so, and swapping out a display isn't a huge issue.
As internal GPUs get more powerful, these external options become less appealing. If someone bought a low-end laptop ~$1000 or Mac Mini ~$500-600, then a $400-600 GPU for computing/gaming could be ok value for money vs getting a $2k laptop or iMac. People who already buy the higher-end laptops, iMac or Mac Pro wouldn't need it.