Nvidia's Turing GPU architecture includes ray-tracing cores, 8K video playback support

Nvidia has revealed its new Turing architecture for GPUs, one that boasts dedicated cores for ray tracing. While consumer cards have yet to be announced, Quadro RTX Turing GPUs are already being touted for their ability to support real-time 8K video playback.

The Turing architecture, which the company calls its "greatest leap since the invention of the CUDA GPU" in 2006, gives a considerable boost to ray tracing, a rendering technique that can light a scene realistically, but at a high processing cost. As part of Turing, Nvidia has added "RT Cores" to accelerate ray tracing of scenes.
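To make the per-ray cost concrete, here is a minimal, self-contained sketch of the work behind a single primary ray: testing it against a sphere by solving a quadratic. A full renderer repeats this intersection test (plus shading and secondary bounces) for millions of rays per frame, which is the workload RT Cores accelerate in hardware. The scene and numbers below are purely illustrative, not Nvidia's implementation.

```python
import math

def ray_sphere_hit(origin, direction, center, radius):
    """Return the distance t to the nearest intersection of a ray
    with a sphere, or None if the ray misses.

    Solves |origin + t*direction - center|^2 = radius^2, a quadratic in t.
    """
    ox, oy, oz = (origin[i] - center[i] for i in range(3))
    dx, dy, dz = direction
    a = dx * dx + dy * dy + dz * dz
    b = 2.0 * (ox * dx + oy * dy + oz * dz)
    c = ox * ox + oy * oy + oz * oz - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)  # nearer of the two roots
    return t if t > 0 else None  # ignore hits behind the ray origin

# One primary ray from the camera at the origin, aimed down -z at a
# unit sphere centered 5 units away.
hit = ray_sphere_hit((0, 0, 0), (0, 0, -1), (0, 0, -5), 1.0)
print(hit)  # 4.0: the ray enters the sphere 4 units from the camera
```

Tracing a scene means running tests like this for every ray against many objects (usually via an acceleration structure), which is why dedicated hardware makes such a large difference.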

At the same time, new Tensor Cores are included for AI processing, including neural network training and inference. By combining the two, Turing GPUs are said to render ray-traced scenes up to 25 times faster than Pascal-based cards.

The new Turing Streaming Multiprocessor architecture offers up to 4,608 CUDA cores, and is capable of up to 16 trillion floating-point operations per second in parallel with 16 trillion integer operations per second. It is also the first implementation of Samsung's new 16GB GDDR6 memory modules.

As with earlier graphics cards, it can connect to a second card using NVLink, allowing video memory to scale up to a capacity of 96GB, with data transfers of up to 100GB per second.

On the back, the cards add support for USB Type-C connections and VirtualLink, an open standard under development that could allow VR headsets to run off a single cable. VirtualLink seeks to combine power, two-way data transfers, and video feeds through one slim cable and connection, rather than the multi-cable system used in current-generation VR headsets.

Revealed at SIGGRAPH, the Quadro RTX 8000, 6000, and 5000 professional GPUs are meant for use in industries with demanding visual computing workloads, such as video editing, visual effects for TV and movies, and design fields.


Nvidia worked with high-resolution cinema camera producer RED to make it possible for video producers to work with 8K-resolution video without bottlenecks, promising greater than 24 frames per second in real time using a single-processor computer and one Quadro RTX GPU.

With current cards, video editors working in 8K have to view footage at a reduced resolution if they wish to see it without dropped frames or stuttering. With Turing, the processing of video from RED's cameras can be offloaded to the GPU, allowing footage to be previewed at full resolution without stuttering or prior downsampling.

The Quadro RTX 8000 is the highest-specification card, with 48GB of memory, 4,608 CUDA cores, 576 Tensor cores, and the ability to process ray-tracing scenes at 10 GigaRays per second. The Quadro RTX 6000 has half the memory, but is said to have the same number of CUDA cores, Tensor cores, and ray-tracing processing capabilities.

The lowest-specification card, the RTX 5000, has 16GB of memory, 3,072 CUDA cores, 384 Tensor cores, and can produce 6 GigaRays per second.
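For a sense of scale, the quoted GigaRays-per-second figures can be converted into a rough per-pixel ray budget. The quick calculation below assumes an illustrative 4K (3840x2160) target at 60 frames per second; that workload is our assumption for the arithmetic, not a quoted Nvidia benchmark.

```python
# Back-of-the-envelope ray budget from a GigaRays/sec figure.
# Rays available per pixel per frame = total rays/sec / (pixels * fps).
def rays_per_pixel(gigarays_per_sec, width, height, fps):
    return gigarays_per_sec * 1e9 / (width * height * fps)

# Quadro RTX 8000/6000 at 10 GigaRays/sec, hypothetical 4K at 60 fps:
print(round(rays_per_pixel(10, 3840, 2160, 60), 1))  # ~20.1 rays per pixel

# Quadro RTX 5000 at 6 GigaRays/sec, same hypothetical workload:
print(round(rays_per_pixel(6, 3840, 2160, 60), 1))   # ~12.1 rays per pixel
```

A budget of tens of rays per pixel is why real-time ray tracing is typically paired with denoising rather than the thousands of rays per pixel used in offline film rendering.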

While the announcements so far have been about Turing's use in a professional setting, Nvidia is expected to provide some details about its consumer-focused releases using the architecture soon, most likely during its Gamescom event on August 20.

Few details about the cards Nvidia will launch have been released, but WCCFTech notes that a teaser video for the GeForce RTX series graphics cards hints at a card called the GeForce RTX 2080. Specifications are unknown, but it is likely to include the same ray-tracing capabilities as the Quadro RTX range.

Mac users who want to take advantage of Nvidia's new architecture will be able to do so using an eGPU enclosure, but with some effort. Support for AMD's GPUs in external enclosures is included natively in macOS High Sierra; Nvidia cards don't have the same support, but they can be made to work using some community-produced scripts and tools.

Comments

  • Reply 1 of 22
    so.. found this on another site..

    "Usernames like RoyTeX (RTX), Not_11, Mac-20, and Eight Tee all seem to spell out RTX 2080, a potential name for the new Nvidia card."

    any hope that "Mac-20" might refer to a Mac based card coming??

    heres to hoping
    watto_cobra
  • Reply 2 of 22
Mike Wuerthele Posts: 6,858 administrator
    lilred505 said:
    so.. found this on another site..

    "Usernames like RoyTeX (RTX), Not_11, Mac-20, and Eight Tee all seem to spell out RTX 2080, a potential name for the new Nvidia card."

    any hope that "Mac-20" might refer to a Mac based card coming??

    heres to hoping
    Well, the Nvidia line prior to this one works now with the web driver package -- assuming you've got a Mac with PCI-E slots.
    edited August 2018
  • Reply 3 of 22
    lilred505 said:
    so.. found this on another site..

    "Usernames like RoyTeX (RTX), Not_11, Mac-20, and Eight Tee all seem to spell out RTX 2080, a potential name for the new Nvidia card."

    any hope that "Mac-20" might refer to a Mac based card coming??

    heres to hoping
    Well, the Nvidia line prior to this one works now with the web driver package -- assuming you've got a Mac with PCI-E slots.
    Works isn't completely true.. if I can't use my system as normal, upgrade the os etc... then I don't consider it really working. Too many what ifs currently.  I am currently running a 1080ti in my cMP. So I understand all the limitations.
    What I am hoping for is native GPU support with boot support in the firmware of the card.  No flashing needed, like the 680. Also, Plug one into an EGPU and it just works..  I understand this is also up to apple.  We can hope apple has slapped Nvidia enough and has come back to their senses and use the superior GPU.
    watto_cobra
  • Reply 4 of 22
    I have a mid 2010 mac pro with a gtx 1070 pc card in it with nvidia web drivers and it works flawlessly. Two points, you dont see the Apple boot loader on screen in the beginning and you need to be running 10.12 or 10.13 for this to work.
    watto_cobra
  • Reply 5 of 22
cpenzone Posts: 114 member
    Does anyone know if this would help accelerate the ray tracing feature in Photoshop 3D? Like, I mean, significantly?  It's so slow... I've tried iMac, Mac Pro, and a Macbook Pro and nothing seems to really improve it significantly. 

    I know there are other faster 3D programs but we are an Adobe shop and for whatever reason I just understand how their tool works. I've tried the others and just can't get past the learning curve.
    watto_cobra
  • Reply 6 of 22
Mike Wuerthele Posts: 6,858 administrator
    lilred505 said:
    lilred505 said:
    so.. found this on another site..

    "Usernames like RoyTeX (RTX), Not_11, Mac-20, and Eight Tee all seem to spell out RTX 2080, a potential name for the new Nvidia card."

    any hope that "Mac-20" might refer to a Mac based card coming??

    heres to hoping
    Well, the Nvidia line prior to this one works now with the web driver package -- assuming you've got a Mac with PCI-E slots.
    Works isn't completely true.. if I can't use my system as normal, upgrade the os etc... then I don't consider it really working. Too many what ifs currently.  I am currently running a 1080ti in my cMP. So I understand all the limitations.
    What I am hoping for is native GPU support with boot support in the firmware of the card.  No flashing needed, like the 680. Also, Plug one into an EGPU and it just works..  I understand this is also up to apple.  We can hope apple has slapped Nvidia enough and has come back to their senses and use the superior GPU.
    Oh, I see what you're saying. Yeah, while you CAN get a Nvidia card in an eGPU, it is a pain.
    watto_cobra
  • Reply 7 of 22
tipoo Posts: 1,141 member
    A Turing GPU would certainly be one way the Mac Pro could differentiate itself from the iMac Pro. If Apple and Nvidia got over their battle for control over the universe. 
    lilred505watto_cobra
  • Reply 8 of 22
    lilred505 said:
    lilred505 said:
    so.. found this on another site..

    "Usernames like RoyTeX (RTX), Not_11, Mac-20, and Eight Tee all seem to spell out RTX 2080, a potential name for the new Nvidia card."

    any hope that "Mac-20" might refer to a Mac based card coming??

    heres to hoping
    Well, the Nvidia line prior to this one works now with the web driver package -- assuming you've got a Mac with PCI-E slots.
    Works isn't completely true.. if I can't use my system as normal, upgrade the os etc... then I don't consider it really working. Too many what ifs currently.  I am currently running a 1080ti in my cMP. So I understand all the limitations.
    What I am hoping for is native GPU support with boot support in the firmware of the card.  No flashing needed, like the 680. Also, Plug one into an EGPU and it just works..  I understand this is also up to apple.  We can hope apple has slapped Nvidia enough and has come back to their senses and use the superior GPU.
    Oh, I see what you're saying. Yeah, while you CAN get a Nvidia card in an eGPU, it is a pain.
    Its not a pain, all you have to do is make sure the nvidia webdriver is available before doing a security update. Again this is in a Mac Pro model
    edited August 2018 watto_cobra
  • Reply 9 of 22
Mike Wuerthele Posts: 6,858 administrator
    jmey267 said:
    lilred505 said:
    lilred505 said:
    so.. found this on another site..

    "Usernames like RoyTeX (RTX), Not_11, Mac-20, and Eight Tee all seem to spell out RTX 2080, a potential name for the new Nvidia card."

    any hope that "Mac-20" might refer to a Mac based card coming??

    heres to hoping
    Well, the Nvidia line prior to this one works now with the web driver package -- assuming you've got a Mac with PCI-E slots.
    Works isn't completely true.. if I can't use my system as normal, upgrade the os etc... then I don't consider it really working. Too many what ifs currently.  I am currently running a 1080ti in my cMP. So I understand all the limitations.
    What I am hoping for is native GPU support with boot support in the firmware of the card.  No flashing needed, like the 680. Also, Plug one into an EGPU and it just works..  I understand this is also up to apple.  We can hope apple has slapped Nvidia enough and has come back to their senses and use the superior GPU.
    Oh, I see what you're saying. Yeah, while you CAN get a Nvidia card in an eGPU, it is a pain.
    Its not a pain, all you have to do is make sure the nvidia webdriver is available before doing a security update. Again this is in a Mac Pro model
    "Yeah, while you CAN get a Nvidia card in an eGPU, it is a pain."

    PCI-E Mac Pro - not a pain. eGPU - pain.
    edited August 2018 watto_cobra
  • Reply 10 of 22
mcdave Posts: 1,927 member
    cpenzone said:
    Does anyone know if this would help accelerate the ray tracing feature in Photoshop 3D? Like, I mean, significantly?  It's so slow... I've tried iMac, Mac Pro, and a Macbook Pro and nothing seems to really improve it significantly. 

    I know there are other faster 3D programs but we are an Adobe shop and for whatever reason I just understand how their tool works. I've tried the others and just can't get past the learning curve.
    Can’t or won’t? Sounds like you need to change your staff.
  • Reply 11 of 22
mcdave Posts: 1,927 member
    Good on Nvidia - RayTracing is the nirvana graphics tech.  If Apple have any play in this area they’d better get it into product but they’ve denied RT twice; they backed AMD over Nvidia, then they turned their back on Imagination Tech who had a trimmed RT tech for mobile at 1% of power draw.  Shockingly bad mgt decisions - Jobs is truly dead.  No RT = No future in graphics.  The real bugger is; AR needs this tech so I guess they’ve dropped that market too.
  • Reply 12 of 22
    Apple needs to get over itself and fully support Nvidia in eGPU (at least!) if they want pros to take the Mac platform seriously. Whatever reasons they have to shun Nvidia are not good enough to warrant blocking Mac pros from the most widely-supported graphics hardware on the market. Just figure it out already! 
  • Reply 13 of 22
wizard69 Posts: 13,377 member
    mcdave said:
    Good on Nvidia - RayTracing is the nirvana graphics tech.  If Apple have any play in this area they’d better get it into product but they’ve denied RT twice; they backed AMD over Nvidia, then they turned their back on Imagination Tech who had a trimmed RT tech for mobile at 1% of power draw.  Shockingly bad mgt decisions - Jobs is truly dead.  No RT = No future in graphics.  The real bugger is; AR needs this tech so I guess they’ve dropped that market too.
    Well that is one way to look at it.   However if Apple is as deep into AR as it looks you might think that they are building a GPU with their own ray tracing support.   In fact getting support up and running for Apple targeted technologies in their GPU is probably a very high priority.  Oh yes this could mean an Apple designed discreet GPU chip.    It is a possibility and frankly they have to do something with all of those AMD engineers they hired.
    watto_cobra
  • Reply 14 of 22
Mike Wuerthele Posts: 6,858 administrator
    wizard69 said:
    mcdave said:
    Good on Nvidia - RayTracing is the nirvana graphics tech.  If Apple have any play in this area they’d better get it into product but they’ve denied RT twice; they backed AMD over Nvidia, then they turned their back on Imagination Tech who had a trimmed RT tech for mobile at 1% of power draw.  Shockingly bad mgt decisions - Jobs is truly dead.  No RT = No future in graphics.  The real bugger is; AR needs this tech so I guess they’ve dropped that market too.
    Well that is one way to look at it.   However if Apple is as deep into AR as it looks you might think that they are building a GPU with their own ray tracing support.   In fact getting support up and running for Apple targeted technologies in their GPU is probably a very high priority.  Oh yes this could mean an Apple designed discreet GPU chip.    It is a possibility and frankly they have to do something with all of those AMD engineers they hired.
    Apple IS designing GPUs. It has been for over a year.

    watto_cobra
  • Reply 15 of 22
macxpress Posts: 5,801 member
    jmey267 said:
    lilred505 said:
    lilred505 said:
    so.. found this on another site..

    "Usernames like RoyTeX (RTX), Not_11, Mac-20, and Eight Tee all seem to spell out RTX 2080, a potential name for the new Nvidia card."

    any hope that "Mac-20" might refer to a Mac based card coming??

    heres to hoping
    Well, the Nvidia line prior to this one works now with the web driver package -- assuming you've got a Mac with PCI-E slots.
    Works isn't completely true.. if I can't use my system as normal, upgrade the os etc... then I don't consider it really working. Too many what ifs currently.  I am currently running a 1080ti in my cMP. So I understand all the limitations.
    What I am hoping for is native GPU support with boot support in the firmware of the card.  No flashing needed, like the 680. Also, Plug one into an EGPU and it just works..  I understand this is also up to apple.  We can hope apple has slapped Nvidia enough and has come back to their senses and use the superior GPU.
    Oh, I see what you're saying. Yeah, while you CAN get a Nvidia card in an eGPU, it is a pain.
    Its not a pain, all you have to do is make sure the nvidia webdriver is available before doing a security update. Again this is in a Mac Pro model
    I used to do this on my 2012 MacPro and it would never let me update to the latest NVIDIA driver until I ran the OS update (or Security Update), which is a catch22 situation. So I always had to pop in the standard ATI card that was Mac compatible, run the OS or Security Update and then the NVIDIA update, then I could swap back to the NVIDIA card. Pain in the ass is what it is. All that to run a non-supported video card. 

    Also what would happen is NVIDIA is sometimes SLOOWWWWW to update the driver so if you forgot and ran the OS or Security update you could be screwed for days until an update was released by NVIDIA unless you put the Mac supported card back in. 
    edited August 2018 watto_cobra
  • Reply 16 of 22
mcdave Posts: 1,927 member
    wizard69 said:
    mcdave said:
    Good on Nvidia - RayTracing is the nirvana graphics tech.  If Apple have any play in this area they’d better get it into product but they’ve denied RT twice; they backed AMD over Nvidia, then they turned their back on Imagination Tech who had a trimmed RT tech for mobile at 1% of power draw.  Shockingly bad mgt decisions - Jobs is truly dead.  No RT = No future in graphics.  The real bugger is; AR needs this tech so I guess they’ve dropped that market too.
    Well that is one way to look at it.   However if Apple is as deep into AR as it looks you might think that they are building a GPU with their own ray tracing support.   In fact getting support up and running for Apple targeted technologies in their GPU is probably a very high priority.  Oh yes this could mean an Apple designed discreet GPU chip.    It is a possibility and frankly they have to do something with all of those AMD engineers they hired.
    I really hope they are but I can’t see any evidence.  I’m not seeing GPU_family_5 crop up anywhere, only hints at a major performance boost in the “A12” Geekbench compute figures.  They showcased expanded Metal support for Ray Tracing at WWDC but unless they peeked at Imgtec’s WizardRTUs and thought they could do better without licensing there’s little indication (when is there ever?)

    RT is a broad-reaching tech across many form-factors and industries.  I’d like to think they could target/leapfrog PC gaming on the new MacBooks (I’m no gamer) with an A12X variant, and that’s why Fortnite isn’t Metal yet.  But part of me thinks they’ll come in at a lower TDP for the AR Glasses that Magic Leap should have been.
    watto_cobra
  • Reply 17 of 22
tallest skil Posts: 43,388 member
    How about EFI boot menu support so that… ah, who am I kidding. They won’t care about supporting old Macs if Apple doesn’t even care. And the market isn’t big enough for them to bother. I’m still not sure if Mojave will run on my Mac Pro, and until we know what the next model will allow regarding PCIe cards… I’d hope Apple would support PCIe 5.0, but knowing them it’ll be 4.0 or even just 3.

    On a totally unrelated note, suicide’s looking better all the time.
    mcdave
  • Reply 18 of 22
ascii Posts: 5,936 member
    What an awesome piece of tech. I'm looking forward to playing the first games that use real time ray tracing.
  • Reply 19 of 22
tallest skil Posts: 43,388 member
    ascii said:
    What an awesome piece of tech. I'm looking forward to playing the first games that use real time ray tracing.
    Hasn’t that basically been vaporware since the early ‘90s? I’d love it if games weren’t just stripping all gameplay functionality in favor of “storytelling” first, because otherwise there’s no real point to even including more accurate lighting representations. For example, take a look at Hitman 2016. The game world is fairly gorgeous when it’s set up normally. But once you start interacting with it, you realize just how broken both the lighting engine and the mechanics are. We apparently have to (maybe not) use stopgap solutions to represent accurate lighting, such as “baking in” the surface “bounce” of lighting into the surfaces themselves. If you’re in a hallway and you shoot out a light, sure, the brightness of the hallway will go down. But if you shoot out ALL the lights, it doesn’t get dark. The “bounce” of all the lights that used to be there remains as a… what is it, the “UV map”?… on all the surfaces, and it looks like it’s just a cloudy day outside.

    Real darkness in games–never mind the gameplay mechanics to take advantage of stealth in said darkness–is, for whatever reason, a very long way away. And developers don’t seem to care about it at all. It’s a real shame.
  • Reply 20 of 22
ascii Posts: 5,936 member
    ascii said:
    What an awesome piece of tech. I'm looking forward to playing the first games that use real time ray tracing.
    Hasn’t that basically been vaporware since the early ‘90s? I’d love it if games weren’t just stripping all gameplay functionality in favor of “storytelling” first, because otherwise there’s no real point to even including more accurate lighting representations. For example, take a look at Hitman 2016. The game world is fairly gorgeous when it’s set up normally. But once you start interacting with it, you realize just how broken both the lighting engine and the mechanics are. We apparently have to (maybe not) use stopgap solutions to represent accurate lighting, such as “baking in” the surface “bounce” of lighting into the surfaces themselves. If you’re in a hallway and you shoot out a light, sure, the brightness of the hallway will go down. But if you shoot out ALL the lights, it doesn’t get dark. The “bounce” of all the lights that used to be there remains as a… what is it, the “UV map”?… on all the surfaces, and it looks like it’s just a cloudy day outside.

    Real darkness in games–never mind the gameplay mechanics to take advantage of stealth in said darkness–is, for whatever reason, a very long way away. And developers don’t seem to care about it at all. It’s a real shame.
    You're right it's all smoke and mirrors with current games and not hard to see through the matrix.

    I watched the whole of Jensen Huang's keynote and it does seem like the age of vaporware might be coming to an end though. He showed real time raytracing for spinning industrial models/scenes at least, it remains to be seen how many fps a real game can get using this technology. 

    But just to have enough compute power to finally start modelling things properly (and therefore be on the path to photorealism) is exciting. Hopefully some new and interesting gameplay mechanics will follow! 