AppleInsider › Forums › Mac Hardware › Future Apple Hardware › Apple looks to new NVIDIA graphics for Final Cut

Apple looks to new NVIDIA graphics for Final Cut

post #1 of 24
Thread Starter 
While aiming at its usual audience of 3D modelers and other graphics pros with its new Quadro FX video cards, NVIDIA has let slip that Apple is eyeing the technology closely as a key to driving its video editing tools.

The Santa Clara-based GPU maker was eager to promote its just-announced Quadro FX 4600 and 5600 workstation cards early this week, calling them a perfect fit for video editing thanks to a much-improved unified architecture that does much of the heavier work for effects processing and video decoding.

In doing so, however, the company has inadvertently revealed a glimpse at both Apple's hardware and software strategy for its pro clients. NVIDIA Professional Solutions manager Jeff Brown used a conversation with Gizmodo to tout the list of video software developers lining up to investigate the usefulness of his company's breakthrough for editing work, mentioning Apple in connection with the new video chipsets.

"Image processing is the fundamental algorithm set that video editing guys use. And traditionally that has been very CPU-centric," Brown told the website. "And now we're starting to see more and more image processing moving to the GPU. So folks like Adobe, Apple, Avid are excited about this concept. It gives them much, much higher levels of performance."

While Apple's interest in the new workstation cards is already likely to be high -- the company has used the earlier Quadro FX 4500 in the current Mac Pro and last-generation Power Mac G5 -- Brown's statement revealed Cupertino's interest in the new pro-level video hardware and that hardware's long-term impact on the company's editing software.

Apple has been a relative pioneer in using the more advanced features of video cards for pro software rendering. Motion, released in 2004, set itself apart from other motion graphics editors (including Adobe's After Effects) by relying on the 3D prowess of newer graphics chips to render effects and live previews that would have been difficult solely on a CPU.

Aperture and iPhoto 6 also use the pixel shaders on newer video hardware for non-destructive still image editing.

However, Apple's centerpiece Final Cut Pro editor has so far depended exclusively on CPUs to draw live footage, rapidly scaling down the quality of live previews as an editing crew shifts to editing HD video or multiple streams. Offloading some or all of this task to a particularly flexible video card like the new Quadro would free Final Cut Pro to render multiple HD streams in real-time without sacrificing accuracy -- especially for the fledgling 2K and 4K video resolutions that frequently demand specialized hardware.

With Apple confirming a special event of its own at the NAB video expo just weeks after the NVIDIA announcement, the potential inherent in Brown's observations on the Mac maker's interest will likely be tested soon.
post #2 of 24
Quote:
Originally Posted by AppleInsider View Post

While aiming at its usual audience of 3D modelers and other graphics pros with its new Quadro FX video cards, NVIDIA has let slip that Apple is eyeing the technology closely as a key to driving its video editing tools.

The Santa Clara-based GPU maker was eager to promote its just-announced Quadro FX 4600 and 5600 workstation cards early this week, calling them a perfect fit for video editing thanks to a much-improved unified architecture that does much of the heavier work for effects processing and video decoding.

Makes a lot of sense
- the nVidia CUDA architecture looks great for high-end processing
- and if it's available on Macs, then it's going to make these really attractive for video editing/processing.


Perhaps the next line of MacBook Pros will also include nVidia GPUs to allow CUDA processing on these as well.
post #3 of 24
The comments look somewhat speculative to me, not necessarily a leak of information. If it were an unauthorized leak, then Apple would be willing to shoot themselves in the foot to penalize those who leak things like this.
post #4 of 24
Quote:
Originally Posted by JeffDM View Post

The comments look somewhat speculative to me, not necessarily a leak of information. If it were an unauthorized leak, then Apple would be willing to shoot themselves in the foot to penalize those who leak things like this.

I can see something like this coming: Intel/Apple/NVIDIA vs. AMD/ATI/MS.
post #5 of 24
Quote:
Originally Posted by JeffDM View Post

The comments look somewhat speculative to me, not necessarily a leak of information.

I thought that was the whole point of this site
- lots of somewhat unfounded speculation!
post #6 of 24
He hasn't said anything we don't already know. It's no secret that many of the Pro apps like FCP aren't GPU enabled. Apple's been hyping GPUs for processing for quite some time now.

Now with the advent of multithreaded OpenGL in Leopard and the gonzo speed of the G80 Nvidia architecture you're going to definitely see GPU processing become a prominent feature in the next release of the Pro apps IMO.

I wouldn't want to edit 2k video now in FCP and 4k ain't happenin. But with this next version of FCS I'm thinking that 4k (Red.com and others) will be a possibility although there are few affordable 4k monitoring solutions.
post #7 of 24
I actually took this as non-news, as wasn't it at least 2 years ago that Apple announced RTExtreme, the FCP render component that offloads to the GPU?
post #8 of 24
Quote:
Originally Posted by ChevalierMalFet View Post

I actually took this as non-news, as wasn't it at least 2 years ago that Apple announced RTExtreme, the FCP render component that offloads to the GPU?

I think that only does it for the preview renders, not the final renders. What I heard in interviews with plug-in developers in the past is that they could not at that time rely on it to make final renders because the color quality wasn't always consistent. I think this has been improved.
post #9 of 24
How does NVidia's CUDA compare to Core Image and Core Video? And how about SLI support for using 2 cards in a Mac?
post #10 of 24
Quote:
Originally Posted by hmurchison View Post

He hasn't said anything we don't already know. It's no secret that many of the Pro apps like FCP aren't GPU enabled. Apple's been hyping GPUs for processing for quite some time now.

Now with the advent of multithreaded OpenGL in Leopard and the gonzo speed of the G80 Nvidia architecture you're going to definitely see GPU processing become a prominent feature in the next release of the Pro apps IMO.

I wouldn't want to edit 2k video now in FCP and 4k ain't happenin. But with this next version of FCS I'm thinking that 4k (Red.com and others) will be a possibility although there are few affordable 4k monitoring solutions.

The nVidia G80 series is quite a big step up from what's gone before
- it not only features many unified shaders, but the whole architecture has been developed with general-purpose computing in mind - GPGPU
- they have developed a whole new interface and mechanism for dealing with this
- in software terms, it sits in parallel with OpenGL (and DirectX on Windows), and provides a really clean & efficient way to access the power of the GPU
- it's called CUDA
- basically, it means that programmers can write in C and move code across to the GPU as they see fit - and CUDA will take care of scheduling, loading etc.

Anyway, you can expect a >10x improvement for algorithms accelerated this way
- not all algorithms will work well on a GPU, but many video/image processing ones will.

So, I think nVidia is a step ahead of ATI on this one (although ATI have something similar, it's not so easy to work with, I think)

So the significance is that, for the first time, we have GPUs that can be harnessed for general-purpose acceleration in a straightforward way
- without having to treat data as textures etc.

If Apple were to standardise on this type of architecture for their professional Macs & MacBooks, all the software vendors are going to want to accelerate their apps this way
- giving a really significant jump in performance.
post #11 of 24
Quote:
Originally Posted by Haggar View Post

How does NVidia's CUDA compare to Core Image and Core Video? And how about SLI support for using 2 cards in a Mac?

Thing about the G80 series and CUDA is that it makes it a lot easier to program the GPU for other, more general computing tasks

- also, the G80 has a lot of features in it to make the GPU more efficient and more useful for general-purpose computing.

I think Core Image/Video could be written to take advantage of these - if it isn't already.

And it should work fine with SLI - although I don't know if nVidia have their drivers sorted yet!

see here if you're interested
http://developer.nvidia.com/object/cuda.html
post #12 of 24
Quote:
Originally Posted by samurai1999 View Post

So, I think nVidia is a step ahead of ATI on this one (although ATI have something similar, it's not so easy to work with, I think)

AMD has the Close to Metal initiative, which from what I've gathered is something awfully like assembly code for GPUs. AMD wins on name though.
post #13 of 24
Quote:
Originally Posted by Zandros View Post

AMD has the Close to Metal initiative, which from what I've gathered is something awfully like assembly code for GPUs. AMD wins on name though.

Yeah, but it sounds a little difficult to work with!
post #14 of 24
Quote:
Originally Posted by Haggar View Post

How does NVidia's CUDA compare to Core Image and Core Video? And how about SLI support for using 2 cards in a Mac?

CUDA seems to be OS- and driver-independent. I wonder if they are licensing it for free.

I'm pretty damn excited for what current-gen and next-gen GPUs will be able to do. Image and video processing takes grunt, and frankly I could not justify buying a $400 graphics card for gaming alone, but if that card is going to work in so many more areas of my computing then I could more easily justify the expenditure.

Here's a solid website for more info.

http://www.gpgpu.org/
post #15 of 24
This is overreaching. The guy only said Apple and Adobe were interested in the technology.
He leaked nothing, and I hope no-one at Apple gives him any grief over this.
post #16 of 24
Quote:
Originally Posted by hmurchison View Post

CUDA seems to be OS- and driver-independent. I wonder if they are licensing it for free.

My reading of the original article was that nVidia are likely to make it available on Mac OS as well
- it would certainly be in Apple's interest for them to do so.
post #17 of 24
Last time a graphics card manufacturer leaked something early Steve was not amused ...
post #18 of 24
This is nice to hear but the prices they'll charge will most likely keep me from buying this.
post #19 of 24
Quote:
Originally Posted by ecking View Post

This is nice to hear but the prices they'll charge will most likely keep me from buying this.

You should bear in mind that *all* the G80 series have this
- so if they put a G80M, G82M or G83M in a MacBookPro, you're going to end up with a pretty fast machine for CS3 or FCP.

So, to me it seems like a good way to differentiate the Pro line.
post #20 of 24
It would be more useful for Apple to use the 8800GTX since it provides the same performance at a much lower price than the Quadro, but of course they won't.

I doubt Apple will use CUDA any time soon if ever; Motion, Aperture, iPhoto and Core * have presumably invested tons of effort into shaders that they wouldn't want to duplicate.
post #21 of 24
Quote:
Originally Posted by wmf View Post

I doubt Apple will use CUDA any time soon if ever; Motion, Aperture, iPhoto and Core * have presumably invested tons of effort into shaders that they wouldn't want to duplicate.

I wouldn't be so sure. The point of the Core frameworks is to provide an abstraction so that developers don't have to worry about the specifics of the hardware implementation. If you update a framework to work on another piece of hardware, every program that uses that framework should be able to take advantage of it without changing the program. Maybe it doesn't work that well in practice, but that's the basic idea.
post #22 of 24
Quote:
Originally Posted by JeffDM View Post

I wouldn't be so sure. The point of the Core frameworks is to provide an abstraction so that developers don't have to worry about the specifics of the hardware implementation. If you update a framework to work on another piece of hardware, every program that uses that framework should be able to take advantage of it without changing the program. Maybe it doesn't work that well in practice, but that's the basic idea.

I would: CUDA is for General Purpose Computing on the GPU.

It does nothing for tasks at which graphics chips already excel, like shaders.
post #23 of 24
Quote:
Originally Posted by gregmightdothat View Post

I would: CUDA is for General Purpose Computing on the GPU.

It does nothing for tasks at which graphics chips already excel, like shaders.

Obviously, it depends on the task, but some tasks will get a 2x performance improvement using CUDA compared to shaders (on top of the 3x improvement going from G7x to G8x)
- some tasks more, some tasks less.

http://forums.nvidia.com/index.php?s...4&#entry161304

Plus, it's easier to write for CUDA than for shaders, and more tasks will benefit, so it's win-win.

post #24 of 24
Quote:
Originally Posted by samurai1999 View Post

Obviously, it depends on the task, but some tasks will get a 2x performance improvement using CUDA compared to shaders (on top of the 3x improvement going from G7x to G8x)
- some tasks more, some tasks less.

http://forums.nvidia.com/index.php?s...4&#entry161304

Plus, it's easier to write for CUDA than for shaders, and more tasks will benefit, so it's win-win.


That's great, but you've entirely missed my point.

We're talking about shaders. If what you're doing is a shader, writing for CUDA is stupid: you already have the shader.

Yes, other tasks do get performance benefits by using CUDA rather than shaders. But shaders don't need that overhead, and will remain shaders.