Apple looks to new NVIDIA graphics for Final Cut

While aiming at its usual audience of 3D modelers and other graphics pros with its new Quadro FX video cards, NVIDIA has let slip that Apple is eyeing the technology closely as a key to driving its video editing tools.



The Santa Clara-based GPU maker was eager to promote its just-announced Quadro FX 4600 and 5600 workstation cards early this week, calling them a perfect fit for video editing thanks to a much-improved unified architecture that handles much of the heavy work of effects processing and video decoding.



In doing so, however, the company has inadvertently offered a glimpse of both Apple's hardware and software strategy for its pro clients. NVIDIA Professional Solutions manager Jeff Brown used a conversation with Gizmodo to tout the list of video software developers lining up to investigate the usefulness of his company's breakthrough for editing work, mentioning Apple in connection with the new video chipsets.



"Image processing is the fundamental algorithm set that video editing guys use. And traditionally that has been very CPU-centric," Brown told the website. "And now we're starting to see more and more image processing moving to the GPU. So folks like Adobe, Apple, Avid are excited about this concept. It gives them much, much higher levels of performance."



While Apple's interest in the new workstation cards is already likely to be high -- the company has used the earlier Quadro FX 4500 in the current Mac Pro and last-generation Power Mac G5 -- Brown's statement revealed Cupertino's interest in the new pro-level video hardware and its long-term impact on the company's editing software.



Apple has been a relative pioneer in using the more advanced features of video cards for pro software rendering. Motion, released in 2004, set itself apart from other motion graphics editors (including Adobe's After Effects) by relying on the 3D prowess of newer graphics chips to render effects and live previews that would have been difficult solely on a CPU.



Aperture and iPhoto 6 also use the pixel shaders on newer video hardware for non-destructive still image editing.



However, Apple's centerpiece Final Cut Pro editor has so far depended exclusively on CPUs to draw live footage, rapidly scaling down the quality of live previews as an editing crew shifts to editing HD video or multiple streams. Offloading some or all of this task to a particularly flexible video card like the new Quadro would free Final Cut Pro to render multiple HD streams in real-time without sacrificing accuracy -- especially for the fledgling 2K and 4K video resolutions that frequently demand specialized hardware.



With Apple confirming a special event of its own at the NAB video expo just weeks after the NVIDIA announcement, Brown's observations about the Mac maker's interest will likely be put to the test soon.

Comments

  • Reply 1 of 23
    Quote:
    Originally Posted by AppleInsider View Post


    While aiming at its usual audience of 3D modelers and other graphics pros with its new Quadro FX video cards, NVIDIA has let slip that Apple is eyeing the technology closely as a key to driving its video editing tools.



    The Santa Clara-based GPU maker was eager to promote its just-announced Quadro FX 4600 and 5600 workstation cards early this week, calling them a perfect fit for video editing thanks to a much-improved unified architecture that handles much of the heavy work of effects processing and video decoding.




    Makes a lot of sense

    - the nVidia CUDA architecture looks great for high-end processing

    - and if it's available on Macs, then it's going to make these really attractive for video editing/processing.





    Perhaps the next line of MacBook Pros will also include nVidia GPUs to allow CUDA processing on these as well.

  • Reply 2 of 23
    jeffdm Posts: 12,951 member
    The comments look somewhat speculative to me, not necessarily a leak of information. If it was an unauthorized leak, then Apple would be willing to shoot themselves in the foot to penalize those that leak things like this.
  • Reply 3 of 23
    Quote:
    Originally Posted by JeffDM View Post


    The comments look somewhat speculative to me, not necessarily a leak of information. If it was an unauthorized leak, then Apple would be willing to shoot themselves in the foot to penalize those that leak things like this.



    I can see something like this coming: Intel / Apple / NVIDIA vs. AMD / ATI / MS
  • Reply 4 of 23
    Quote:
    Originally Posted by JeffDM View Post


    The comments look somewhat speculative to me, not necessarily a leak of information.



    I thought that was the whole point of this site

    - lots of somewhat unfounded speculation!

  • Reply 5 of 23
    hmurchison Posts: 12,425 member
    He hasn't said anything we don't already know. It's no secret that many of the Pro apps like FCP aren't GPU enabled. Apple's been hyping GPUs for processing for quite some time now.



    Now with the advent of multithreaded OpenGL in Leopard and the gonzo speed of the G80 Nvidia architecture you're going to definitely see GPU processing become a prominent feature in the next release of the Pro apps IMO.



    I wouldn't want to edit 2k video now in FCP and 4k ain't happenin. But with this next version of FCS I'm thinking that 4k (Red.com and others) will be a possibility although there are few affordable 4k monitoring solutions.
  • Reply 6 of 23
    I actually took this as non-news, as wasn't it at least 2 years ago that Apple announced RTExtreme, the FCP render component that offloads to the GPU?
  • Reply 7 of 23
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by ChevalierMalFet View Post


    I actually took this as non-news, as wasn't it at least 2 years ago that Apple announced RTExtreme, the FCP render component that offloads to the GPU?



    I think that only does it for the preview renders, not the final renders. What I heard in interviews with plug-in developers in the past is that they could not at that time rely on it to make final renders because the color quality wasn't always consistent. I think this has been improved.
  • Reply 8 of 23
    haggar Posts: 1,568 member
    How does NVidia's CUDA compare to Core Image and Core Video? And how about SLI support for using 2 cards in a Mac?
  • Reply 9 of 23
    Quote:
    Originally Posted by hmurchison View Post


    He hasn't said anything we don't already know. It's no secret that many of the Pro apps like FCP aren't GPU enabled. Apple's been hyping GPUs for processing for quite some time now.



    Now with the advent of multithreaded OpenGL in Leopard and the gonzo speed of the G80 Nvidia architecture you're going to definitely see GPU processing become a prominent feature in the next release of the Pro apps IMO.



    I wouldn't want to edit 2k video now in FCP and 4k ain't happenin. But with this next version of FCS I'm thinking that 4k (Red.com and others) will be a possibility although there are few affordable 4k monitoring solutions.



    The nVidia G80 series is quite a big step up from what's gone before

    - it not only features many unified shaders, but the whole architecture has been developed with general-purpose computing in mind - GPGPU

    - they have developed a whole new interface and mechanism for dealing with this

    - in software terms, it sits in parallel with OpenGL (and DirectX on Windows), and provides a really clean & efficient way to access the power of the GPU

    - it's called CUDA

    - basically, it means that programmers can write in ordinary C and move code across to the GPU as they feel necessary - and CUDA will take care of scheduling, loading, etc.
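
    To make that concrete, here's a minimal sketch of what a CUDA kernel looks like - a made-up per-pixel brighten pass of the kind a video app might offload. The file and function names are hypothetical; only the CUDA runtime calls and launch syntax are the real API:

    // brighten.cu - hypothetical per-pixel image operation written in CUDA C
    // build with: nvcc brighten.cu -o brighten
    #include <cuda_runtime.h>
    #include <stdio.h>
    #include <stdlib.h>

    // Each GPU thread brightens one pixel; CUDA schedules the threads across the chip.
    __global__ void brighten(unsigned char *pixels, int count, int amount)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < count) {
            int v = pixels[i] + amount;
            pixels[i] = (unsigned char)(v > 255 ? 255 : v);
        }
    }

    int main(void)
    {
        const int count = 1920 * 1080;              // one 8-bit greyscale HD frame
        size_t size = count * sizeof(unsigned char);

        unsigned char *host = (unsigned char *)malloc(size);
        for (int i = 0; i < count; i++)
            host[i] = (unsigned char)(i % 256);     // dummy frame data

        unsigned char *dev;
        cudaMalloc((void **)&dev, size);
        cudaMemcpy(dev, host, size, cudaMemcpyHostToDevice);

        // launch enough 256-thread blocks to cover every pixel
        brighten<<<(count + 255) / 256, 256>>>(dev, count, 40);

        cudaMemcpy(host, dev, size, cudaMemcpyDeviceToHost);
        printf("first pixel after brighten: %d\n", host[0]);

        cudaFree(dev);
        free(host);
        return 0;
    }

    The kernel body is just C with a launch syntax bolted on, which is what makes it so much more approachable than wrangling data as textures in a shader.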



    Anyway, you can expect a >10x improvement for algorithms accelerated using this

    - not all algorithms will work well on a GPU, but many video/image processing ones will.



    So, I think nVidia is a step ahead of ATI on this one (although ATI have something similar, it's not so easy to work with, I think)



    So the significance is that, for the first time, we have GPUs that can be harnessed for general-purpose acceleration in a straightforward way

    - without having to treat data as textures etc.



    If Apple were to standardise on this type of architecture for their professional Mac & MacBooks, all the software vendors are going to want to accelerate their apps in this way

    - giving a really significant jump in performance.
  • Reply 10 of 23
    Quote:
    Originally Posted by Haggar View Post


    How does NVidia's CUDA compare to Core Image and Core Video? And how about SLI support for using 2 cards in a Mac?



    The thing about the G80 series and CUDA is that it makes it a lot easier to program the GPU for other, more general computing tasks



    - also the G80 has a lot of features in it to make the GPU more efficient and more useful for general purpose computing.



    I think Core Image/Video could be written to take advantage of these - if they aren't already.



    And it should work fine with SLI - although I don't know if nVidia have their drivers sorted yet!



    see here if you're interested

    http://developer.nvidia.com/object/cuda.html
  • Reply 11 of 23
    zandros Posts: 537 member
    Quote:
    Originally Posted by samurai1999 View Post


    So, I think nVidia is a step ahead of ATI on this one (although ATI have something similar, it's not so easy to work with, I think)



    AMD has the Close to Metal initiative, which from what I've gathered is something awfully like assembly code for GPUs. AMD wins on name though.
  • Reply 12 of 23
    Quote:
    Originally Posted by Zandros View Post


    AMD has the Close to Metal initiative, which from what I've gathered is something awfully like assembly code for GPUs. AMD wins on name though.



    Yeah, but it sounds a little difficult to work with!

  • Reply 13 of 23
    hmurchison Posts: 12,425 member
    Quote:
    Originally Posted by Haggar View Post


    How does NVidia's CUDA compare to Core Image and Core Video? And how about SLI support for using 2 cards in a Mac?



    CUDA seems to be OS and driver independent. I wonder if they are licensing it for free.



    I'm pretty damn excited for what current-gen and next-gen GPUs will be able to do. Image and video processing takes grunt, and frankly I could not justify buying a $400 graphics card for gaming alone, but if that card is going to work in so many more areas of my computing then I could more easily justify the expenditure.



    Here's a solid website for more info.



    http://www.gpgpu.org/
  • Reply 14 of 23
    frank777 Posts: 5,839 member
    This is overreaching. The guy only said Apple and Adobe were interested in the technology.

    He leaked nothing, and I hope no-one at Apple gives him any grief over this.
  • Reply 15 of 23
    Quote:
    Originally Posted by hmurchison View Post


    CUDA seems to be OS and driver independent. I wonder if they are licensing it for free.




    My reading of the original article was that nVidia are also likely to make it available on MacOS

    - it would certainly be in Apple's interest for them to do so.
  • Reply 16 of 23
    MacPro Posts: 19,728 member
    Last time a graphics card manufacturer leaked something early, Steve was not amused ...
  • Reply 17 of 23
    ecking Posts: 1,588 member
    This is nice to hear but the prices they'll charge will most likely keep me from buying this.
  • Reply 18 of 23
    Quote:
    Originally Posted by ecking View Post


    This is nice to hear but the prices they'll charge will most likely keep me from buying this.



    You should bear in mind that *all* the G80 series have this

    - so if they put a G80M, G82M or G83M in a MacBook Pro, you're going to end up with a pretty fast machine for CS3 or FCP.



    So, to me it seems like a good way to differentiate the Pro line.
  • Reply 19 of 23
    wmf Posts: 1,164 member
    It would be more useful for Apple to use the 8800GTX since it provides the same performance at a much lower price than the Quadro, but of course they won't.



    I doubt Apple will use CUDA any time soon if ever; Motion, Aperture, iPhoto and Core * have presumably invested tons of effort into shaders that they wouldn't want to duplicate.
  • Reply 20 of 23
    jeffdm Posts: 12,951 member
    Quote:
    Originally Posted by wmf View Post


    I doubt Apple will use CUDA any time soon if ever; Motion, Aperture, iPhoto and Core * have presumably invested tons of effort into shaders that they wouldn't want to duplicate.



    I wouldn't be so sure. The point of the Core frameworks is to provide an abstraction so that developers don't have to worry about the specifics of the hardware implementation. If you update a framework to work on another piece of hardware, every program that uses that framework should be able to take advantage of it without being changed. Maybe it doesn't work that well in practice, but that's the basic idea.
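
    As a toy illustration of that kind of abstraction (purely a hypothetical sketch, not Apple's Core Image code - the function names and fallback logic here are made up), a framework can expose a single call and decide internally whether the GPU or the CPU does the work, so applications built on it never have to change:

    // scale.cu - toy sketch of a hardware-abstracting call (hypothetical, not Apple code)
    #include <cuda_runtime.h>
    #include <stdio.h>

    __global__ void scale_kernel(float *data, int n, float factor)
    {
        int i = blockIdx.x * blockDim.x + threadIdx.x;
        if (i < n)
            data[i] *= factor;
    }

    // The only call applications see; they never know which path actually ran.
    void scale_buffer(float *data, int n, float factor)
    {
        int devices = 0;
        if (cudaGetDeviceCount(&devices) == cudaSuccess && devices > 0) {
            float *d;
            cudaMalloc((void **)&d, n * sizeof(float));
            cudaMemcpy(d, data, n * sizeof(float), cudaMemcpyHostToDevice);
            scale_kernel<<<(n + 255) / 256, 256>>>(d, n, factor);
            cudaMemcpy(data, d, n * sizeof(float), cudaMemcpyDeviceToHost);
            cudaFree(d);
        } else {
            for (int i = 0; i < n; i++)    // CPU fallback: same result, no GPU needed
                data[i] *= factor;
        }
    }

    int main(void)
    {
        float samples[4] = {0.25f, 0.5f, 0.75f, 1.0f};
        scale_buffer(samples, 4, 2.0f);
        printf("%g %g %g %g\n", samples[0], samples[1], samples[2], samples[3]);
        return 0;
    }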