The Ultimate Compositing Station

Posted in Future Apple Hardware
This is part question, part speculation, and part wish list, and it probably fits in Future Hardware as well as anywhere else.





First the Questions:

I remember reading in these AI forums, before the last New York Macworld, about Quartz Extreme. Somebody claimed to have been at a developer conference and seen an Apple demo of FCP with 50+ layers being rendered in real time. There was some talk of Quartz Extreme and how you could make every layer a texture on a 3D surface that is rendered by the graphics chip on the video card. (As I understand it, that is how the Finder's windows are rendered in 10.2.)

So why haven't we seen this?



It sounds absolutely incredible to anyone who has done some serious editing with FCP. Clips can take forever (days) to render.



Are we only waiting for Final Cut Pro 4.0, and could this work with all of our current/new graphics cards?



Would it be necessary to buy a special FCP 4.0 PCI card from Apple (perhaps built by the Raycer people) to get this sort of functionality?



Has anyone heard any rumors at all in this direction? Are there any obvious technical difficulties that stand in the way?





Since I am an independent filmmaker, I think about this a lot.

If Apple really wants to create a desktop video revolution, these are my suggestions:





Video Camera:

Together with one big video camera manufacturer, produce a MiniDV camera to compete with the likes of the Sony PD150, Panasonic DVX-100, etc.

Feature list:

All features found in the DVX-100, plus:



1. Use higher-resolution CCDs so that you can capture HD. This has already been done: http://pro.jvc.com/prof/Attributes/features.jsp?tree=&searchModel=&model_id=MDL101394



2. The camera should be able to capture at different frame rates, so that one can create real slow motion by filming at 50 fps and playing it back at 25 fps (half speed). This has also been done with a digital camera, but only in HD so far.





3. (A little more out there.) Have a depth-sensing recording mechanism whose output is recorded as a second alpha channel (a matte). This is possible, and a functional product already exists: http://www.panasonic.com/PBDS/subcat/Products/cams_ccorders/f_aj-hdc27v.html. But it's ****ing expensive to buy, much more than the parts are worth; you're paying for the technology. (A sketch of how such a matte would be used follows below.)



Of course, a suitable codec will have to be chosen, or modified, for the various functions. And this will also require tight integration with FCP. But since Apple is the only company that controls the whole stack, it could actually pull this off. The technology is there; you just need somebody to integrate it into something useful and easy to use, and that is what Apple does best. I can only hope.
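To make the matte idea concrete, here is a minimal sketch in plain C of the standard "over" composite, using the camera's depth matte as the alpha mask. The buffer layout, the linear matte-to-alpha mapping, and the function name are illustrative assumptions, not anything Apple or Panasonic actually ships:

[code]
#include <stdint.h>
#include <stddef.h>

/* Composite a foreground frame over a background using a per-pixel
 * matte recorded by the camera as a second alpha channel.
 * Standard "over" operator: out = fg * a + bg * (1 - a).
 * Interleaved 8-bit RGB buffers are assumed for illustration. */
void composite_over(const uint8_t *fg, const uint8_t *bg,
                    const uint8_t *matte, uint8_t *out,
                    size_t width, size_t height)
{
    for (size_t i = 0; i < width * height; i++) {
        unsigned a = matte[i];           /* 0 = far/background, 255 = near/subject */
        for (int c = 0; c < 3; c++) {    /* R, G, B components */
            size_t p = i * 3 + c;
            out[p] = (uint8_t)((fg[p] * a + bg[p] * (255 - a)) / 255);
        }
    }
}
[/code]

With a real depth matte, you could pull a subject off its background with no blue screen at all, which is the whole appeal.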





Computer:



Of course, 50+ layers of real-time editing would be a dream for any FCP or Shake user. But even if that becomes possible, you still have to render the separate layers with all your filters, and the amount of time that takes can still be huge.

As mentioned in other threads, it would be fantastic if, using FireWire 800, the new IP-over-FireWire, and Rendezvous, one could just plug one Mac into another and the processing power would be distributed to the tasks at hand. The administration software should be so simple that setting up a render farm really is plug it in and turn it on. The only piece I can't point you at is the clustering software. But if it only worked for FCP and Shake, that would be a start, and since you can send a separate frame to each computer (a sketch of the idea is below), the software should not be hard to write. I can understand Apple wanting a proper implementation of clustering, and perhaps it is waiting for a kernel update(?) to make this possible at all, and hopefully efficient.
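To show how simple the scheduling half could be, here is a toy sketch assuming a hypothetical send_frame_to_node() transport; discovery over Rendezvous and the actual IP-over-FireWire plumbing are hand-waved:

[code]
#include <stdio.h>

#define NUM_NODES  4      /* Macs discovered on the network (assumed) */
#define NUM_FRAMES 240    /* frames in the render job */

/* Hypothetical transport: hand frame 'f' to render node 'n'.
 * A real implementation would ship the frame and its filter
 * settings over IP-over-FireWire and collect the result. */
static void send_frame_to_node(int f, int n)
{
    printf("frame %3d -> node %d\n", f, n);
}

int main(void)
{
    /* Round-robin dispatch: frames are independent, so each
     * node can render its share in parallel. */
    for (int f = 0; f < NUM_FRAMES; f++)
        send_frame_to_node(f, f % NUM_NODES);
    return 0;
}
[/code]

The point is that per-frame rendering is embarrassingly parallel; the hard part Apple would have to solve is the transport and node discovery, not the scheduling.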






Comments

  • Reply 1 of 4
    Interesting.



    I can only hope that Apple is able to pull this off. Shake alone is going to be tough to update, and along with FCP 4 and the other "Pro" Apple apps, let's hope their programmers have some surprises.



    I still find it odd that they purchased both Shake and Rayz.
  • Reply 2 of 4
    [quote]Originally posted by sumoftheparts:

    First the Questions:

    I remember reading in these AI forums, before the last New York Macworld, about Quartz Extreme. Somebody claimed to have been at a developer conference and seen an Apple demo of FCP with 50+ layers being rendered in real time. There was some talk of Quartz Extreme and how you could make every layer a texture on a 3D surface that is rendered by the graphics chip on the video card. (As I understand it, that is how the Finder's windows are rendered in 10.2.)

    So why haven't we seen this?[/quote]



    I mentioned film compositing and 40 layers of terminal windows in this thread: http://forums.appleinsider.com/cgi-bin/ultimatebb.cgi?ubb=get_topic&f=5&t=000692&p=9 — is that what you're referring to?



    I'm sure Apple will eventually implement OpenGL-based compositing (a la Quartz Extreme) in Final Cut Pro, even if it's only for RT preview. In my experience writing applications that use QE-style compositing for video, the biggest bottleneck has been decompressing DV frames. My graphics card (GeForce 3) chugs along merrily with whatever I throw at it, but my test machines (G4/450DP & G4/733) can't decompress more than two streams of DV video at a time. But if we use uncompressed streams, then we're talking.
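    For the curious, the per-frame core of this style of compositing is short; a rough fixed-function OpenGL sketch follows. The decode_frame() call is a stand-in for the QuickTime decompression step (the bottleneck I mentioned), and the textures are assumed to have been allocated earlier with glTexImage2D:

    [code]
    #include <OpenGL/gl.h>   /* Mac OS X OpenGL headers */

    /* Stand-in for the expensive part: decompressing one DV frame
     * into an RGBA buffer. In real code this is QuickTime territory. */
    extern void decode_frame(int layer, unsigned char *rgba, int w, int h);

    void draw_layers(GLuint *tex, int nlayers,
                     unsigned char *scratch, int w, int h)
    {
        glEnable(GL_TEXTURE_2D);
        glEnable(GL_BLEND);
        glBlendFunc(GL_SRC_ALPHA, GL_ONE_MINUS_SRC_ALPHA);

        for (int i = 0; i < nlayers; i++) {
            decode_frame(i, scratch, w, h);          /* CPU bottleneck */
            glBindTexture(GL_TEXTURE_2D, tex[i]);
            glTexSubImage2D(GL_TEXTURE_2D, 0, 0, 0, w, h,
                            GL_RGBA, GL_UNSIGNED_BYTE, scratch);

            glBegin(GL_QUADS);                       /* one blended quad per layer */
            glTexCoord2f(0, 0); glVertex2f(-1, -1);
            glTexCoord2f(1, 0); glVertex2f( 1, -1);
            glTexCoord2f(1, 1); glVertex2f( 1,  1);
            glTexCoord2f(0, 1); glVertex2f(-1,  1);
            glEnd();
        }
    }
    [/code]

    Once the frames are on the card, the blending itself is essentially free; that's why the CPU-side decompression dominates.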



    QuickTime actually isn't designed especially well for this style of compositing. You need some annoying hacks in QuickTime to get maximum efficiency when decompressing and uploading frames as textures to the graphics card. I'm sure once we get QuickTime up to speed, and Apple continues to improve ReadPixels performance in OpenGL, we'll see more QE-style compositing and rendering in video applications.
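    (For reference, the ReadPixels step is a single call; the performance question is how quickly the driver can move that buffer back across AGP. A minimal sketch, assuming the composited frame is already in the framebuffer:)

    [code]
    #include <OpenGL/gl.h>

    /* Pull the composited frame back off the card, e.g. to hand it
     * to an export path. 'pixels' must hold w * h * 4 bytes. */
    void read_back_frame(int w, int h, unsigned char *pixels)
    {
        glReadPixels(0, 0, w, h, GL_RGBA, GL_UNSIGNED_BYTE, pixels);
    }
    [/code]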



    [ 02-23-2003: Message edited by: King Chung Huang ]
  • Reply 3 of 4
    Hardware improvements will make a big difference here too: AGP 8x, the ATI 9700 & NV30, and the PPC 970.



    Also of interest is the idea of using the 3D engine's vertex and pixel shaders to do much of your filtering. I'm not sure, but I think the PixelShox guys might already be doing some of this.
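    To give a flavor of it: a filter like brightness/contrast is just per-pixel arithmetic, which is exactly the kind of math a pixel shader evaluates per fragment on the card. A CPU sketch of the same math (the function name and parameter ranges are illustrative):

    [code]
    #include <stdint.h>
    #include <stddef.h>

    /* Brightness/contrast as per-pixel arithmetic -- the same math a
     * pixel shader would run per fragment on the graphics card.
     * contrast 1.0 = unchanged; brightness in [-255, 255]. */
    void brightness_contrast(uint8_t *rgb, size_t npixels,
                             float contrast, float brightness)
    {
        for (size_t i = 0; i < npixels * 3; i++) {
            float v = (rgb[i] - 128.0f) * contrast + 128.0f + brightness;
            if (v < 0.0f)   v = 0.0f;    /* clamp to valid 8-bit range */
            if (v > 255.0f) v = 255.0f;
            rgb[i] = (uint8_t)v;
        }
    }
    [/code]

    Run on the GPU, that loop disappears into the rasterizer, which is why shader-based filtering is such an appealing idea for RT preview.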
  • Reply 4 of 4
    The 9700 is (supposedly) available as a BTO option on Apple's site [Laughing] (yeah, right - it has had a "coming soon" status for 1 or 2 months now). And anyway, the 9700 would be a helluva lot better with AGP 8X... but if we got AGP 8X on the Mac, we might as well go buy a FireGL instead of a 9700...



    [ 02-23-2003: Message edited by: os10geek ]