Video editing on next-gen GPUs

Posted in Future Apple Hardware
Tech Report (http://tech-report.com/etc/2002q3/nextgen-gpus/index.x?pg=1) is discussing next-gen video cards, and this passage from page 5 caught my attention:



"Note that NV30 can, like the R300, apply pixel shader effects to video streams. This capacity changes the image processing and video editing games, because Photoshop-like effects can be applied to incoming video streams in real time."



I didn't know the R300 and NV30 could do that, and now that I do, I wonder just how much better a bus interface the next few PowerMac models will end up with. If the memory is 333 MHz DDR and the data only has to be sent to the GPU, could this mean real-time video editing without touching the CPU (all that much)?
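
To make that concrete: a "Photoshop-like effect" on a video stream is really just a small per-pixel program run over every pixel of each decoded frame. Below is a rough CPU-side sketch in C of a saturation tweak; on an NV30/R300-class card the loop body would live in a fragment/pixel shader and run for all pixels in parallel, with the frame bound as a texture, so the CPU barely has to touch the data. (The code and the names in it are just mine for illustration, not something from the article.)

```c
/*
 * CPU-side reference for the kind of per-pixel "Photoshop" effect the
 * article describes: a saturation adjustment over one decoded video frame.
 * On an NV30/R300-class card the loop body would be a fragment program run
 * once per pixel, in parallel, with the frame bound as a texture; the CPU
 * only uploads the frame and the single saturation parameter.
 */
#include <stddef.h>
#include <stdio.h>

typedef struct { unsigned char r, g, b; } Pixel;

/* saturation = 0.0 -> grayscale, 1.0 -> unchanged, >1.0 -> oversaturated */
static void adjust_saturation(Pixel *frame, size_t width, size_t height,
                              float saturation)
{
    for (size_t i = 0; i < width * height; i++) {
        Pixel *p = &frame[i];

        /* Rec. 601 luma weights: the grayscale value of this pixel. */
        float luma = 0.299f * p->r + 0.587f * p->g + 0.114f * p->b;

        /* Blend each channel between grayscale and the original colour. */
        float r = luma + (p->r - luma) * saturation;
        float g = luma + (p->g - luma) * saturation;
        float b = luma + (p->b - luma) * saturation;

        /* Clamp back into 8-bit range. */
        p->r = (unsigned char)(r < 0 ? 0 : r > 255 ? 255 : r);
        p->g = (unsigned char)(g < 0 ? 0 : g > 255 ? 255 : g);
        p->b = (unsigned char)(b < 0 ? 0 : b > 255 ? 255 : b);
    }
}

int main(void)
{
    /* A stand-in 2x1 "frame"; a real app would decode video into this. */
    Pixel frame[2] = { {200, 40, 40}, {40, 40, 200} };

    adjust_saturation(frame, 2, 1, 0.5f);   /* halve the saturation */

    for (size_t i = 0; i < 2; i++)
        printf("pixel %zu: %u %u %u\n", i, frame[i].r, frame[i].g, frame[i].b);
    return 0;
}
```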



Probably this still wouldn't help much with huge Photoshop files, but when 3D rendering and video editing can be done with a ~$300 3D card, there isn't much need for an uber-2 GHz CPU anyway. And the R300 can work in a symmetric setup of up to 256 chips, which would mean real-time rendering (as pointed out in the article), video editing and any other calculation the GPUs can do are pretty close to coming to the desktop. Now the only thing that could make Apple computers faster than PCs would be a sheetload of bandwidth, as I think CPU speed won't matter that much anymore.



Did anyone actually hear more about those R300 drivers for 3D packages? Could it be that in the future any good multimedia software will be the kind that comes with ATI/nVidia drivers instead of AltiVec optimization?

Comments

  • Reply 1 of 4
    hmurchison Posts: 12,425 member
    That should benefit Apple. I hear that Quartz Extreme is very beneficial to compositing, and these features sound like they would be very complementary.
  • Reply 2 of 4
    macronin Posts: 1,174 member
    All things considered...



    If Quartz Extreme works as advertised (and, by all accounts here & elsewhere, it should), and if the rumors of an Apple/nVidia co-designed AGP board come to fruition, along with the recently rumored IBM CPUs...



    Well, let us just say, the naysayers on the PC/IRIX side of things using Maya will be changing their tune in the near future...!



    (Near future being defined as up to one iCal year)



    I can hardly wait!



    Mmmm... New dual G5 (or whatever they call it) boxen running Maya Unlimited (by the time the second rev of this mythical PowerMac workstation is out, Unlimited will be too) on a brace of HD Cinema Displays...!



    Oh yeah, my mantra...



    Apple/nVidia co-designed video card!

    - dual NV30 GPUs

    - 512MB DDRII RAM

    - dual ADC ports



    Mmmm!!!
  • Reply 3 of 4
    xype Posts: 672 member
    [quote]Originally posted by MacRonin:
    Apple/nVidia co-designed video card![/quote]



    I don't even think this would be all that good - nVidia is not everything, and for my part I would rather buy an ATI card.



    I would much rather see 2-4 different cards available for the Mac platform (say ATI, nVidia, Matrox, 3DLabs), since once we had a standard (OpenGL 2.0?) that hardware companies could target their products at, we'd have healthy competition and many different options to choose from. I can imagine Matrox making a nice video-rendering card, whereas 3DLabs/nVidia/ATI could saturate the gaming, pro 3D and multimedia markets. It wouldn't be clever of Apple to focus on one supplier only. Unless they want to be stuck with 500 MHz GPUs forever...



    But funnily enough, while we were discussing such possibilities here some weeks ago, ATI was already finishing a solution. I really suggest people go to ATI's website and take a look at those R300 videos...
  • Reply 4 of 4
    macronin Posts: 1,174 member
    Whatever works best for the following:



    Mac OS X

    Maya Complete (Unlimited - 'coming soon!')

    Final Cut Pro

    Shake

    Photoshop



    And, for the 'old school goodness'...



    Doom 3... Hey, if I have a kick-arse 3D card in my workstation, I might as well cut loose with a little random violence on occasion...!



    But I am partial to nVidia...