Raycer graphics coming soon, but not in the form we were hoping.


Comments

  • Reply 21 of 48
    harald Posts: 2,152 member
    IIRC, Raycer's technology was a very specific set of algorithmic goodness that worked out what you couldn't see in 3D models and let associated hardware get on with the job of accelerating the bits you *could* see.



    1) There's a world of difference between this and a general purpose DSP chip you could buy tomorrow from Texas Instruments that would speed up graphics and audio.



    2) Although the idea makes a lot of sense, there's also a lot of difference between real-time video display and the fairly specific usage attributed to the hardware above.



    I like the suggestion of a general purpose DSP chip that could lend a hand with audio and video (Core Audio's bottleneck is DSP power, my god, and the whole idea is a good way to route round slow PPC chips).



    SO: this sounds like it might be true but have bugger all to do with Raycer (which was a tech bubble glory days recruitment drive that we now search desperately to justify in our own minds ... )



    [ 04-11-2002: Message edited by: Harald ]
  • Reply 22 of 48
    splinemodel Posts: 7,311 member
    If anyone remembers SGIs from the mid-nineties, most of their advantage was due to special AV ASICs. In terms of basic, raw numbers, they didn't do any better than PPCs, but for video processing, they were great. It also helped for 3D.



    If Apple has a low-cost plan to put AV ASICs onto the motherboards of the future, that would be great. I wouldn't need to deal with expensive capture/export cards with poor support. I could simply play the movie full screen and record off the built-in S-Video from my PB G4. Very handy for simple video work. No notebook I know of can do good video work. Keep in mind Apple's original marketing point for the PB G4.
  • Reply 23 of 48
    razzfazz Posts: 728 member
    [quote]Originally posted by CCR65:

    The Amiga used Moto "040" processors for years and did some incredible things due to the distributed design.

    A chip like this would make more of a difference than another CPU.[/quote]



    This worked quite well back then, but given the incredibly short development cycles and raw computational power of current CPUs, I doubt this would still be feasible today.

    The additional DSPs would probably become outdated with every new CPU generation.



    Bye,

    RazzFazz
  • Reply 24 of 48
    Sorry, but I don't buy it. Unless it's very, very fast and has the same APIs as AltiVec (AV), I don't see this as a long-term solution. One reason the RTMax card from Promax (not the Matrox RTMac card) was dropped was the power of AV and software rendering (the faster the chip, the faster the render). I think Apple will continue to improve AltiVec usage and multi-processing abilities (how about 4 PPCs?) in the OS and apps (plus better motherboards with higher memory bandwidth). These make more sense because they can be leveraged across all G4 (G5?) and higher boxes. Specialized A/V hardware has been tried in the past and always dropped. I don't think Apple will try that again.



    That's my $.02
  • Reply 25 of 48
    I'm doubtful, but there are a few things validating the rumor as well:



    For one, the postponement of new mobos is very odd, since Apple is so far behind with memory speed and so forth. But it would make sense if they were waiting for this Raycer technology.



    Another benefit is that this technology could be highly customizable, not limited to MPEG2 decoding. If developers can leverage it for a wide range of functions, like audio DSP plug-ins and different video effects, all in real time, then it would be a killer addition to Apple's hardware.



    Another benefit is that it cannot be stolen by Micro$oft. I'm sure that by now Apple is sick of bustin' their guts over some awesome iApplication, only to see M$ rip it off a few months later. But with this Raycer technology, if it's as powerful as I suspect, then Apple could implement many real-time video and audio effects based on it, and these effects would be impossible for M$ to copy well. And if video and audio developers could leverage the Raycer chip for their own use, it would benefit Apple even more, because Windows users would suddenly see warning messages on their apps like, "Realtime effects only supported in Mac OS X"!



    I'm skeptical, but if implemented properly, this Raycer technology could be Apple's silver bullet for assimilating audio and video editing markets.
  • Reply 26 of 48
    [quote]Originally posted by Junkyard Dawg:

    [/quote]



    Hoo-raw!
  • Reply 27 of 48
    kd5mdk Posts: 81 member
    [quote]so why would Apple add another set of decode functionality[/quote]

    He also mentioned encode, which is rather different, and from my point of view, more useful.



    I wouldn't be surprised if the deal really was only done to grab some good EEs. If Raycer had some talent on board, those people could probably be applied to all sorts of teams Apple already had.
  • Reply 28 of 48
    programmer Posts: 3,458 member
    Apple has done innovative hardware before, so this might be something unexpected but I'm pretty sure it won't be what was described in the first message of this thread...



    1) The nVidia & ATI chipsets already do MPEG2 decoding.

    2) The G4's AltiVec unit is memory bound, so adding more processing power is just going to make it more memory bound (unless you add DDR that the processor can't take advantage of, which is just totally lame!).

    3) MPEG2 chips are nothing new so it would hardly be a real innovation and wouldn't give Apple much of an advantage.

    4) DSP chips were tried in the 660av/840av back in the '92 timeframe. Nobody wrote code for them, and what code Apple provided for them was unstable. It's hard enough to get developers to write AltiVec code, which is a lot easier than writing code for a stand-alone processor that is on a very small percentage of the machines.

    5) Raycer Graphics was a 3D graphics company and had nothing to do with video. It's possible that Apple wants 3D graphics on the motherboard to do offscreen rendering, but this seems like quite an odd and specialized thing to do -- and it just can't compete with the powerhouse hardware that nVidia is putting out. If they do go this way then it seems like a serious step backwards, and a waste of internal resources at Apple. Believe me, Apple cannot compete at 3D hardware design with either nVidia or ATI -- not even close.

    6) Adding more powerful audio to the motherboard chipset would be a nice move, especially if it supports hardware encoding/decoding of MPEG and DTS digital streams & multi-channel output. The Raycer chip design talent could be applied to this... maybe.
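    A rough back-of-the-envelope on point 2, by the way, with round numbers: a 1 GHz G4 can in principle stream 16 bytes through the vector unit every cycle, which works out to something like 16 GB/s of operand traffic, while a PC133 memory bus tops out around 133 MHz x 8 bytes, roughly 1 GB/s. For streaming workloads the vector unit already spends most of its time waiting on memory, so bolting extra compute onto the same bus wouldn't buy much.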
  • Reply 29 of 48
    [quote]4) DSP chips were tried in the 660av/840av back in the '92 timeframe. Nobody wrote code for them, and what code Apple provided for them was unstable. It's hard enough to get developers to write AltiVec code, which is a lot easier than writing code for a stand-alone processor that is on a very small percentage of the machines.

    5) Raycer Graphics was a 3D graphics company and had nothing to do with video. It's possible that Apple wants 3D graphics on the motherboard to do offscreen rendering, but this seems like quite an odd and specialized thing to do -- and it just can't compete with the powerhouse hardware that nVidia is putting out. If they do go this way then it seems like a serious step backwards, and a waste of internal resources at Apple. Believe me, Apple cannot compete at 3D hardware design with either nVidia or ATI -- not even close.[/quote]





    DOEH!



    There goes my theory.



    Actually I'd rather see a beastly G5 clocked to 2.5 GHz, and SOON!
  • Reply 30 of 48
    [quote]1) The nVidia & ATI chipsets already do MPEG2 decoding.[/quote]



    yeah but it's all about encoding...look, based on my experience with video and 3d apps, there would be tremendous relief if apple could make things faster. examples:



    - rendering cinema4d scenes takes forever...

    - rendering finalcut movies takes forever...

    - compressing quicktime movies takes forever...

    - encoding dvd's takes a long time...



    my point is, make these things happen instantly. adding altivec was a good choice but is bound to the processor. imagine a unit that does nothing but specialized rendering/encoding etc. wicked fast. no special programming necessary except for apple itself (modifying the os).

    don't tell me it's not possible because i know it is
  • Reply 31 of 48
    razzfazz Posts: 728 member
    [quote]Originally posted by Strangelove:

    imagine a unit that does nothing but specialized rendering/encoding etc. wicked fast. no special programming necessary except for apple itself (modifying the os).[/quote]



    The point is that all the stuff you mentioned isn't handled by the OS at all. To use such a coprocessor, Final Cut Pro / Cinema4D / QuickTime / whatever would have to be modified, not OS X. And at least for the third-party stuff, this seems kinda unlikely.



    Bye,

    RazzFazz
  • Reply 32 of 48
    blackcat Posts: 697 member
    [quote]Originally posted by RazzFazz:

    The point is that all the stuff you mentioned isn't handled by the OS at all. To use such a coprocessor, Final Cut Pro / Cinema4D / QuickTime / whatever would have to be modified, not OS X. And at least for the third-party stuff, this seems kinda unlikely.

    Bye,

    RazzFazz[/quote]



    But Final Cut Pro / Cinema4D / QuickTime / whatever call OS routines to do this stuff. The OS just needs an update which looks for calls to XYZFunction and redirects them to XYZRaycerFunction. That's how the Amiga did it back in 1985.



    3rd-party stuff might need updating, but Apple stuff should use whatever gizmo a machine's OS knows about.



    I'm over-simplifying a lot, but you get the idea; see the rough sketch below.
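    Something like this, in totally made-up C (hypothetical names, not any real Mac OS X or Amiga call -- it's just the function-pointer trick):

    [code]
    #include <stdio.h>

    /* Hypothetical sketch: the OS exposes graphics work through a table of
     * function pointers. Apps only ever call through the table, so the OS
     * can patch in an accelerated version at boot without the apps ever
     * being recompiled. */

    typedef void (*BlitFunc)(const void *src, void *dst, int bytes);

    static void BlitSoftware(const void *src, void *dst, int bytes)
    {
        printf("blitting %d bytes on the CPU\n", bytes);
    }

    static void BlitRaycer(const void *src, void *dst, int bytes)
    {
        printf("blitting %d bytes on the (imaginary) Raycer chip\n", bytes);
    }

    /* The "OS" dispatch table -- starts out pointing at the software path. */
    static BlitFunc gBlit = BlitSoftware;

    /* At boot the OS probes the hardware and, if the chip is there,
     * redirects the entry -- the XYZFunction -> XYZRaycerFunction idea. */
    static void InitGraphicsDispatch(int raycerPresent)
    {
        if (raycerPresent)
            gBlit = BlitRaycer;
    }

    int main(void)
    {
        InitGraphicsDispatch(1);   /* pretend the chip was found */
        gBlit(NULL, NULL, 4096);   /* the calling code is identical either way */
        return 0;
    }
    [/code]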
  • Reply 33 of 48
    razzfazz Posts: 728 member
    [quote]Originally posted by Blackcat:

    But Final Cut Pro / Cinema4D / QuickTime / whatever call OS routines to do this stuff.[/quote]



    Nope.

    (Besides, did you really think Final Cut Pro is just a nice user front-end that calls some magic "RenderVideo()" function in some OS X framework and doesn't do anything itself?)





    [quote]The OS just needs an update which looks for calls to XYZFunction and redirects them to XYZRaycerFunction. That's how the Amiga did it back in 1985.[/quote]



    Maybe that's how it was with the Amiga, but there just is no "CreateMPEG2File()" function or anything similar inside the Mac OS X frameworks.



    Apple can modify system calls to use that hardware, but this only helps where stuff is handled through system calls in the first place. Especially in computationally intensive apps, that usually isn't much.



    Bye,

    RazzFazz
  • Reply 34 of 48
    blackcat Posts: 697 member
    [quote]Originally posted by RazzFazz:

    Maybe that's how it was with the Amiga, but there just is no "CreateMPEG2File()" function or anything similar inside the Mac OS X frameworks.

    RazzFazz[/quote]



    No, of course not, but FCP will be using groups of graphics routines from the OS and video routines from QT; developers are not hand-coding this stuff each time, they use building blocks. FCP needs QuickTime to work because it uses the QT video-processing functions.



    If the OS knows it needs to farm out graphics related stuff to Raycer, everything which uses that stuff will benefit.
  • Reply 35 of 48
    matsu Posts: 6,558 member
    Reiterating that I know nothing 'cept for what I read, I'd like to offer a few market-feasibility arguments.



    What I'm getting at is that Apple certainly could give you a specialized co-processor built into the mobo I/O. But upon further inspection, the best-case scenario is that it does the job better than 3rd-party video card solutions for about one product generation, and then you're stuck with a specialized chip that needs specialized coding and doesn't take advantage of the video APIs built into standard solutions. Ultimately the vid card guys do it better-faster-cheaper, so unless they can produce a quantum leap in performance, i.e. my 30-minute high-res QT video encodes in 10-15 seconds, it isn't really worth the cost and complexity.



    OTOH, did Raycer get bought for technology or talent? And could the talent be working on something else? Apple needs to modernize the audio subsystem on Macs. OS X was the first step; multi-channel (high-res) formats are now here to stay, especially in the film market (which Apple covets). Audio standards tend to have a much longer shelf life than video/3D render technologies. A good DSP to take care of all your audio needs could live a long and happy life on Mac motherboards for a few generations.



    I'm going to bet on better audio for all macs.



    There may be bright spots on the horizon for video too. The graphics guys don't show any sign of slowing down. AltiVec is good for render/encoding duties. I once read of the possibilities for Apple and X graphics cards to speed up 3D rendering in a kind of set-'em-up, knock-'em-down approach, where AltiVec does all of the set-up (that 3D cards can't do) so that it can supply a workload tailored to the functions 3D cards can perform VERY quickly (not just frame rates, fill rate, and geometry). Is OpenGL 2 ever going to get here? Writing some kind of super-tight OS-API routines might allow Apple to put the considerable power of modern graphics cards to use for something better than just Quake? Graphics guys might be good to have around for such a mission?
  • Reply 36 of 48
    razzfazz Posts: 728 member
    [quote]Originally posted by Blackcat:

    No, of course not, but FCP will be using groups of graphics routines from the OS and video routines from QT; developers are not hand-coding this stuff each time, they use building blocks. FCP needs QuickTime to work because it uses the QT video-processing functions.[/quote]



    Sure, this is how video acceleration works on modern graphics hardware after all, but as I said above, this only has a limited scope.

    Let's assume you want to apply some filters in Photoshop. While PS does use the OS to draw its interface, to use the clipboard, to facilitate fast scrolling, to read a file, etc., the routine that actually performs the filter runs inside PS (or, to be exact, in the associated plugin) - it's really just some floating-point code. As such, there's little opportunity for OS X to intercept that - it can't possibly intercept any and all FP code and detect whether it might be possible to offload it to our hypothetical coprocessor. All it sees is that PS performs a bunch of FP operations, but it doesn't "know" that this is in fact a filter being performed on an image.

    Pretty much the same is true if you look at C4D or FCP - unless either the program or the respective plugins or codecs are specifically rewritten to take advantage of such a coprocessor, those programs couldn't use it for more than simple, OS-related functions (which usually don't account for a lot of CPU utilisation).





    [quote]If the OS knows it needs to farm out graphics related stuff to Raycer, everything which uses that stuff will benefit.[/quote]



    The problem is, how does the OS know that some arbitrary code that happens to contain a lot of FP instructions is actually "graphics related stuff"?
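    To make that concrete, here's a deliberately tiny sketch in C (purely hypothetical names, nothing from any real OS X framework):

    [code]
    #include <stdio.h>
    #include <stddef.h>

    /* A Photoshop-style filter is just an ordinary floating-point loop.
     * From the OS's point of view this is indistinguishable from any other
     * math, so there is nothing here to intercept and hand to a coprocessor. */
    static void BlurInPlace(float *pixels, size_t count)
    {
        for (size_t i = 1; i + 1 < count; i++)
            pixels[i] = (pixels[i - 1] + pixels[i] + pixels[i + 1]) / 3.0f;
    }

    /* Stand-in for an imaginary framework entry point; returns 0 on success.
     * Here it always fails, as if the special chip weren't present. */
    static int HypotheticalOffloadFilter(const char *name, float *px, size_t n)
    {
        (void)name; (void)px; (void)n;
        return -1;
    }

    /* Only if the *application* is rewritten like this -- explicitly handing
     * the work to a framework -- could the OS route it to special hardware. */
    static void BlurMaybeAccelerated(float *pixels, size_t count)
    {
        if (HypotheticalOffloadFilter("blur", pixels, count) != 0)
            BlurInPlace(pixels, count);   /* fall back to the plain CPU loop */
    }

    int main(void)
    {
        float img[8] = { 0, 1, 2, 3, 4, 5, 6, 7 };
        BlurMaybeAccelerated(img, 8);
        printf("img[3] = %g\n", img[3]);
        return 0;
    }
    [/code]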



    Bye,

    RazzFazz
  • Reply 37 of 48
    [quote]Originally posted by RazzFazz:

    The problem is, how does the OS know that some arbitrary code that happens to contain a lot of FP instructions is actually "graphics related stuff"?

    Bye,

    RazzFazz[/quote]





    Boggle... doesn't the API do that?



  • Reply 38 of 48
    jll Posts: 2,713 member
    [quote]Originally posted by RazzFazz:

    Let's assume you want to apply some filters in Photoshop. While PS does use the OS to draw its interface, to use the clipboard, to facilitate fast scrolling, to read a file, etc., the routine that actually performs the filter runs inside PS (or, to be exact, in the associated plugin) - it's really just some floating-point code. As such, there's little opportunity for OS X to intercept that - it can't possibly intercept any and all FP code and detect whether it might be possible to offload it to our hypothetical coprocessor. All it sees is that PS performs a bunch of FP operations, but it doesn't "know" that this is in fact a filter being performed on an image.

    Pretty much the same is true if you look at C4D or FCP - unless either the program or the respective plugins or codecs are specifically rewritten to take advantage of such a coprocessor, those programs couldn't use it for more than simple, OS-related functions (which usually don't account for a lot of CPU utilisation).[/quote]



    I don't think you get his point. FCP doesn't contain code to encode an MPEG2 file - it asks QT to do it, and if QT is written to take advantage of a new chip, FCP doesn't have to be recoded.



    Lots of apps depend on QT (QT is a lot more than an audio/video player, which people tend to forget when comparing it with WMP/Real, btw).
  • Reply 39 of 48
    dansg3 Posts: 12 member
    I had the same questions at first about why bother with another video chip when the current graphics cards are doing pretty well at things like decoding MPEG2 and the processor is encoding faster than real time. And for those of you wondering whether you would have to rewrite applications to take advantage of it, I have an answer now. First of all, after having another discussion with my friend, I believe a lot of you are right about it not being a simple graphics chip, but rather a DSP. This chip will have 4-8 AltiVec units on it (straight from my source) and will automatically take on anything that can be written for AltiVec. So AUDIO and VIDEO can benefit from this extra processing. I think this will be a tremendous boost in performance for the new models we should be seeing. Another thing I thought of that might go along with this, and you guys can make your own decisions, is that if, say, Apple went with a G5 variant from IBM that didn't have AltiVec, then the software could still be using the AltiVec on this Raycer/northbridge chip. What do you think?
  • Reply 40 of 48
    rolandg Posts: 632 member
    Does Apple have the rights to AltiVec?