Quartz Extreme/Future Motherboard Question

Posted in Future Apple Hardware, edited January 2014
Raycer.



OK, I had to get that out in front so I could piss off all of you "Raycer Chip" haters.



Now, prior to QE being announced, no one thought it was intelligent to put a dedicated Quartz chip on the motherboard because it would be too damn expensive. Now that QE has been announced, things look different to me.



Would it be possible and cost effective for Apple to produce a dedicated graphics chip (most likely Nvidia) devoted solely to rendering video? Although I can't remember the reasons why, I do remember reading that a machine can't have two AGP busses. But would it be possible to piggyback on an AGP bus? Possibly splitting the bandwidth much like the dual-monitor cards do now, only with half of an 8X AGP bus going to a render chip?
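(Rough numbers, if I have the AGP math right: 8X AGP is 32 bits × 66 MHz × 8 transfers per clock, or roughly 2.1 GB/s, so even a half share would leave about 1 GB/s for the render chip -- about what a full 4X slot delivers today.)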



If possible, it seems like a good direction to go to get workstation-like production out of industry standard parts.



Now just to clarify, I don't think/hope/care if this dedicated bus/chip solution has anything to do with the Raycer purchase. To me, Quartz Extreme makes this possible, not any specific hardware solution. That also gives Apple a leg up on the competition.

Comments

  • Reply 1 of 8
    amorph Posts: 7,112 member
    The simplest thing is probably to wait for the next generation or two of video cards.



    Why?



    First, AGP is on its way out as a CPU-to-card bus, because it's a fancy PCI and PCI is being deprecated as a system bus. There are a lot of contending standards to replace it which are faster and cheaper to implement. If Apple is going to a HyperTransport or RapidIO based fabric, they can weave the video card (bus) into that and get fat bandwidth to and from memory and to and from the processor. If the fabric is switched, so much the better (but that's the sort of thing we'll "only" see in the high end to begin with). Note that you can still have an AGP slot and an AGP bus in this scenario; it's just that the bus would now hook the card up to the HT/RIO fabric, rather than going directly to the CPU(?).



    Second, video cards are one or two generations away from being programmable enough to do rendering, in which case you get your wish - a dedicated graphics coprocessor - without Apple being tied to a specific vendor, and without Apple relying on a custom, non-upgradeable ASIC.
  • Reply 2 of 8
    lemon bon bon Posts: 2,383 member
    Graphics cards turning into co-processors for rendering. That...sounds amazing.



    Lemon Bon Bon
  • Reply 3 of 8
    tsukurite Posts: 192 member
    Given the number of transistors being packed into GPUs, I wouldn't be surprised. Don't the latest chips far outstrip the actual CPU?
  • Reply 4 of 8
    bunge Posts: 7,329 member
    A programmable GPU would actually be helpful. I guess my thought was less about an individual chip on the motherboard and more along the lines of a second video card dedicated to rendering. Hell, it could BE an industry-standard AGP card if the motherboard were wired properly (as the current dual-monitor cards are).



    This makes it a relatively cheap and helpful upgrade that gives Apple a decided advantage over the competition. I think.
  • Reply 5 of 8
    programmer Posts: 3,458 member
    [quote]Originally posted by bunge:

    A programmable GPU would actually be helpful. I guess my thought was less about an individual chip on the motherboard and more along the lines of a second video card dedicated to rendering. Hell, it could BE an industry-standard AGP card if the motherboard were wired properly (as the current dual-monitor cards are).

    This makes it a relatively cheap and helpful upgrade that gives Apple a decided advantage over the competition. I think.[/quote]



    What you want has arrived in the form of the Radeon 9700, and soon the nVidia NV30. These are very programmable and extremely fast. The bus interface to them is getting faster, and they come with tons of VRAM. They can now store their results in full-precision floating point, and their pipelines are float from start to end. And they are blisteringly fast -- traditional non-realtime shaders can be implemented directly on the programmable hardware and run in real time or near real time.
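    To make that concrete, here's a rough sketch of loading a trivial fragment program on one of these cards via the ARB_fragment_program OpenGL extension. Nothing here is Apple's code -- the function name is made up, error checking is omitted, and the exact headers/extension setup vary by platform -- the point is just that the per-pixel math is now a little program you write rather than fixed-function hardware:

    ```c
    /* Sketch: install a tiny ARB fragment program on a DX9-class card.
     * The program modulates a texture sample by the interpolated vertex
     * color; real uses would be much longer programs in the same vein. */
    #include <OpenGL/gl.h>
    #include <OpenGL/glext.h>
    #include <string.h>

    static const char *kFragProg =
        "!!ARBfp1.0\n"
        "TEMP texel;\n"
        "TEX texel, fragment.texcoord[0], texture[0], 2D;\n"
        "MUL result.color, texel, fragment.color;\n"
        "END\n";

    void install_fragment_program(void)   /* name is illustrative only */
    {
        GLuint prog;

        glGenProgramsARB(1, &prog);
        glBindProgramARB(GL_FRAGMENT_PROGRAM_ARB, prog);
        glProgramStringARB(GL_FRAGMENT_PROGRAM_ARB,
                           GL_PROGRAM_FORMAT_ASCII_ARB,
                           (GLsizei)strlen(kFragProg), kFragProg);

        /* From here on, every fragment runs through kFragProg. */
        glEnable(GL_FRAGMENT_PROGRAM_ARB);
    }
    ```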
  • Reply 6 of 8
    yevgeny Posts: 1,148 member
    [quote]Originally posted by Programmer:

    What you want has arrived in the form of the Radeon 9700, and soon the nVidia NV30. These are very programmable and extremely fast. The bus interface to them is getting faster, and they come with tons of VRAM. They can now store their results in full-precision floating point, and their pipelines are float from start to end. And they are blisteringly fast -- traditional non-realtime shaders can be implemented directly on the programmable hardware and run in real time or near real time.[/quote]



    Agreed. What I can never understand about all the Raycer rumors is why anyone thinks that Apple is going to reinvent the wheel when there is already so much competition in the GPU market. The days of Apple making all their own custom chips are long gone.



    I think that Apple bought out Raycer so that it could get ahold of some programmers to make Quartz Extreme. These kinds of people do not grow on trees.
  • Reply 7 of 8
    macronin Posts: 1,174 member
    I was one of those who thought for the longest time that Apple was going to produce a new Apple-branded OpenGL card, based on Raycer technologies...



    I was also one of those people who thought for the longest time that Apple was going to produce a new Apple-branded 3D DSP chip, for inclusion in the once-fabled UMA2 chipset, based on Raycer technologies...



    Come on, raise your hands if you remember the whole UMA1/UMA2 chipset/architecture rumours and theories...!



    But I agree with the above poster...



    Apple bought Raycer for their OpenGL engineers...



    Which shows where things have been going, and how long Quartz Extreme has been in the works...



    Steve knows exactly where he wants to position the company, and he seems intent upon taking over the DCC/Entertainment segment of the high-end markets...



    Look for an Apple Xstation, coming to the Apple Store soon!
  • Reply 8 of 8
    programmerprogrammer Posts: 3,458member
    Hardware guys are hardware guys, and software guys are software guys. There is a bit of crossover, but not much. The hardware team from Raycer is probably working on Apple silicon of some form, and based on Moki's comments, I'd guess that they are putting GPU-like units into the chipset (which Apple does create and will likely continue creating into the future). If the system software is done well (and Apple usually does this well), then the "chipset-GPU" and the actual ATI/nVidia GPU would both get along just fine... which one gets used would depend on the details of how the application is using OpenGL & Quartz.

    Keep in mind that Quartz's rendering code still operates in main memory, as opposed to the Quartz Extreme compositor, which is what moves and combines the images to/from VRAM for display. A "chipset-GPU" could be used for tasks which never reach the video card -- audio processing or video compression, for example.
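    To give a feel for what that Quartz Extreme compositing step actually asks of the GPU, here's a bare-bones sketch in plain OpenGL 1.x -- nothing Apple-specific, and the function name, buffer, and sizes are made up for illustration -- of blending one window's CPU-rendered backing store onto the screen as a textured quad:

    ```c
    /* Sketch: composite one window's backing store (rendered by the CPU
     * into main memory) as an alpha-blended textured quad on the GPU.
     * A real compositor would cache the texture instead of re-uploading
     * it every frame, and would use a rectangle-texture extension so
     * window sizes don't have to be powers of two. */
    #include <OpenGL/gl.h>

    void composite_window(const void *backing, int win_w, int win_h,
                          float x, float y)
    {
        GLuint tex;

        /* Upload the CPU-rendered pixels into VRAM as a texture. */
        glGenTextures(1, &tex);
        glBindTexture(GL_TEXTURE_2D, tex);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_LINEAR);
        glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_LINEAR);
        glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA, win_w, win_h, 0,
                     GL_RGBA, GL_UNSIGNED_BYTE, backing);

        /* Blend the window over whatever is already on screen. */
        glEnable(GL_TEXTURE_2D);
        glEnable(GL_BLEND);
        glBlendFunc(GL_ONE, GL_ONE_MINUS_SRC_ALPHA); /* premultiplied alpha */

        glBegin(GL_QUADS);
        glTexCoord2f(0.0f, 0.0f); glVertex2f(x,         y);
        glTexCoord2f(1.0f, 0.0f); glVertex2f(x + win_w, y);
        glTexCoord2f(1.0f, 1.0f); glVertex2f(x + win_w, y + win_h);
        glTexCoord2f(0.0f, 1.0f); glVertex2f(x,         y + win_h);
        glEnd();

        glDeleteTextures(1, &tex);
    }
    ```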