[quote]Originally posted by grad student:
" 1. Custom ASIC's would not be unheard of for Apple machines. They have used different types before and outsourced production. Why do you think their R&D budget is so large? If they only used existing stuff they wouldn't be much more innovative then DELL. If the object is handling the 2D Aqua processing the economy of scale comes from 10-15 years of consistently designed hardware and OS combinations.
2. Why buy graphics hardware engineers if not to make graphics hardware? Sure EE/CE's can do lots of different stuff, but you buy the package for it's demonstrated strengths.
3. You only compete with NVidia and ATI if you go 3D. If the chip is optimized for 2D rendering effects it is in a separate niche, one that the big boys have already told Apple they weren't interested in. Hence no current Quartz acceleration. Current accelerators (via drivers) just intercept OpenGL(or like) API calls, shuffling then to te GPU. If the effect you need isn't supported explicitly in the GPU hardware, there is no API call and no way to accelerate the code. Current Vertex/Pixel shading programmability is not able to do Aqua style processing and probably won't be for quite some time, if ever. "
OK, I'll elaborate, and excuse me for breaking it down like this, AirSluf.
your point 1:
you are correct, custom ASICs have been used in Macs for a very long time. What are these ASICs? Well, the motherboard chipsets come to mind - and how well does Apple do? Consistently behind the times - bus speeds and the types of supported memory are at least a year or more behind what is offered in the PC world. Why do Macs cost more? Well, that larger R&D budget comes to mind (along with the higher profit margins), and again lower economies of scale.
your point 2:
as far as I know, Raycer never shipped a product, and in my book they have "demonstrated" nothing. They must have been worth something to Apple, and I presume it was for the engineers. Again, IF Apple is making a graphics chip, it would be the whole shebang, like I said. What was Raycer's focus? High-end 3D graphics. Although they demonstrated nothing, if they WERE making a graphics chip, don't cha think it'd be 3D? And if they demonstrated any strength at all, it'd be in 3D graphics.
your point 3:
let's just say it is a 2D-only chip. How is it supposed to get to the frame buffer? An Apple-designed 3D card using NVIDIA chips that were never designed to be used with an "Aqua" chip? How about having NVIDIA modify their chips to work with Apple tech? That's likely. Or put it on the system bus, and suffer the same problems the G4 has pushing pixels over the AGP bus? Again, if this hypothetical chip touches a pixel, it will do all the rendering: 2D, 3D, and "Aqua" duties (technically 2D as well).
In your point 3 you beautifully demonstrate how you are speaking about that which you do not know.
"Current accelerators (via drivers) just intercept OpenGL(or like) API calls, shuffling then to te GPU. If the effect you need isn't supported explicitly in the GPU hardware, there is no API call and no way to accelerate the code"
OK, so you want to talk about GL? Well, what exactly in Aqua can't be handled by today's 3D hardware? Transparency? No problem. The genie effect? Simple: take the window buffer and use it as a texture on a series of triangles that bend to the desired shape of the window.
You must mean the vector nature of Quartz. Well, with rendering to memory, and the hardware behind GL evaluators, bezier curves CAN be rendered in hardware, and so can the rest of Quartz. The ONLY sticky point is some of the anti-aliasing - but with a little clever programming, that can be done in hardware as well, especially on the GF3 generation of hardware.
Listen, no offense, but it is important to keep an eye on reality. An Apple Quartz-only chip - not gonna happen. An Apple complete graphics subsystem - maybe, but I highly doubt it. 20X gains would be easy if some of the rendering were offloaded onto the 3D subsystem. I could be wrong, it has happened countless times before - but my experience and education brought me to my conclusions, and I think that those who expect the impossible will be greatly disappointed. So live your lives, stop looking at this computer, and we'll all know the answer when Apple ships their gear.
Grad student, your excellent thread reads like it's coming not from a grad student, but from a doctor honoris causa!