Confirmed: Aqua Hardware acceleration


Comments

  • Reply 21 of 73
whisper Posts: 735 member
About all this "a 20x speed-up could only come from hardware" stuff...

One of my CS profs once got a program's running time down from 11 days to 8 hours just by changing the order of two function calls. If a 33x speedup is possible from changing "funcA(); funcB();" to "funcB(); funcA();", then I'm pretty sure a 20x improvement from optimizations and/or improved algorithms is within the realm of possibility. I'm not saying that the 20x boost isn't from hardware, just that it doesn't have to be.
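
    For what it's worth, here is a made-up C sketch of the kind of thing whisper describes (nothing to do with the professor's real program): the only difference between the fast and slow runs is the order of two calls, because sorting first lets every later lookup use binary search instead of a linear scan.

    [code]
    /* Hypothetical illustration (not the professor's actual program): the
     * only change between the fast and slow runs is the order of two calls.
     * If sort_data() runs first, every lookup is an O(log N) bsearch;
     * run the queries first and every lookup is an O(N) scan instead.      */
    #include <stdio.h>
    #include <stdlib.h>

    #define N        50000
    #define QUERIES  50000

    static int data[N];
    static int is_sorted = 0;

    static int cmp_int(const void *a, const void *b) {
        int x = *(const int *)a, y = *(const int *)b;
        return (x > y) - (x < y);
    }

    static void sort_data(void) {                 /* "funcA" */
        qsort(data, N, sizeof data[0], cmp_int);
        is_sorted = 1;
    }

    static long run_queries(void) {               /* "funcB" */
        long hits = 0;
        for (int q = 0; q < QUERIES; q++) {
            int key = q * 7;
            if (is_sorted) {
                hits += bsearch(&key, data, N, sizeof data[0], cmp_int) != NULL;
            } else {
                for (int i = 0; i < N; i++)       /* linear-scan fallback */
                    if (data[i] == key) { hits++; break; }
            }
        }
        return hits;
    }

    int main(void) {
        for (int i = 0; i < N; i++)
            data[i] = rand() % (N * 4);

        sort_data();                  /* swap these two calls and the same  */
        long hits = run_queries();    /* answer takes far longer to compute */

        printf("hits: %ld\n", hits);
        return 0;
    }
    [/code]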
  • Reply 22 of 73
stimuli Posts: 564 member
    GAAAAHHH!!!! You people just don't get it!

    If a HARDWARE CHIP does z-sorting, not only does it OFFLOAD THESE FUNCTIONS FROM THE CPU (software), but it DRAMATICALLY REDUCES THE AMOUNT OF DATA that needs graphic transformations.



Useless info gets chucked out (i.e., occluded triangles in 3D, hidden window data in 2D), and the data that remains gets hardware z-sorting for transparency!



Sure, the latest Macs run Aqua fine... in software. I checked out the 600MHz iBook and was impressed with its snappiness and clarity. But the point is you are using CPU cycles to do this... it's like playing Quake 1 (software rendering) vs. Quake III on a GeForce3 Ti.



This isn't just 'hardware acceleration for Aqua'; this is an advanced sorting algorithm implemented in hardware, and it can be used for many things... like... oh, I don't know... 3D!!!! As in Maya! As in OpenGL, that mysterious API Apple included at THE CORE of OS X.
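
    As a rough software sketch of the z-sort-and-cull idea stimuli describes above (a toy illustration in C, not Apple's or Raycer's actual design): sort the drawables back to front by depth, throw away everything hidden behind the nearest opaque full-screen surface, and only the survivors go on to be transformed and composited.

    [code]
    /* Toy illustration of software z-sorting plus occlusion culling --
     * the work a z-sorting unit would take off the CPU. Not Apple's or
     * Raycer's actual design.                                           */
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct {
        const char *name;
        float z;          /* depth: larger = farther from the viewer       */
        int   opaque;     /* an opaque surface hides everything behind it  */
        int   covers_all; /* covers the whole viewport? (simplified)       */
    } Item;

    static int by_depth_back_to_front(const void *a, const void *b) {
        float za = ((const Item *)a)->z, zb = ((const Item *)b)->z;
        return (za < zb) - (za > zb);   /* sort by descending z */
    }

    int main(void) {
        Item scene[] = {
            { "desktop",          100.0f, 1, 1 },
            { "hidden palette",    50.0f, 1, 0 },
            { "fullscreen video",  20.0f, 1, 1 },  /* hides everything behind it   */
            { "alert panel",       10.0f, 0, 0 },  /* translucent, composited last */
        };
        int n = (int)(sizeof scene / sizeof scene[0]);

        /* 1. z-sort back to front */
        qsort(scene, n, sizeof scene[0], by_depth_back_to_front);

        /* 2. cull: everything behind the nearest opaque full-screen surface
         *    is discarded before any expensive transform/composite work    */
        int first_visible = 0;
        for (int i = 0; i < n; i++)
            if (scene[i].opaque && scene[i].covers_all)
                first_visible = i;

        /* 3. only the survivors are transformed and composited */
        for (int i = first_visible; i < n; i++)
            printf("draw %-16s (z = %5.1f)\n", scene[i].name, scene[i].z);

        return 0;
    }
    [/code]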
  • Reply 23 of 73
stimuli Posts: 564 member
And, for the sake of clarity, this is NOT a graphics card. This is not a chip ON a graphics card. This is a co-processor. Like an FPU. Like a GPU. Like an integer unit, or an MMU. It is a z-sorting unit.



    If a graphics card can produce 40 million triangles per second, and it is rendering ONLY the visible polygons, and offloading all this work from the CPU... surely you can see what this means. 3D graphics like NO OTHER PLATFORM.



    The fact that this can also speed up Aqua, and other computations, is a swell side-effect.



    Graphics workstations, people. And the ultimate gaming platform.
  • Reply 24 of 73
[quote]As far as I know, Raycer never shipped a product, and in my book they have "demonstrated" nothing. They must have been worth something to Apple, and I presume it was for the engineers.[/quote]



This was like a big topic... a year ago. I've forgotten most of the specifics, but this is what I remember.



There was a lot of debate over whether Apple acquired Raycer for their technology or their engineers. I know for a fact that one of Raycer's engineers went on to be the lead engineer for the next-generation iMac... this was actually on his website, along with his resume - it was quite impressive. I believe most of the other engineers have already left Apple.



Even though Raycer never actually developed anything... they did hold a number of patents, which Apple scored with the purchase. These patents are in regard to 3D rendering. When an object is rendered in 3D, it is made of tiny polygons. What Raycer focused on was a chip which would only render the polygons which were visible on the monitor, as opposed to the entire object, thus vastly reducing rendering times.



There is only one other patent which applies to this method of rendering 3D, and it was bought by someone like Intel a couple of months before Apple purchased Raycer. Whether Apple will ever actually use the technology is one thing; however, they stand to gain quite a bit if ATI or nVidia chose to use this technology via Apple's patent.



You also have to remember that this purchase happened at a time when relations were tense between Apple and ATI. Even though nVidia is now in the game, Apple probably doesn't feel like it can depend on either company a whole lot.



Mix this in with the fact that Mr. Jobs owns Pixar, and that high-end 3D is an area Apple would very much like to penetrate, and I would certainly not count out Apple creating its own 3D acceleration chips, whether that would mean competing with 3D companies or not.
  • Reply 25 of 73
geek Posts: 14 member
    As for the probability of a new 2D graphics chip, I'm just as clueless as the next guy. However, as for the question of whether Apple (or anyone) can do a custom ASIC, I can comment on that.



Yes, ASICs are expensive, but companies large and small design them all the time... layout and fabrication are almost always handed over to the experts at IBM, Toshiba, LSI Logic, etc. If you look at the chips my company made, they all say "Marconi" or "Fore Systems" on them, but they were all built by one of the above companies.



    When I worked at Alcatel, we designed ASICs all the time... we had a whole department for it.



    Alcatel may have been big, but it is totally within the reach of even a small company (like Fore Systems was a few years ago) to design their own ASICs. For Apple, it's a drop in the R&D bucket. All they need is a hardware description language like Verilog or VHDL in which to design the functionality and a contract with an ASIC fab to lay it out and build it.
  • Reply 26 of 73
stimuli Posts: 564 member
Besides which, although I am too lazy to type 'ASIC' into a search engine, aren't Pangea, UniNorth, etc. custom-designed chips by Apple?
  • Reply 27 of 73
geek Posts: 14 member
    [quote]Originally posted by stimuli:

Besides which, although I am too lazy to type 'ASIC' into a search engine, aren't Pangea, UniNorth, etc. custom-designed chips by Apple?[/quote]



    I believe so, stimuli... and an excellent example!



Also, I forgot to mention that custom ASICs actually become quite cheap if you buy a few hundred thousand of them... a couple of bucks each after the NRE is paid (Non-Recurring Engineering costs -- typically a few hundred thousand $$).
  • Reply 28 of 73
Yes, Apple can afford to make an ASIC, and yes, they have done it before. The motherboard chipsets mentioned are examples of this, but those chips HAD to be made; there would be no way to make a Macintosh as we know it without Apple designing them. Equivalent chips are not available off the shelf. Also, an ASIC specifically for Quartz on top of the NVIDIA or ATI hardware would add a lot of expense to the resulting Macintosh, and there are the technical difficulties of making a separate 2D chip integrate with the 3D subsystem. My argument stands: even if a separate 2D and 3D chip were possible, would it justify the ADDED expense to the resulting Mac? So... if Apple were to make the whole graphics subsystem (2D+3D, and they are capable...), that might make sense, but I doubt it for economic reasons, and because Apple is not well suited to compete with NVIDIA or ATI...
  • Reply 29 of 73
    "custom ASICs actually become quite cheap if you buy a few hundred thousand of them.... couple of bucks each after the NRE is paid (Non Recoverable Engineering costs--typically a few hundred thousand $$)"



What you describe is indeed true about ASICs: as economies of scale (a phrase I seem to be using a lot lately...) increase, the cost per chip decreases. However, when you say a couple of bucks each after the NRE is paid, that is misleading... Intel has huge economies of scale, but do you actually think a P4 costs a couple of bucks each even with the scale they have? Of course not. The cost per chip is (NRE + manufacturing cost per chip * # of chips) / # of chips. So, to say a couple of bucks is quite an oversimplification... for simple/small ASICs at big scale, sure, but in general it isn't that easy...
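
    To make that arithmetic concrete, here is the formula from the post above as a tiny C program (the numbers are invented for illustration; nobody in this thread knows the real NRE or per-unit costs): at high volume the NRE amortizes toward zero, but the per-chip manufacturing cost never goes away, which is exactly the P4 point.

    [code]
    /* cost_per_chip = (NRE + unit_cost * volume) / volume
     *               = unit_cost + NRE / volume
     * Illustrative numbers only -- the NRE washes out at high volume,
     * but the per-unit manufacturing cost never does.                 */
    #include <stdio.h>

    static double cost_per_chip(double nre, double unit_cost, double volume) {
        return (nre + unit_cost * volume) / volume;
    }

    int main(void) {
        /* small ASIC, big volume: the NRE nearly disappears */
        printf("$%.2f\n", cost_per_chip(500000.0, 2.0, 5000000.0));   /* $2.10  */

        /* big, complex die: the unit cost dominates no matter the volume */
        printf("$%.2f\n", cost_per_chip(500000.0, 60.0, 5000000.0));  /* $60.10 */
        return 0;
    }
    [/code]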
  • Reply 30 of 73
msp Posts: 40 member
    The Raycer thing sounds a lot like the wild speculation that was going around these boards 6 months ago...



Come to think of it, so does the G5 towers in January / Apollo in PowerBooks and iMacs stuff.



I still say we're getting G4 Apollo towers in January, but I'd love to be pleasantly surprised.
  • Reply 31 of 73
    Why is it people think the new Mac will have a 400MHz bus? The only thing I know of that does is Intel, and that's only with RDRAM. Do you really expect to see Apple adopting Rambus?



I expect PC2100 DDR, which would mean a 266MHz bus.
  • Reply 32 of 73
cinder Posts: 381 member
    Who says that this little GPU is trying to compete with nVidia/ATi?!?



    Why couldn't this thing just be an augmentation to the G4/5 processor and the main video card?



Apple is not stupid enough to try to create their own video cards to compete with ATI and nVidia when they could just buy them from those companies like they are doing now.



    I don't see this chip being upgraded often, either. It's not like you'll have to push out Apple-celerator 2 64MB DDR in the next year to compete - there's nothing to compete with if you're still using current nVidia hardware in addition.



Maybe Apple foresees that 3D processors may become the next floating-point-style add-ons for processors...





Apple is trying to add more exclusive benefits. They started with software, and are now moving on to hardware.

    Seems pretty logical to me.
  • Reply 33 of 73
buonrotto Posts: 6,368 member
    Stimuli, you've written the first posts in quite a long time to get me all hot and bothered. For once, I really am setting myself up for disappointment come January.



    PS: would this sort of thing affect Photoshop use/speed?
  • Reply 34 of 73
hos Posts: 31 member
    A little bit of math.



    Apple sells roughly 4 million Macs per year (all included).



    So, if Apple were to build an ASIC which went into all 4 million of them, the cost of the ASIC would be:



    Assume NRE of $1,000,000 for the sake of argument.

    Assume the manufacturing cost per chip is $10.

    Assume the number of chips produced is 4,000,000.



    Then we get:

    ($1,000,000 + ($10)*4,000,000)/4,000,000

    =

    $10.25 each



You can play with these numbers all day: make your NRE costs $10m instead of $1m. Make your manufacturing cost $20 per chip. Drop the number of chips produced to 1m instead of 4m.



    ($10,000,000 + ($20)*1,000,000)/1,000,000

    =

    $30 each



    The point is, in the kind of volumes that Apple can provide, their cost of producing custom ASICs is cheap compared to the total cost of each Mac sold. Even for the $799 iMac, a $30 chip represents less than 4% of the total system cost.



    Just some things to think about,



    -HOS
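
    Plugging HOS's assumed numbers straight into the formula from Reply 29 (these are HOS's hypotheticals, not real Apple figures), the arithmetic checks out, including the "less than 4% of a $799 iMac" point:

    [code]
    /* HOS's assumed numbers from the post above -- not real Apple figures. */
    #include <stdio.h>

    int main(void) {
        /* best case: $1M NRE, $10/chip manufacturing, 4M chips */
        double best  = (1000000.0  + 10.0 * 4000000.0) / 4000000.0;  /* $10.25 */
        /* worst case: $10M NRE, $20/chip manufacturing, 1M chips */
        double worst = (10000000.0 + 20.0 * 1000000.0) / 1000000.0;  /* $30.00 */

        printf("best case:  $%.2f per chip\n", best);
        printf("worst case: $%.2f per chip\n", worst);
        printf("worst case as share of a $799 iMac: %.1f%%\n",
               100.0 * worst / 799.0);                               /* ~3.8%% */
        return 0;
    }
    [/code]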
  • Reply 35 of 73
    Now you guys got me thinking again!



    I'm skeptical, but since I don't understand the technological limitations I have to go with what other people say.



IF Apple were able to implement such a chip, I believe this would be an amazingly good move. If Macs could free the CPU from doing GUI-related tasks, then OS X would suddenly scream. The cool thing about this hypothetical chip is that it would allow Apple to do things with their GUI that Wintels cannot do.



This would also make Macs SEEM much faster than they really are. A person's feel for the speed of a computer is often based not on its actual speed, but on the speed and responsiveness of its GUI. If Macs had a separate chip accelerating the GUI, then suddenly even a 500 MHz iMac would seem as fast as or faster than a 2 GHz Wintel dinosaur.



    Or something. It's late and I'm running out of steam. Catch you all later..
  • Reply 36 of 73
prim Posts: 33 member
There is another possibility that has been discussed before. Maybe the Aqua interface is not accelerated by a Raycer chip but simply by new nVidia hardware. But what about a Power Mac with an ATI card? In fact, the G5 could have a soldered nVidia unit built into it. The enclosures are sealed perhaps because there isn't any graphics card in the AGP port... There could be an integrated nVidia nForce chipset with DDR SDRAM on the motherboard, all parts connected through HyperTransport. The chip could be a GeForce3 Ti 200 or even the incoming NV17 (GeForce3 MX), though I think this one is more intended for the next LCD iMac.
  • Reply 37 of 73
prim Posts: 33 member
Actually, given what Stimuli said, a G5 could have an nForce chipset AND a Raycer coprocessor near or inside it.
  • Reply 38 of 73
    [quote]Originally posted by HOS:

A little bit of math.



    Apple sells roughly 4 million Macs per year (all included).



    So, if Apple were to build an ASIC which went into all 4 million of them, the cost of the ASIC would be:



    [blah...blah...blah]



    $30 each



    [ta da! ...]



    Just some things to think about,

[/quote]



    I always knew I was in the wrong industry.



In automotive, we buy micros for $1-2 apiece, and people like Moto laugh at us because we don't get the volumes like in the computer industry.



We sell entire control modules, like for airbags, for less than $30 a pop. And our quality control for these things is almost as good as M$'s or Apple's for their OS (no -- that part's a joke).



This nebulous graphics chip doesn't sound out of the ordinary in terms of manufacturing. If the design is done by Apple, so much the better for the ASIC supplier. If I were at Apple's purchasing dept., I would be shooting for the $5-10 range, while demanding $2.50. At least, that's what customers do to us.



BTW, grad student: Apple also has the highest margins in the computer industry (>20%), slightly higher than the 0.01% we make in the automotive industry (that's also a joke, in more ways than one). That adds to consumer costs, and is chicken/egg to their R&D costs.



Sorry, wandering -- gotta get off caffeine.



    ~e
  • Reply 39 of 73
airsluf Posts: 1,861 member
  • Reply 40 of 73
sinewave Posts: 1,074 member
    [quote]Originally posted by pscates:




    Well, while we're on the subject: I'd like to see that stupid, cheesy-looking, low-rent puff of smoke that appears when you drag something from the dock (how LAME is that, compared to the soft, elegant slickness of Aqua elsewhere?!?) instead be replaced by a cool fading water ripple.



    I think that would be pretty cool because that smoke puff is simply awful.



Would look MUCH nicer and less amateurish... not to mention, tie in quite nicely with the whole "Aqua" vibe.[/quote]





You can change that yourself... I have.