AppleInsider › Forums › Mac Hardware › Future Apple Hardware › Confirmed: Aqua Hardware acceleration

Confirmed: Aqua Hardware acceleration

post #1 of 74
Thread Starter 
Check out this link for a very juicy read:

<a href="http://www.architosh.com/news/2001-11/2001a-1130-appleg5.phtml" target="_blank">http://www.architosh.com/news/2001-11/2001a-1130-appleg5.phtml</a>

Most of what's said confirms other rumors, which is very reassuring. But there is a bit of new info that is possibly more interesting than even G5 Powermacs:

"Our sources generally agree on the fact that these new machines will have much faster bus speeds (400MHz seems to be the number to bet on). However, much about these test boxes remains shrouded in mystery because these units come shipped in "sealed enclosures". Sources seem to indicate that there is something going on with respect to "graphics processing" and that the effect of this is a mystery hardware item.

"We have adequate reason to suspect that the first fruits of the Raycer Graphics buyout by Apple over two years ago may be at play here. Reportedly, some graphics functions are insanely fast (perhaps at factors of 10x - 20x). The key word here is "some" functions."

This mystery hardware item undoubtedly accelerates the OS X GUI (which would be "some functions", namely, the Quartz effects that are currently dog-slow). Everything makes sense now... by the time OS X is the default OS (March or so), it will be hardware accelerated.

The real question is, which Macs will offer hardware acceleration? If this hardware is made by Apple, then it will be cheap for them to implement. So I think it's safe to say that ALL desktop Macs will get the new accelerator: the new iMacs and the new Powermacs. This hardware acceleration also gives OS X a benefit that Windows doesn't get: it can do GUI effects that Windows cannot, because it's hardware accelerated. I wouldn't be surprised if Apple even added some effects to Aqua, just to rub M$'s nose in it!

Another part of this that makes sense is that Apple probably planned on this new hardware coming out by MWNY this year. This would explain OS X's initial slowness--Apple planned on hardware acceleration. They realized at the last minute that it wouldn't be ready, so they had to sic the OS X team on performance optimizations--optimizations that really only made OS X's performance adequate. There are parts of OS X performance that need 10-20 fold increases in speed, and even a 2 GHz G5 will not deliver this sort of performance boost. This alone proves that Apple has SOMETHING planned before OS X is the default OS.

Another important aspect of this is that it forces old mac users to upgrade. This perfectly fits in with Apple's strategy of making users upgrade to new computers, rather than upgrade their old hardware. If OS X only screams on new hardware, then it's a hell of an incentive.

Some say this technology is the result of Apple's Raycer buyout. Maybe so. It would be in Apple's best interest to make the accelerator themselves, because it can only be used in Macs. What company is going to make a GPU that will sell only to the Mac market? A company bought by Apple, that's who.

What will be interesting is how long it takes Apple to implement this technology across their line. The desktops are a sure bet, but what about laptops? Many are predicting updates to the PowerBook and iBook early next year, so perhaps we won't have to wait long. One can imagine that if only the desktops have this accelerator, it would hurt laptop sales, but then if Apple has both new iMacs and new Powermacs, they might figure they can suffer some lowered laptop sales.



edit: disregard the speculation about a new "Aqua chip". I was having a psychotic episode when I thought that one up.

[ 12-02-2001: Message edited by: Junkyard Dawg ]
post #2 of 74
Regarding the possibility of Apple designing a special "aqua" chip - using the people they got from Raycer to design it - well, it is VERY unlikely. Why? Well, designing ASICs is not cheap. Not only is it expensive just to design the thing, it is expensive as hell to manufacture. The price of doing an ASIC only makes sense if economies of scale kick in, and the entire Mac market doesn't justify it, not from an economic standpoint. Besides, Apple in all likelihood simply bought Raycer for its engineers. Finding a good group of electrical engineers who have experience and have worked together is difficult at best - unless you buy their employer.

The "graphics processing" improvements could be a result of the 4x faster memory bus, the faster CPU, and/or the addition of drivers that take advantage of existing graphics hardware such as the ATI or NVIDIA gear that ships in all Macs. IF - and this is a big IF - Apple were designing any chip to accelerate Aqua, it would not be an "aqua" chip but an entire graphics subsystem: 2D, 3D, Aqua and all. This is remotely possible at the fringes of reality, but don't count on it. It would make much better business sense not to do it.

Apple is not in a position to release graphics chips every 6 months, and how the hell would they deploy them? Upgrade cards? And to whom - Mac users only? Forget it. Think about the economies of scale: they would be competing with NVIDIA and ATI, who have years of experience and a huge market compared to the Mac, meaning even if Apple could make chips as quickly as ATI/NVIDIA, they would cost tons more because the cost of entry is huge. So, bottom line, these pipe dreams of panacea hardware/software won't come to fruition. HOWEVER, things will most certainly get better. Don't disappoint yourselves.

-David
i freebase user interface
post #3 of 74
If they need a special GPU to make menus select and buttons throb, they've got problems.

All the Macs already have hardware graphics acceleration - why not just more efficiently use what's already there?

[edit: OK, grad student said much more eloquently what I was trying to say.]

[ 12-02-2001: Message edited by: BRussell ]
post #4 of 74
Thread Starter 
[quote]The "graphics processing" improvements could be a result of the 4X faster memory bus, the faster CPU, and/or the addition of drivers that take advantage of existing graphics hardware such as the ATI or NVIDIA gear that ships in all macs. <hr></blockquote>

But these changes could not make a Mac 20X faster.

I agree with you, Grad Student, but the reports tell a different story. If people are seeing a 20x boost in speed for some GUI tasks, then it MUST be hardware acceleration at work.

I suppose an alternative explanation is that they figured out a way to offload Quartz rendering to the GPU, but according to people from ATI, this isn't possible.

So we are left with an unlikely explanation - but the only possible one.

Furthermore, Apple doesn't have to upgrade this chip every 6 months, unless Aqua changes. The same chip could be used for years since its duties would never change.

And Apple would only put it in new mobos, they wouldn't let current owners upgrade. That's the whole point of it...to force people to buy new hardware.
post #5 of 74
Junkyard Dawg,

how the hell do you make a GUI operation 20x faster!? How much faster can a menu drop down, and how could you possibly time that? Face it, that is BS
post #6 of 74
post #7 of 74
Just because the enclosure is sealed doesn't mean there is a secret Apple-engineered graphics chip inside.

Note that the reported increase in some graphics functions is clearly stated as a guesstimate (i.e., "perhaps at factors of 10x - 20x"). There has been no hard measurement performed.

The drastic increase in a limited range of functions would indicate to me some form of AltiVec optimization.
post #8 of 74
Sounds like bullshit. Won't happen. Engineers could be good at lots of things, but they have not been put to work on a 'hardware accelerated GUI'. That's just stupid, there's nothing wrong with the way OSX.1 works on any machines sold today. By the end of 2002, new machines will run OSX very quickly, even at the bottom of the consumer line-up. To plan a hardware accelerated GUI from the beginning is even stupider: the money, complexity, and development time could easily be spent on stuff that makes a difference -- a new motherboard, faster memory and disc subsystems, faster I/O, integrating better graphics chips, testing and debugging new processor configs... etc

That Raycer 2D stuff is just bullshit. Deal with it.
IBL!
post #9 of 74
Thread Starter 
[quote] how the hell do you make a GUI operation 20x faster!? How much faster can a menu drop down, and how could you possibly time that? Face it, that is BS<hr></blockquote>

Window resizing.

We'll see what happens soon enough. In the meantime, I don't think it's all that far-fetched that Apple would want to accelerate the Aqua interface, considering that lack of performance is the most common complaint about OS X.
post #10 of 74
Yes, but it will be accelerated through the typical improvements in hardware. I think to a degree Apple was expecting hardware acceleration for the GUI - not specifically, mind you. What they probably expected was that the PPC would have scaled better than it did, and that by the time they released OS X, Macs in general would have been at least 50% faster than they currently are. Imagine, if you will, that the very slowest machine in Apple's line-up had at least the performance of an 867 G4 - would OS X responsiveness even be in question? No, but this 867 describes the second fastest machine they sell, when it should in fact describe the second slowest.

Once again the problem is the state of the PPC. Apple was, reasonably, expecting a certain level of hardware performance that they didn't get. As it is, they're about 12 months behind where they should be.
IBL!
post #11 of 74
[quote]Originally posted by Junkyard Dawg:
<strong>I wouldn't be surprised if Apple even added some effects to Aqua, just to rub M$'s nose in it!</strong><hr></blockquote>

Well, while we're on the subject: I'd like to see that stupid, cheesy-looking, low-rent puff of smoke that appears when you drag something from the dock (how LAME is that, compared to the soft, elegant slickness of Aqua elsewhere?!?) replaced by a cool fading water ripple.

I think that would be pretty cool because that smoke puff is simply awful.

Would look MUCH nicer and less amateurish...not to mention, tie in quite nicely with the whole "Aqua" vibe.
post #12 of 74
junkyard dawg:

for as long as i can remember you have been posting the most retarded threads and replies to people's honest postings.

you are an idiot. and stop creating a new thread every 5 minutes. i look down the list of thread creators and your name repeats over and over.
post #13 of 74
Thread Starter 
Matsu, what you're saying makes perfect sense, and I believe you. But how does one explain a 20x boost in performance for only certain tasks? Even a 2 GHz G5 wouldn't give us a 20x performance boost in anything.


The only way to get a 20x boost in performance is to hardware accelerate something that was not hardware accelerated before. The most obvious thing that would be accelerated is Aqua. Perhaps I'm totally wrong about it being a special chip that accelerates it, maybe Apple found a way to use the GPU to accelerate it. But there must be some explanation for such speed increases, and MHz and bus speed alone are not enough.

Even if performance scaled with MHz, a 1.6 GHz G4 is only about twice as fast as current G4s. Factor in a faster bus and memory, and maybe these new macs really will be 2x as fast, but I doubt it. So something else must explain the observed 20x boost in speed for some tasks.
post #14 of 74
How many times do I need to say it!? The "20x" improvement is anything but a hard number. Someone played around with the UI and made a baseless claim of a "perhaps 10x - 20x" increase in performance.

You're taking bad data to start with and basing all kinds of wild theories around it. Would you be able to tell the difference between 10x and 2x improvement in window resizing? I sincerely doubt it. After 2x it becomes too subtle to unscientifically measure.

The speed increases are almost certainly mis-estimated. The increase is easily explained by a faster processor, motherboard, graphics chip and memory, in addition to AltiVec optimizations.

Sorry if reality isn't nearly as fun or interesting as your fanciful imagination.
post #15 of 74
Thread Starter 
If they say 20x, I assume it means wicked-fast GUI, faster than even OS 9.2.

This sort of performance increase is more than 2x, more than 4x - we're talking about a fundamental change in how the system renders the GUI.

But you are right, about this speculation being based on bad data. Until we have better reports, or the Powermacs finally are announced, we won't really know what sort of performance increases there are.
post #16 of 74
quote:
" 1. Custom ASICs would not be unheard of for Apple machines. They have used different types before and outsourced production. Why do you think their R&D budget is so large? If they only used existing stuff they wouldn't be much more innovative than Dell. If the object is handling the 2D Aqua processing, the economy of scale comes from 10-15 years of consistently designed hardware and OS combinations.

2. Why buy graphics hardware engineers if not to make graphics hardware? Sure, EE/CEs can do lots of different stuff, but you buy the package for its demonstrated strengths.

3. You only compete with NVidia and ATI if you go 3D. If the chip is optimized for 2D rendering effects it is in a separate niche, one that the big boys have already told Apple they weren't interested in. Hence no current Quartz acceleration. Current accelerators (via drivers) just intercept OpenGL (or like) API calls, shuffling them to the GPU. If the effect you need isn't supported explicitly in the GPU hardware, there is no API call and no way to accelerate the code. Current Vertex/Pixel shading programmability is not able to do Aqua-style processing and probably won't be for quite some time, if ever. "

OK, I'll elaborate - and excuse me for breaking it down like this, AirSluf.

your point 1:
you are correct, custom ASICs have been used in Macs for a very long time. What are these ASICs? Well, the motherboard chipsets come to mind - and how well does Apple do? Consistently behind the times - bus speeds and the types of supported memory are at least a year or more behind what is offered in the PC world. Why do Macs cost more? Well, that larger R&D budget comes to mind (along with the higher profit margins), and again, lower economies of scale.

your point 2:
as far as I know, Raycer never shipped a product, and in my book they have "demonstrated" nothing. They must have been worth something to Apple, and I presume it was the engineers. Again, IF Apple is making a graphics chip, it would be the whole shebang, like I said. What was Raycer's focus? High-end 3D graphics. Although they demonstrated nothing, if they WERE to be making a graphics chip - don'tcha think it'd be 3D? If they demonstrated any strength at all, it'd be in 3D graphics.

your point 3:
let's just say it is a 2D-only chip. How is it supposed to get to the frame buffer? An Apple-designed 3D card using NVIDIA chips that were never designed to be used with an "aqua" chip? How about having NVIDIA modify their chips to work with Apple tech? That's likely. Or put it on the system bus, and suffer the same problems the G4 has pushing pixels over the AGP bus? Again, if this hypothetical chip touches a pixel, it will do all the rendering: 2D, 3D, and "aqua" duties (technically 2D also).

In your point 3 you beautifully demonstrate how you are speaking about that which you do not know.

"Current accelerators (via drivers) just intercept OpenGL (or like) API calls, shuffling them to the GPU. If the effect you need isn't supported explicitly in the GPU hardware, there is no API call and no way to accelerate the code."

OK, so you want to talk about GL? Well, what exactly in Aqua can't be handled by today's 3D hardware? Transparency? No problem. The genie effect? Simple: take the window buffer and use it as a texture on a series of triangles that bend to the desired shape of the window. You must mean the vector nature of Quartz. Well, with rendering to memory, and the hardware behind GL evaluators, bezier curves CAN be rendered in hardware, and so can the rest of Quartz. The ONLY sticky point is some of the anti-aliasing - but with a little clever programming, that can be done in hardware as well, especially on the GF3 generation of HW.

Listen, no offense, but it is important to keep an eye on reality. An Apple Quartz-only chip - not gonna happen. An Apple complete graphics subsystem - maybe, but I highly doubt it. 20x gains would be easy if some of the rendering were offloaded onto the 3D subsystem. I could be wrong - it has happened countless times before - but my experience and education brought me to my conclusions, and I think that those who expect the impossible will be greatly disappointed. So, live your lives, stop looking at this computer, and we'll all know the answer when Apple ships their gear.
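[Editor's note: the genie-effect recipe above - window buffer as a texture over a grid of triangles that bend toward the dock - can be sketched without any GL at all, by just computing the warped vertex grid the texture would be mapped onto. A toy sketch; the smoothstep easing and grid resolution are illustrative assumptions, not anything Apple shipped:]

```python
def genie_mesh(win_w, win_h, rows, cols, target_x, target_w):
    """Vertex grid for a genie-style warp: each row of the window is
    squeezed and slid toward a dock slot as it approaches the bottom."""
    mesh = []
    for r in range(rows + 1):
        t = r / rows                        # 0.0 at the window top, 1.0 at the dock
        ease = t * t * (3 - 2 * t)          # smoothstep, so the bend is smooth
        row_w = win_w + (target_w - win_w) * ease            # row shrinks to slot width
        row_cx = win_w / 2 + (target_x - win_w / 2) * ease   # row slides toward the slot
        y = t * win_h
        mesh.append([(row_cx - row_w / 2 + (c / cols) * row_w, y)
                     for c in range(cols + 1)])
    return mesh
```

Texture-mapping the window bitmap onto such a grid is exactly the kind of work 2001-era 3D hardware already did at full speed; the warp itself is a handful of multiplies per vertex.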

-David
i freebase user interface
post #17 of 74
What's the big deal about a custom ASIC? Apple has people working on processor development, and they make their own motherboards, and don't those use chipsets developed in house?
post #18 of 74
I don't know why all of you assume that this purported speed increase in the GUI is coming from some kind of off-loaded graphics accelerator. Anyone who truly understands Quartz realizes that one of its biggest bottlenecks is memory bandwidth. All that slow scrolling, slow refreshing of windows, and slow resizing is due to the fact that the graphics must be composited in main memory before they can be sent to video memory and blitted to the screen. If these new G5s (assuming that they actually exist) really do have a far higher bus speed, then this speed-up in Quartz can be a result of that.
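[Editor's note: the bandwidth argument is easy to put numbers on. A back-of-the-envelope sketch - screen size, pass count, and frame rate here are illustrative assumptions, not measurements of Quartz:]

```python
def composite_traffic_mb(width, height, bytes_per_px, passes, fps):
    """Rough main-memory traffic (MB/s) for software compositing:
    each pass over the frame costs one full read plus one full write."""
    frame_bytes = width * height * bytes_per_px
    return frame_bytes * 2 * passes * fps / 2**20

# 1024x768 at 32-bit color, 3 compositing passes, 30 redraws/sec
traffic = composite_traffic_mb(1024, 768, 4, 3, 30)   # 540.0 MB/s
```

Half a gigabyte per second of memory traffic just to keep windows redrawn is why a faster system bus alone could make Quartz feel dramatically quicker.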
post #19 of 74
grad student, you may want to invest in a spell checker
post #20 of 74
Thread Starter 
When you put it like that, Grad Student, it makes sense that a dedicated "Aqua chip" wouldn't work. I guess I never thought about it in that way before.

So it's more likely that Apple finally wrote Quartz to do less software-based rendering and send more of it to the GPU? Maybe that's the explanation for the 20x speed gains reported: Apple's engineers found a way to hardware-accelerate Aqua.

This would give the Raycer engineers something to do, too. If they worked on graphics systems in OS X, it would probably take a team of engineers just to write the drivers for hardware acceleration, and to maintain and improve the OpenGL implementation in OS X.

Thanks for setting me straight on this one.
post #21 of 74
Ah, just because (s)he made a couple spelling errors doesn't mean that the points argued and described aren't valid. And I, for one, never had any question about what (s)he was trying to say.

I would also like to say that the economic argument is a very refreshing and interesting point of view. Contrasted with baseless speculation and blind optimism, it gives us a better idea of what Apple will do (after all, economics tries to explain firm and consumer decisions based on what is in the said entity's best interest).

Also, I'd like to point out that we as AI board members should be attacking ideas, not people.

-Ender
If you find yourself sided with the majority, it is time to change your thinking.

-Mark Twain
post #22 of 74
About all this "a 20x speed-up could only come from hardware" stuff...
One of my CS profs once got a program's running time down from 11 days to 8 hours just by changing the order of two function calls. If a 33x speedup is possible from changing "funcA(); funcB();" to "funcB(); funcA();", then I'm pretty sure a 20x improvement from optimizations and/or improved algorithms is within the realm of possibility. I'm not saying that the 20x boost isn't from hardware, just that it doesn't have to be.
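[Editor's note: for a concrete (if contrived) taste of this, here is the classic order-of-magnitude win from a one-line structural change - swapping a linear scan for a hash lookup. Both functions are an illustration, not anything from OS X; they return identical results:]

```python
def slow_lookup(needles, haystack):
    # "x in haystack" rescans the whole list every time: O(n*m)
    return [x for x in needles if x in haystack]

def fast_lookup(needles, haystack):
    members = set(haystack)                        # one-time conversion to a hash set
    return [x for x in needles if x in members]    # each membership test is O(1)
```

Same answer, wildly different running time once the inputs get large - no new hardware required.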
post #23 of 74
GAAAAHHH!!!! You people just don't get it!
If a HARDWARE CHIP does z-sorting, not only does it OFFLOAD THESE FUNCTIONS FROM THE CPU (software), but it DRAMATICALLY REDUCES THE AMOUNT OF DATA that needs graphic transformations.

Useless info gets chucked out (i.e., occluded triangles in 3D, hidden data in 2D), and the data that remains gets hardware z-sorting for transparency!

Sure, the latest Macs run Aqua fine... in software. I checked out the 600MHz iBook and was impressed with its snappiness and clarity. But the point is you are using CPU cycles to do this... it's like playing Quake 1 (software rendering) vs. Quake III on a GeForce 3 Ti.

This isn't just 'hardware accel. for Aqua' - this is an advanced sorting algorithm implemented in hardware. It can be used for many things... like... oh, I don't know... 3D!!!! As in Maya! As in OpenGL, that mysterious API Apple included at THE CORE of OS X.
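[Editor's note: the occlusion idea reduces to a simple rule - walk the window list front to back and skip anything already covered. A deliberately naive sketch; real hardware culls per-pixel on triangles, not per-window on rectangles, so this only shows the principle:]

```python
def contains(a, b):
    """True if rect a = (x0, y0, x1, y1) fully covers rect b."""
    return a[0] <= b[0] and a[1] <= b[1] and a[2] >= b[2] and a[3] >= b[3]

def visible_windows(windows):
    """Front-to-back cull: drop any opaque window fully hidden behind
    a single window in front of it, so it is never drawn at all."""
    drawn, visible = [], []
    for rect in windows:              # frontmost window first
        if any(contains(d, rect) for d in drawn):
            continue                  # fully occluded: zero rendering work
        visible.append(rect)
        drawn.append(rect)
    return visible
```

The payoff is exactly what the post claims: occluded data costs nothing, because it never reaches the renderer.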
No, the bazaar cannot satisfy users. Neither can the cathedral. Nothing can satisfy users, because software is written to enable rather than satisfy, and because most users are mewling malcontents...
post #24 of 74
And, for the sake of clarity, this is NOT a graphics card. This is not a chip ON a graphics card. This is a co-processor. Like an FPU. Like a GPU. Like an integer unit, or an MMU. It is a z-sorting unit.

If a graphics card can produce 40 million triangles per second, and it is rendering ONLY the visible polygons, and offloading all this work from the CPU... surely you can see what this means. 3D graphics like NO OTHER PLATFORM.

The fact that this can also speed up Aqua, and other computations, is a swell side-effect.

Graphics workstations, people. And the ultimate gaming platform.
No, the bazaar cannot satisfy users. Neither can the cathedral. Nothing can satisfy users, because software is written to enable rather than satisfy, and because most users are mewling malcontents...
post #25 of 74
[quote]as far as I know, raycer never shipped a product, and in my book they have "demonstrated" nothing. They must have been worth something to Apple, and I presume it was for the engineers. <hr></blockquote>

This was like a big topic... a year ago. I've forgotten most of the specifics, but this is what I remember.

There was a lot of debate whether Apple acquired Raycer for their technology or their engineers. I know for a fact that one of Raycer's engineers went on to be the lead engineer for the next-generation iMac - this was actually on his website, along with his resume, and it was quite impressive. I believe most of the other engineers have already left Apple.

Even though Raycer never actually developed anything, they did hold a number of patents which Apple scored with their purchase. These patents are in regard to 3D rendering. When an object is rendered in 3D, it is made of tiny polygons. What Raycer focused on was a chip which would render only the polygons visible on the monitor, as opposed to the entire object, thus vastly reducing rendering times.

There is only one other patent which applies to this method of rendering 3D, and it was bought by someone like Intel a couple of months before Apple purchased Raycer. Whether Apple will ever actually use the technology is one thing; however, they stand to gain quite a bit if ATI or nVidia chose to use this technology via Apple's patent.

You also have to remember that this purchase happened at a time when relations were tense between Apple and ATI. Even though nVidia is now in the game, Apple probably doesn't feel like it can depend on either company a whole lot.

Mix this in with the fact that Mr. Jobs owns Pixar, as well as high-end 3D being an area Apple would very much like to penetrate, and I would certainly not count anything out in terms of Apple creating its own 3D acceleration chips, whether it would mean competing with 3D companies or not.
post #26 of 74
As for the probability of a new 2D graphics chip, I'm just as clueless as the next guy. However, as for the question of whether Apple (or anyone) can do a custom ASIC, I can comment on that.

Yes, ASICs are expensive, but companies large and small design them all the time... layout and fabrication are almost always handed over to the experts at IBM, Toshiba, LSI Logic, etc. The chips my company made all say "Marconi" or "Fore Systems" on them, but they were all manufactured by one of the above companies.

When I worked at Alcatel, we designed ASICs all the time... we had a whole department for it.

Alcatel may have been big, but it is totally within the reach of even a small company (like Fore Systems was a few years ago) to design their own ASICs. For Apple, it's a drop in the R&D bucket. All they need is a hardware description language like Verilog or VHDL in which to design the functionality and a contract with an ASIC fab to lay it out and build it.
-gEEk

"We're sorry; the number you have dialed is imaginary. Please rotate your phone 90 degrees and try your call again."
post #27 of 74
Besides which, although I am too lazy to type 'ASIC' into a search engine, aren't Pangea/UniNorth etc. custom chips designed by Apple?
No, the bazaar cannot satisfy users. Neither can the cathedral. Nothing can satisfy users, because software is written to enable rather than satisfy, and because most users are mewling malcontents...
post #28 of 74
[quote]Originally posted by stimuli:
<strong>Besides which, although I am too lazy to type 'ASIC' into a search engine, aren't Pangea/UniNorth etc. custom chips designed by Apple?</strong><hr></blockquote>

I believe so, stimuli... and an excellent example!

Also, I forgot to mention that custom ASICs actually become quite cheap if you buy a few hundred thousand of them... a couple of bucks each after the NRE is paid (Non-Recurring Engineering costs--typically a few hundred thousand $$)
-gEEk

"We're sorry; the number you have dialed is imaginary. Please rotate your phone 90 degrees and try your call again."
post #29 of 74
yes, Apple can afford to make an ASIC, and yes, they have done it before. The motherboard chipsets mentioned are examples of this, but those chips HAD to be made - there would be no way to build a Macintosh as we know it without Apple designing them. Equivalent chips are not available off the shelf. Also, an ASIC specifically for Quartz + the NVIDIA or ATI hardware would add a lot of expense to the resulting Macintosh. There are also the technical difficulties of making a separate 2D chip integrate with the 3D subsystem. My argument stands about separate 2D and 3D chips - even if this approach were possible, would it justify the ADDED expense to the resulting Mac? So... if Apple were to make the whole graphics subsystem (2D+3D - and they are capable...), that might make sense, but I doubt it for economic reasons, and because Apple is not well suited to compete with NVIDIA or ATI...
i freebase user interface
post #30 of 74
"custom ASICs actually become quite cheap if you buy a few hundred thousand of them.... couple of bucks each after the NRE is paid (Non-Recurring Engineering costs--typically a few hundred thousand $$)"

What you describe is indeed true about ASICs: as economies of scale (a phrase I seem to be using a lot lately...) increase, the cost per chip decreases. However, when you say a couple of bucks each after the NRE is paid, that is misleading... Intel has huge economies of scale, but do you actually think a P4 costs a couple of bucks each, even with the scale they have? Of course not. The cost per chip is (NRE + manufacturing cost per chip * # of chips) / # of chips. So, to say a couple of bucks is quite an oversimplification... for simple/small ASICs with big scale, sure, but in general it isn't that easy...
i freebase user interface
post #31 of 74
The Raycer thing sounds a lot like the wild speculation that was going around these boards 6 months ago...

Come to think of it, so does the G5 towers in January/Apollo in Powerbooks/iMacs stuff.

I still say we're getting G4 Apollo towers in January, but I'd love to be pleasantly surprised.
post #32 of 74
Why is it people think the new Mac will have a 400MHz bus? The only thing I know of that does is Intel, and that's only with RDRAM. Do you really expect to see Apple adopting Rambus?

I expect PC2100 DDR, which would be with a 266MHz bus.
post #33 of 74
Who says that this little GPU is trying to compete with nVidia/ATi?!?

Why couldn't this thing just be an augmentation to the G4/5 processor and the main video card?

Apple is not stupid enough to try to create their own video cards to compete with ATI/nVidia when they could just buy them from those companies like they are doing now.

I don't see this chip being upgraded often, either. It's not like you'll have to push out Apple-celerator 2 64MB DDR in the next year to compete - there's nothing to compete with if you're still using current nVidia hardware in addition.

Maybe Apple foresees that 3D processors may become the next floating-point add-ons for processors...


Apple is trying to add more exclusive benefits. They started with software, and are now moving on to hardware.
Seems pretty logical to me.
post #34 of 74
Stimuli, you've written the first posts in quite a long time to get me all hot and bothered. For once, I really am setting myself up for disappointment come January.

PS: would this sort of thing affect Photoshop use/speed?
post #35 of 74
A little bit of math.

Apple sells roughly 4 million Macs per year (all included).

So, if Apple were to build an ASIC which went into all 4 million of them, the cost of the ASIC would be:

Assume NRE of $1,000,000 for the sake of argument.
Assume the manufacturing cost per chip is $10.
Assume the number of chips produced is 4,000,000.

Then we get:
($1,000,000 + ($10)*4,000,000)/4,000,000
=
$10.25 each

You can play with these numbers all day- make your NRE costs $10m instead of $1m. Make your manufacturing cost $20 per chip. Drop the number of chips produced to 1m instead of 4m.

($10,000,000 + ($20)*1,000,000)/1,000,000
=
$30 each

The point is, in the kind of volumes that Apple can provide, their cost of producing custom ASICs is cheap compared to the total cost of each Mac sold. Even for the $799 iMac, a $30 chip represents less than 4% of the total system cost.
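[Editor's note: HOS's arithmetic fits in one line of code if you want to play with the assumptions yourself - all the figures above are the post's own guesses, not real Apple numbers:]

```python
def asic_unit_cost(nre, marginal_cost, volume):
    """Per-chip cost: one-time NRE amortized over the production run,
    plus the marginal manufacturing cost of each chip."""
    return (nre + marginal_cost * volume) / volume

cheap = asic_unit_cost(1_000_000, 10, 4_000_000)    # the post's first scenario: $10.25
pricey = asic_unit_cost(10_000_000, 20, 1_000_000)  # the pessimistic scenario: $30.00
```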

Just some things to think about,

-HOS
You are in a twisty maze of passages, all alike.
post #36 of 74
Thread Starter 
Now you guys got me thinking again!

I'm skeptical, but since I don't understand the technological limitations I have to go with what other people say.

IF Apple were able to implement such a chip, I believe this would be an amazingly good move. If Macs could free the CPU from GUI-related tasks, then OS X would suddenly scream. The cool thing about this hypothetical chip is that it would allow Apple to do things with their GUI that Wintels cannot.

This would also make Macs SEEM much faster than they really are. A person's feel for the speed of a computer is often based not on its actual speed, but on the speed and responsiveness of its GUI. If Macs had a separate chip accelerating the GUI, then suddenly even a 500 MHz iMac would seem as fast or faster than a 2 GHz Wintel dinosaur.

Or something. It's late and I'm running out of steam. Catch you all later..
post #37 of 74
There is another possibility that has been discussed before. Maybe the Aqua interface is not accelerated by a Raycer chip but simply by new nVidia hardware. But what about a Power Mac with an ATI card? In fact, the G5 could have a soldered nVidia unit built in. The enclosures are sealed perhaps because there isn't any graphics card in the AGP port... There is an integrated nVidia nForce chipset with DDR SDRAM on the motherboard, all parts connected through HyperTransport. The chip could be a GeForce3 Ti 200 or even the incoming NV17 (GeForce3 MX), though I think this one is more intended for the next LCD iMac.
post #38 of 74
Actually, with what stimuli said, a G5 could have an nForce chipset AND a Raycer coprocessor near or inside itself.
post #39 of 74
[quote]Originally posted by HOS:
<strong>A little bit of math.

Apple sells roughly 4 million Macs per year (all included).

So, if Apple were to build an ASIC which went into all 4 million of them, the cost of the ASIC would be:

[blah...blah...blah]

$30 each

[ta da! ...]

Just some things to think about,
</strong><hr></blockquote>

I always knew I was in the wrong industry.

In automotive, we buy micros for $1-2 a piece, and people like Moto laugh at us because we don't get the volumes like in the computer industry.

We sell entire control modules, like for airbags, for less than $30 a pop. And our quality control for these things is almost as good as M$'s or Apple's for their OS (no-- that part's a joke).

This nebulous graphics chip doesn't sound out of the ordinary in terms of manufacturing. If the design is done by Apple, so much the better for the ASIC supplier. If I was at Apple's purchasing dept., I would be shooting for the $5-10 range, while demanding $2.50. At least, that's what customers do to us.

BTW: grad student-- Apple also has the highest margins in the computer industry (>20%), slightly higher than the 0.01% we make in the automotive industry (that's also a joke, in more ways than one). That adds to consumer costs, and is chicken/egg with their R&D costs.

Sorry, wandering-- gotta get off caffeine.

~e
Ich bin a big jelly donut.
post #40 of 74