Originally Posted by JeffDM
I really don't know about computer board production, though I do design very simple circuit boards from time to time. How much of a custom card is it, really? Could it be that they just put on a larger firmware chip? If it's a simple variation of an existing board, I don't see that part being expensive, change a part in the pick&place machine mid-stream for a batch and be done it. The part I see being expensive is developing and maintaining said firmware that's only used on one computer that might not get GPU upgrades. It doesn't help that there's no competition for Mac boards either, so the maker can charge what the market will bear.
As someone who has designed complex multilayer cards, I can say that modern graphics boards are fairly complex. While most cards are designed using computer programs, a good deal of human intervention is required. Also, it's rare that the first iteration, or even the second, is usable. Each new board must be designed from scratch. Even changing a half dozen traces can force a total redesign.
I agree with Onlooker, and yourself here on the number of boards sold. I've also been saying that the very small number is what prevents others from bothering to get into the market. EFI, though, has nothing to do with it.
If enough boards could be sold, then ATI, at least, would be in the Mac market, as they had been for years after every other manufacturer had abandoned it.
What we know of Mac Pro sales, as of the last estimate a year ago, is that Apple was selling no more than about 125,000 machines a quarter.
With most people buying the standard, least expensive card, that left Apple to sell very few higher grade cards.
What manufacturer would be crazy enough to try to enter that tiny market?
I'm willing to bet that Apple doesn't sell more than about 100,000 higher end cards this year for the Mac Pros.
Even if ATI, or another manufacturer, did enter the market at a lower price than the $350 for the 8800 GT, many people would still buy Apple's card, because it's easy to do so, and compared to what they're paying for the machine as a whole, perhaps including a new monitor, the $200 upgrade price isn't much to care about.
As someone from a company that used to buy Macs, I can attest that for most companies, having the entire machine warranted by Apple, rather than having to deal with different companies when something goes wrong, is often worth paying a bit more money up front.
Companies hate buying a machine with a card, then replacing that card with a third-party card. First, the money spent on the original card is wasted. Second, if you have a problem, the first thing you'll be told is to put the original card back. If the card is bad, that's fine.
But, too often, there is some subtle problem that can only be understood, or resolved, with the bought card in place.
The same thing is true for third party memory. You'd be surprised at how many companies will buy machines with Apple's memory, BECAUSE it is Apple's memory, warranted along with the entire machine. Apple is responsible.
I know the post sort of got off track, but I just wanted to illustrate why some of these things happen.
If Apple followed my idea of putting a graphics card slot in the top line 24" iMac, there might be several times as many card-accepting machines sold each quarter: enough for Apple to lower its card prices, and enough room for others to enter the market.