Originally Posted by wizard69
I still have to wonder if NVidia would even be remotely interested in doing custom cards for Apple.
I've seen mention of those drivers.
PNY put out an aftermarket Quadro 4000 for the Mac Pro. Early drivers were absolutely terrible and caused kernel panics; supposedly the later ones were good. It outperformed other cards in some applications, but it varied. OpenCL 1.1 support was added later in spite of it being an aftermarket card. The hardware would have been identical to the Windows version, with the difference coming down to driver development. A lot of people have mentioned generic NVidia cards receiving some level of support as of Mountain Lion, even if they won't display boot screens; I haven't tried it. The EVGA 680 for Mac is basically the same hardware again. I don't see any reason why NVidia would be opposed to doing a custom order for Apple. If it's a pricing thing, they would have little reason to undercut their Quadro-branded offerings, given how much they depend on margins there. NVidia does have the Titan at $1000 with 6GB of RAM. As I mentioned, framebuffer size is a big deal with these cards if they're leveraged for computation, since they can't rely on virtual memory. The problem either fits and runs or it doesn't.
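To illustrate that last point, here's a minimal sketch of the "does it fit" check, assuming Python with the third-party pyopencl bindings (the 5 GB working set is just a made-up example size, not anything from a real workload):

```python
# Sketch: a GPU compute job either fits in the card's memory or it doesn't,
# since the card can't page out to disk the way system RAM can.
# Assumes the third-party pyopencl package is installed and an OpenCL
# platform is present; problem_bytes is a hypothetical working-set size.
import pyopencl as cl

problem_bytes = 5 * 1024**3  # hypothetical 5 GB working set

for platform in cl.get_platforms():
    for dev in platform.get_devices():
        vram = dev.global_mem_size  # total on-card memory in bytes
        verdict = "fits" if problem_bytes <= vram else "does NOT fit"
        print(f"{dev.name.strip()}: {vram / 1024**3:.1f} GB of device memory, "
              f"{problem_bytes / 1024**3:.1f} GB job {verdict}")
```

A 6GB Titan or a large-framebuffer FirePro passes that check for problems a 2GB or 3GB gaming card simply can't hold.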
This whole thing with so-called Pro GPUs can be confusing, as often the only thing pro about the cards is some firmware tweaks and different drivers. Admittedly, though, pro cards do offer some advantages these days, such as ECC RAM. Obviously both AMD and Apple will have to agree on how these cards are branded.
That is very true. Even on Windows, gaming cards sometimes do work; you just have to be careful, as some are terrible in certain applications. The 5870 had a ton of Windows bugs in a lot of Autodesk apps, and it was never officially supported or even recommended on their compatibility list. That list marks cards as certified or recommended and notes any caveats found in their testing. Some of the other gaming cards were fine. The 5870 did okay in OSX for its time, but I wouldn't want to buy one today, especially not at the same price. It's normal to expect a faster card with an update, as software tends to become more aggressive with its features when the people using it are running newer hardware. I definitely think it's silly to interpret everything based on the current market specs of hardware that has been available at its current pricing for at least a year. When I looked up the top 12-core Mac Pro option, the CPU + GPU costs were still significantly below what it would cost using the parts everyone seems to be projecting, at least as long as the stratospheric pricing on those GPUs remains. If low volume is the problem everyone suggests with the current machine, I don't think they're going to push the price even higher unless they're sure they have a market for it somewhere. If anything it needs to be significantly more competitive at the sub-$3k configuration if it's going to maintain any kind of volume. That is especially true with the possible need to invest in additional DAS solutions when moving to the new one.
NVidia owns a lot of the workstation market, and that market is always rumored to make up only a small portion of AMD's overall revenue; I couldn't find any breakdown in their quarterly report, though. If you look at what AMD markets as its low-end FirePro cards, they're based on chips of the 5870 era. I do think it could be partly a branding thing here, but still focused on computation and professional graphics. The choice of branding may just signify the emphasis, even if in terms of hardware we're looking at a Radeon with a larger framebuffer.
By the way, I did read somewhere that memory speed was a big factor in bringing about heterogeneous computing. I suspect they want something that can maintain lockstep with the GPU portion of the APU.
I'm still wondering about a custom spin to meld well with advanced TB displays of greater than 4K resolution. I could see them tweaking the output ports for this and giving the chip a custom part number. A workstation for the 4K market would ideally be able to drive larger than 4K displays. We have heard little about new displays from Apple even though they are obviously needed. Maybe it is wishful thinking, but I see a big splash coming with a complete system from Apple. In other words, the Mac Pro is released with a new display option and a disk array option. The timing seems right for this to happen.
You would need larger than 4K if Apple wanted to maintain the current size of their displays and apply the same doubling at the 27" level; that would bring you to 5120 x 2880. As I've mentioned, Thunderbolt was really designed with portable devices in mind more than anything else. I don't particularly care for it for a number of reasons: mainly that it doesn't easily shoehorn into everything, it's often more expensive at equivalent performance levels, and it's tied to Intel rather than something like PCI, which has its own group to set standards. I also hate serialized chains of equipment. The availability of 6 ports is nice, as it at least alleviates the issue of devices that don't play well with others on the chain, and of placement concerns. Displays have to go at the end of the chain, so with the often proposed solution of placing storage in another room, the storage couldn't go on the same port as a display. DisplayPort displays supposedly cannot go on the end of a Thunderbolt chain either, although DisplayPort 1.2 support would allow for daisy-chaining displays as long as the full standard is supported. The Wikipedia article on DisplayPort is actually very well sourced. DisplayPort 1.2 was finalized in December 2009 and increased bandwidth to 17.28 Gb/s; raw bandwidth is actually higher, but there is some overhead. A lot of its functionality overlaps with Thunderbolt in terms of chaining displays and implementing USB hubs.
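As a rough sanity check on those numbers, here's a quick back-of-the-envelope calculation (assuming 24-bit color at 60 Hz and ignoring blanking and protocol overhead, so real requirements run a bit higher):

```python
# Back-of-the-envelope: uncompressed video bandwidth vs. DisplayPort 1.2's
# 17.28 Gb/s effective payload. Assumes 24 bits per pixel at 60 Hz and
# ignores blanking/protocol overhead.
DP12_GBPS = 17.28

def gbps(width, height, bpp=24, hz=60):
    return width * height * bpp * hz / 1e9

for name, w, h in [("3840x2160 (4K UHD)", 3840, 2160),
                   ("5120x2880 (doubled 27-inch)", 5120, 2880)]:
    need = gbps(w, h)
    verdict = "within" if need <= DP12_GBPS else "exceeds"
    print(f"{name}: ~{need:.1f} Gb/s, {verdict} DP 1.2")
```

A 4K stream works out to roughly 12 Gb/s, which already overwhelms a single 10 Gb/s Thunderbolt channel, while 5120 x 2880 lands around 21 Gb/s and overshoots even DP 1.2's 17.28 Gb/s payload rate.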
I don't really care for closed standards in most cases. With NVidia, as I mentioned, the difference is that they do a lot of extra development work that some of these developers then simply leverage in their own software; that is apparently how Adobe obtained their CUDA-based ray tracer. It's basically the same thing Apple does, where OSX and certain software packages are tied to Macs: they add value to Macs and none to any machine that isn't running Apple hardware. TB 2.0 introduces channel bonding just to get to 4K. I'm not entirely confident you'll see anything beyond that anytime soon, especially given the cost of custom development for what isn't one of the higher volume products (large displays). I was surprised just to see the amount of custom work on the iMac. It was less surprising on portables, given their numbers.
The desktop display market in general has been pretty conservative for years. The manufacturers that made, or at least designed, extremely high quality panels, such as Hitachi, have all pulled out of it; they may still be making specialized equipment. Speaking of Hitachi, they were the ones that pioneered IPS in the late 90s. In spite of its flaws, IPS seems to outship Samsung's PVA. PVA was nice in some ways; it just had gamma shift issues at some viewing angles. I will say that some of the non-LG branded panels had far fewer flaws, but basically everything uses LG now. That is why I kind of facedesk whenever someone says Apple should go elsewhere. Unless the elusive Sharp displays start to show up in large quantities, what are they going to use? Even in 2003-2009, when others were available, Apple wasn't one of the adopters, probably due to cost and possibly the required quantity.