Originally Posted by hmm
Yes I've gathered that. I've followed both for a very long time. NVidia overall has been really aggressive in expanding their market, as you pointed out. I don't see that as a bad thing. I think it was only logical that they tuned things for their own hardware. They wanted to squeeze as much performance and stability as possible out of it. I see no motivation for such a company to do the major R&D work to get an open standard off the ground. The professional cards don't sell in the same volumes, but the gaming cards tend to absorb a lot of the development costs.
I just view it as: if your software relies on CUDA, you probably stick to the machines and model years that ship with NVidia GPUs when purchasing, unless the software changes to accommodate similar functionality with OpenCL and AMD drivers.
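To make that dependency concrete, here's a minimal sketch of what "relies on CUDA" tends to mean in practice. The pycuda/pyopencl wrappers are just stand-ins for illustration, not anything a particular package actually uses; the point is only that the CUDA path exists solely on NVidia hardware, so an AMD-only machine drops you to whatever fallback the software has.

```python
# Hedged sketch: why a CUDA dependency ties purchasing to NVidia hardware.
# pycuda/pyopencl are assumed installed purely for illustration.
def pick_gpu_backend():
    try:
        import pycuda.driver as cuda  # CUDA: NVidia-only
        cuda.init()
        if cuda.Device.count() > 0:
            return "cuda"
    except Exception:
        pass  # no pycuda, or no NVidia driver/hardware present
    try:
        import pyopencl as cl  # OpenCL: NVidia, AMD, and Intel all ship drivers
        if cl.get_platforms():
            return "opencl"
    except Exception:
        pass
    return "cpu"  # slow path: whatever the application can do without a GPU

print(pick_gpu_backend())
```

If the software never grew that OpenCL branch, the "cpu" fallback is all an AMD-equipped Mac would get, which is the purchasing constraint described above.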
The problem with the flip-flopping is that it is hard on customers and even developers. Further, it looks like Apple does this for some incalculable reason, to avoid a long-term linkage to any one company. In a nutshell, the flip-flops don't appear to be performance based, so I really don't know why we get a switch almost every other year.
The good thing for the low-end guys is that this problem is effectively gone. They will have Intel integrated GPUs and that is about it. Unless, of course, Apple for some reason implements an AMD chip with integrated graphics.
I get why you're saying that. My point was more that on a typical 3-4 year replacement cycle, if the software supports CUDA today, it's unlikely to dry up within that time. If you're a developer, it's entirely different.
It won't dry up, but if in three years your choices are all AMD-based then you are screwed. Speaking of which, I don't believe that NVidia has a choice with respect to high-performance GPU cards and aggressive marketing into specialty industries. The market for all of those low-cost GPUs will dry up slowly until the high end is about all you have left as a GPU manufacturer. Due to this I don't see NVidia's high-end chips getting cheaper, but rather the opposite. Limited markets and low prices won't lead to NVidia being around long.
Sometimes I see the possibility of AMD trying to crush NVidia with this partnership with Apple. However, even AMD will have issues selling discrete GPUs and thus will have to try to maintain pricing on high-end cards. This whole issue with respect to Mac Pro pricing is very perplexing.
I have mentioned that. It's annoying to me, and I'm not entirely sure whether it's an issue of growing pains, sourcing staff, or whatever else. Apple certainly has the money to hire people, but it's rarely a simple issue when it comes to bringing something that has fallen behind back up to speed.
It is looking like Mavericks irons a lot of this out. I'd be embarrassed if I had to try to sell Apple hardware into a technical market as there is no software "backup" for that hardware. The state of drivers has been laughable.
You mention recent AMD support here. We still haven't seen that on a shipping product in the wild. The demo looked cool, and I can appreciate the demanding software they ran on it. I'm still interested in the overall results when it comes to a broader set of applications and real-world testing, as well as the updated alignment of price to performance. It loses some of the prior functionality that I mentioned. I liked extra bays that don't have to go through additional hardware layers: no need to deal with host cards unless you want something like SAS out, and no concerns about glitchy port multiplier firmware. From a practical standpoint, it's likely to be less customizable than prior models. I was going to throw in a 4K output reference, but I noticed it's supported by some of the aftermarket cards for the current model.
Until my heart is crushed by outlandish pricing, I'm pretty bullish on this new Mac Pro. It really could be a corporate success story given Microsoft's fast decline.
As to the GPUs, if you can dig through AMD's terrible web site you should be able to find some technical details on the FirePros. The data rates the cards are capable of are limited by DisplayPort; AMD specifically says the chips can handle data rates much higher than the DisplayPort standard allows. So I'm wondering what AMD and Apple have been up to the last couple of years. I'm thinking a new video monitor with more than 4K resolution. Apple really has released few details about the video capabilities of the machine.
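To put some rough numbers behind that speculation, here's a back-of-the-envelope sketch. The figures are my own assumptions (DisplayPort 1.2 effective bandwidth and a ~20% blanking allowance), not anything AMD or Apple have published:

```python
# Rough estimate: does a given resolution/refresh fit within DisplayPort 1.2?
DP12_EFFECTIVE_GBPS = 17.28  # DP 1.2: 4 lanes x 5.4 Gbit/s, minus 8b/10b coding overhead

def pixel_rate_gbps(width, height, refresh_hz, bits_per_pixel=24, blanking_overhead=1.2):
    """Approximate video data rate in Gbit/s, padding ~20% for blanking intervals."""
    return width * height * refresh_hz * bits_per_pixel * blanking_overhead / 1e9

for name, (w, h) in {"4K UHD": (3840, 2160), "5K (hypothetical)": (5120, 2880)}.items():
    rate = pixel_rate_gbps(w, h, 60)
    verdict = "fits" if rate <= DP12_EFFECTIVE_GBPS else "exceeds"
    print(f"{name} @ 60 Hz ~ {rate:.1f} Gbit/s -> {verdict} DP 1.2 ({DP12_EFFECTIVE_GBPS} Gbit/s)")
```

If those numbers are roughly right, 4K at 60 Hz squeezes into a single DP 1.2 link, while anything past 4K would need multiple streams or something beyond the DP 1.2 spec, which would line up with AMD's claim that the silicon can push more than the port standard carries.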
I wasn't so much talking about maximum power. The W5000s are based on a low-end card, lower than the 5770s. The drivers and possibly the firmware are obviously different, and the RAM might be ECC. It's not that great of a card overall, in spite of its price and marketing.
This got me thinking: would AMD and Apple go to all of this trouble and then develop drivers for old architectures? It really doesn't make sense now, which makes me wonder if AMD might debut new FirePro chips about the time this machine ships. As much as Apple has announced this machine, they really haven't said much about it.
I guess it depends on where these machines start. I am fully expecting some price updates from AMD by the time these machines ship. It isn't realistic to me to assume that a Mac Pro is going to cram in $6k worth of GPUs. That would be a higher hardware cost than they have ever addressed. I'm sure there's a market, just not a sustainable one, as configurations that cost that much are mostly available through CTO options from other OEMs. Even the W9000s are still at launch pricing currently. AMD and NVidia both tend to adjust prices mid-cycle on longer cycles rather than rebrand workstation GPUs with adjusted clock rates, and workstation cards show up after gaming cards. That means these have to last at least another year, but I don't think they'll stay at $3000 retail that entire time.
I would expect AMD and Apple to stay with the GCN-based cards to minimize driver issues. This gives them a number of performance levels to choose from. The problem with these "cards" is that actual cards won't be in Apple's machine, so it is very difficult to judge pricing. There is an intrinsic value in the parts on the card plus a profit margin; the problem is I'm not certain what those numbers actually are. I have a hard time believing that these $3000 cards have even $1500 worth of parts, if that. I could see the cost to Apple being close to that of an equivalent desktop card.
Having used it for a bit now, I like Python's structure, although the way they update it is weird: a lot of stuff is still on Python 2, with elements of Python 3 backported in the last couple of releases.
I really like Python, it fits into my minor role when it comes to programming. You are right about the Python 3 transition, it could have been done better.
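For anyone who hasn't run into it, that backporting mostly shows up as opt-in `__future__` imports: Python 2.6/2.7 let you pull Python 3 behaviour into Python 2 code, which is part of why the split has dragged on. A minimal example:

```python
# Python 2.7 module that opts into several Python 3 behaviours.
# Without these imports, print is a statement and 1/2 == 0 on Python 2.
from __future__ import print_function, division, unicode_literals

print(1 / 2)        # 0.5 (true division, as in Python 3; would be 0 otherwise)
print("text")       # print is a function, and the literal is unicode by default
```

The same file runs unchanged on Python 3, where the `__future__` imports are simply no-ops.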
I know absolutely nothing about Fortran, so there's no way I could put together a valid response on that.
Actually I'm not big on it myself, but I do know that there is a great deal of science and engineering code built upon it. Of course the smarter types have moved on to C++ or even better languages. I just see offering a Fortran compiler as a low-friction way to bring more technical people into the fold.