
Apple throws out the rulebook for its unique next-gen Mac Pro - Page 13

post #481 of 1290
Quote:
As to software support I think you are right in that regard. People blame Apple's slide in the professional market on the Mac Pro, but I'm going out on a limb here to say that there is a lot more to it than that. Apple's tendency to neglect important elements of the pie, in this case OpenGL, tends to undermine acceptance.

 

OpenGL.

 

I'm glad Apple finally seems to be taking the API they support seriously with Mavericks.

 

4.1?  Full support.  4.2 in play...and 4.3 pending?

 

Whispers from around the net hint at graphical/system response being much faster.

 

Hopefully this will mean GPU performance vs. PC GPU performance will finally see parity?

 

Not 50-100% slower (which always struck me as ridiculous... to say that Apple supported the only equivalent standard to DirectX on the Mac side...)

 

Do they only have one guy doing their GL drivers?

 

With OpenGL 4.1, Mavericks, the new Mac Pro...and the new Pro apps from 3rd parties...

 

...far from the much feared death of the Pro...we see a renaissance?

 

Lemon Bon Bon.

You know, for a company that specializes in the video-graphics market, you'd think that they would offer top-of-the-line GPUs...

 

WITH THE NEW MAC PRO THEY FINALLY DID!  (But you bend over for it.)

post #482 of 1290
Quote:
Originally Posted by wizard69 View Post



Frankly I'm not sure what NVidia's GPUs brought to the table; I wasn't exactly happy to see NVidia return to Mac hardware. The performance and power profile is so close to AMD's hardware that it makes you wonder why Apple troubled themselves.

Yes I've gathered that. I've followed both for a very long time. NVidia overall has been really aggressive in expanding their market as you pointed out. I don't see that as a bad thing. I think it was only logical that they tuned things for their own hardware. They wanted to squeeze as much performance and stability as possible out of it. I see no motivation for such a company to do the major R&D work to get an open standard off the ground. They don't sell as many, but the gaming cards tend to absorb a lot of development costs.

 

Quote:
This is true, but from a customer perspective the constant flip flopping isn't cool!

I just view it as: if your software relies on CUDA, you probably stick to the NVidia years when purchasing new machines, unless the software changes to accommodate similar functionality with OpenCL and AMD drivers.

 

 

Quote:
Nobody is denying this. AMD has suffered from a tools issue in the past though that has been under continual improvement. From a developers perspective though it is far better to use OpenCL on NVidia hardware than CUDA simply because you aren't married to the hardware.

I get why you're saying that. My point was more that on a typical 3-4 year replacement cycle, if the software supports CUDA today, it's unlikely to dry up within that time. If you're a developer, it's entirely different.
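(To ground the portability point in the quote above: with OpenCL, one kernel source runs on either vendor's hardware. Here's a minimal sketch in Python, assuming the third-party pyopencl bindings; the `vec_add` kernel name is purely illustrative.)

```python
import numpy as np
import pyopencl as cl

# Works on whatever OpenCL device is present: NVidia, AMD, or Intel.
ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

a = np.random.rand(100000).astype(np.float32)
b = np.random.rand(100000).astype(np.float32)

mf = cl.mem_flags
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# Plain OpenCL C, no vendor-specific extensions.
prg = cl.Program(ctx, """
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out)
{
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
""").build()

prg.vec_add(queue, a.shape, None, a_buf, b_buf, out_buf)

result = np.empty_like(a)
cl.enqueue_copy(queue, result, out_buf)
assert np.allclose(result, a + b)
```

The equivalent CUDA kernel would look nearly identical line for line, but it would only build against NVidia's toolchain and only launch on NVidia silicon; that asymmetry is the whole argument.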

 

Quote:
Apparently neither AMD nor NVidia are to blame for Apple's lack of attention with respect to OpenGL or OpenCL. I'm not sure if flaky is the word here; neglectful is probably a better term. Especially in the case of OpenGL they have really smudged their image with the technical community. When open-source Linux solutions do better or are more complete, you have issues that are pretty severe for a company of Apple's size.
Probably because NVidia pursued that market very aggressively. The other thing here is that it is only recently, with its Southern Islands GPU cores, that AMD really had the ability to even support GPU compute well.
I'm still expecting good, better and best. It is hard to tell what the specs posted represent in that triptych.

 

I have mentioned that. It's annoying to me, and I'm not entirely sure whether it's an issue of growing pains, sourcing staff, or whatever else. Apple certainly has the money to hire people, but it's rarely a simple issue when it comes to bringing something that has fallen behind back up to speed. You mention recent AMD support here. We still haven't seen that on a shipping product in the wild. The demo looked cool, and I can appreciate the demanding software they ran on it. I'm still interested in the overall results across a broader set of applications and real-world testing, as well as how price aligns to performance. It loses some of the prior functionality I mentioned. I liked internal bays that don't add extra hardware layers: no host cards unless you want something like SAS out, and no worries about glitchy port-multiplier firmware. From a practical standpoint, it's likely to be less customizable than prior models. I was going to throw in a 4K output reference, but I noticed it's supported by some of the aftermarket cards for the current model.

 

 

Quote:

 

However I still don't buy this idea that only maximum power is of interest in the marketplace. For example, dual W5000s may not interest you that much, but I can see many an engineer being very happy to have one on their desk. I'm still trying to fathom all of the "up to" qualifiers in Apple's postings. For example, do they intend to sell the same GPU with different RAM configurations, or offer choices of different GPUs?

 

 

I wasn't so much saying maximum power. The W5000s are based on a low-end card, lower than the 5770s. The drivers and possibly firmware are obviously different. The RAM might be ECC. It's not that great of a card overall, in spite of its price and marketing. I guess it depends on where these machines start. I am fully expecting some price updates from AMD by the time these machines ship. It isn't realistic to me to assume that a Mac Pro is going to cram in $6k worth of GPUs. That would be a higher hardware cost than they have ever addressed. I'm sure there's a market, just not a sustainable one, as configurations that cost that much are mostly available through CTO options from other OEMs. Even the W9000s are still at launch pricing currently. AMD and NVidia both tend to adjust workstation pricing mid-cycle on longer cycles rather than rebrand workstation GPUs with adjusted clock rates, and workstation cards show up after gaming cards. That means these have to last at least another year, but I don't think they'll stay at $3000 retail that entire time.

 

Quote:

As to software support I think you are right in that regard. People blame Apple's slide in the professional market on the Mac Pro, but I'm going out on a limb here to say that there is a lot more to it than that. Apple's tendency to neglect important elements of the pie, in this case OpenGL, tends to undermine acceptance. The other thing is that Apple does a great job with XCode and the three C's, but rapidly drops support intensity for other languages important to professionals. Sure they include things like Python with their OS, but include is a far cry from optimized support. It almost seems like Python is looked down upon at Apple. However good scripting languages are important to a very wide range of professionals, so Apple needs to get on the ball, so to speak. Frankly, support for Fortran wouldn't hurt either. Yes it is old and crusty, but Fortran support could ease a lot of porting jobs. Obviously we are talking UNIX here so it isn't all that bad; I just think Apple needs to make something like Python a first-class Mac solution. It is just another cog in the machine for professionals, but an important one.

 

Having used it for a bit now, I like Python's structure, although the way they update it is weird: a lot of stuff is still on Python 2, with elements of Python 3 backported in the last couple of releases. I know absolutely nothing about Fortran, so there's no way I could put together a valid response on that.
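(For what it's worth, the backporting described above is what lets one script straddle both interpreters; a trivial sketch that behaves identically under Python 2.7 and Python 3:)

```python
# Opt in to Python 3 semantics while still running on Python 2.7.
from __future__ import print_function, division, unicode_literals

print(7 / 2)   # true division: 3.5 on both, not 3 as classic Python 2 gives
print(7 // 2)  # floor division when the old behaviour is actually wanted: 3
```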

post #483 of 1290
Quote:
Originally Posted by hmm View Post

If we're talking about the equivalent of a pair of W5000s and a quad Xeon, it won't be very interesting.

You'd have said that about pretty much anything they made though. Even if they'd brought out a standard tower with dual Titan GPUs and the possibility of dual CPUs, they'd have higher margins, so you'd have said you could get the same power cheaper and that the enclosure wasn't innovative; it likely would have had issues fitting Thunderbolt in, so you'd have said it was disappointing. There's almost nothing they could do that would impress you.
Quote:
Originally Posted by hmm 
The W5000s are based on a low end card, lower than the 5770s.

Which card? As you've seen, it performs at double a 7770 and 5870:

http://www.tomshardware.com/reviews/workstation-graphics-card-gaming,3425-13.html

For an entry card, that's decent enough performance, especially when there's two of them. To go from a single 5870 in the max BTO config to 4x a 5870 in the entry config in 1 year (technically 3 years) would be alright.
post #484 of 1290
Quote:
Originally Posted by nht View Post

What is true for TODAY and TOMORROW is that CUDA support in pro-apps is better and more complete.  For example in Premiere Pro Adobe supports both CUDA and OpenCL but only accelerates ray tracing in CUDA.

 

Quote:
Originally Posted by wizard69 View Post

That is Adobe's problem not OpenCL's.

 

If that were all there is to it I'd actually enjoy a certain amount of schadenfreude over Adobe suffering, but the truth is that it's not THEIR problem, it's their USERS' problem.

 

Most of us don't get to choose what editing/compositing software we use and/or can't conveniently or cost-effectively switch just because one supports a certain standard and another doesn't.

 

By developing competing systems like this and then forcing users into a situation where they can't even choose one or the other, Apple, AMD, NVidia and Adobe are all screwing the very people they seek to woo.

post #485 of 1290
Quote:
Originally Posted by Marvin View Post


You'd have said that about pretty much anything they made though. Even if they'd brought out a standard tower with dual Titan GPUs and the possibility of dual CPUs, they'd have higher margins, so you'd have said you could get the same power cheaper and that the enclosure wasn't innovative; it likely would have had issues fitting Thunderbolt in, so you'd have said it was disappointing. There's almost nothing they could do that would impress you.
Which card? As you've seen, it performs at double a 7770 and 5870:
 

It gets more interesting around the W7000 level. There are two cards, but there's no reason to expect linear scaling. Much of the time I'm writing down suspicions based on what data is available. As I've said it will be more interesting to see what comes from a shipping unit. I think they only did the preview due to how long the old one has gone without a true update. I also wonder if we'll see additional scalable Thunderbolt storage solutions beyond what we have today. I think part of the reason they're lacking is volume. Most boxes are marketed at both Windows and Mac customers. There aren't many Windows customers with Thunderbolt. There are probably none at the workstation level. If I come across any I'll mention them. I don't think Apple mentioned whether USB 3 would be included. It's not on that chipset, but I suspect Apple would test USB 3 chips for the Thunderbolt displays too. I don't know that it's so much about impressing. It's more about what I would buy to alleviate current "barely getting the job done" problems.

 

 

Quote:

http://www.tomshardware.com/reviews/workstation-graphics-card-gaming,3425-13.html

For an entry card, that's decent enough performance, especially when there's two of them. To go from a single 5870 in the max BTO config to 4x a 5870 in the entry config in 1 year (technically 3 years) would be alright.

 

Huh? Your link doesn't suggest that. The W5000 is just over the 5870 in DirectX. It should be higher in OpenGL. I recall the 5870 being a poor OpenGL performer under Windows. I would expect it to be beyond the 680MX in today's iMac outside of gaming, in OpenGL and double-precision math. It's not so much that double precision means a lot to everyone, so much as it is that the gaming-card drivers usually choke there rather than on single precision. Your math is taking the card performance and doubling it for two cards, right? Otherwise it wouldn't make sense with the provided data. This is another thing where I want to see what the final implementation is like.

post #486 of 1290
Quote:
Originally Posted by Lemon Bon Bon. View Post

OpenGL.

I'm glad Apple finally seems to be taking the API they support seriously with Mavericks.
I'm glad to see them doing something here. As I've said, this has more to do with the perception of being a viable pro platform than many of Apple's hardware solutions.
Quote:
4.1?  Full support.  4.2 in play...and 4.3 pending?

Whispers from around the net hint at graphical/system response being much faster.
I've been hearing this too. Mavericks in general appears to be a performance overhaul.
Quote:
Hopefully this will mean GPU performance vs. PC GPU performance will finally see parity?
Well I'd prefer solid and glitch-free to parity with PC drivers.
Quote:
Not 50-100% slower (which always struck me as ridiculous... to say that Apple supported the only equivalent standard to DirectX on the Mac side...)
By 100% I assume you mean all of the missing goodies.
Quote:
Do they only have one guy doing their GL drivers?
Good question. For a company with billions in the bank and a reputation of being a platform for graphics you would think OpenGL would be a very high priority feature.
Quote:
With OpenGL 4.1, Mavericks, the new Mac Pro...and the new Pro apps from 3rd parties...

...far from the much feared death of the Pro...we see a renaissance?

Lemon Bon Bon.

Yes I see good things ahead for this platform. The new Mac Pro will hopefully be able to fill more roles than ever before and ideally attract a broader array of customers.

The biggest issue is that all of these massive changes will take a while to iron out. I just hope the teething pains aren't that bad.
post #487 of 1290
Quote:
Originally Posted by hmm View Post

Yes I've gathered that. I've followed both for a very long time. NVidia overall has been really aggressive in expanding their market as you pointed out. I don't see that as a bad thing. I think it was only logical that they tuned things for their own hardware. They wanted to squeeze as much performance and stability as possible out of it. I see no motivation for such a company to do the major R&D work to get an open standard off the ground. They don't sell as many, but the gaming cards tend to absorb a lot of development costs.

I just view it as if your software relies on CUDA, you probably stick to those years when purchasing new machines unless the software changes to accommodate similar functionality with OpenCL and AMD drivers.
The problem with the flip-flopping is that it is hard on customers and even developers. Further, it looks like Apple does this for some incalculable reason, to avoid a long linkage to any one company. In a nutshell, the flops don't appear to be performance-based, so I really don't know why we get a flop almost every other year.

The good thing for the low end guys is that this problem is effectively gone. They will have Intel integrated GPUs and that is about it. Unless of course for some reason they implement an AMD chip with integrated graphics.
Quote:

I get why you're saying that. My point was more that on a typical 3-4 year replacement cycle, if the software supports CUDA today, it's unlikely to dry up within that time. If you're a developer, it's entirely different.
It won't dry up, but if in three years your choices are all AMD-based then you are screwed. Speaking of which, I don't believe that NVidia has a choice with respect to high-performance GPU cards and aggressive marketing into specialty industries. The market for all of those low-cost GPUs will dry up slowly until the high end is about all you have as a GPU manufacturer. Due to this I don't see NVidia's high-end chips getting cheaper, but rather the opposite. Limited markets and low prices won't lead to NVidia being around long.

Sometimes I see the possibility of AMD trying to crush NVidia with this partnership with Apple. However, even AMD will have issues selling discrete GPUs and thus will have to try to maintain pricing on high-end cards. This whole issue with respect to Mac Pro pricing is very perplexing.
Quote:

I have mentioned that. It's annoying to me, and I'm not entirely sure whether it's an issue of growing pains, sourcing staff, or whatever else. Apple certainly has the money to hire people, but it's rarely a simple issue when it comes to bringing something that has fallen behind back up to speed.
It is looking like Mavericks irons a lot of this out. I'd be embarrassed if I had to try to sell Apple hardware into a technical market as there is no software "backup" for that hardware. The state of drivers has been laughable.
Quote:
You mention recent AMD support here. We still haven't seen that on a shipping product in the wild. The demo looked cool, and I can appreciate the demanding software they ran on it. I'm still interested in the overall results when it comes to a broader set of applications and real world testing as well as the updated alignment of price to performance. It loses some of the prior functionality that I mentioned. I liked extra bays, that don't have to deal with additional hardware layers. There's no need to deal with host cards unless you want something like SAS out and no concerns about glitch port multiplier firmware . From a practical standpoint, it's likely to be less customizable than prior models. I was going to throw in a 4k output reference, but I noticed it's supported by some of the after market cards for the current model.
Until my heart is crushed by outlandish pricing, I'm pretty bullish on this new Mac Pro. It really could be a corporate success story given Microsoft's fast decline.

As to the GPUs, if you can dig through AMD's terrible web site you should be able to find some technical details on the FirePros. The data rates the cards are capable of are limited by DisplayPort; AMD specifically says they can handle data rates much higher than the DisplayPort standard. So I'm wondering what AMD and Apple have been up to the last couple of years. I'm thinking a new video monitor with more than 4K resolution. Apple really has few details about the video capabilities of the machine.
Quote:


I wasn't so much saying maximum power. The W5000s are based on a low end card, lower than the 5770s. The drivers and possibly firmware are obviously different. Ram might be ECC. It's not that great of a card overall, in spite of its price and marketing.
This got me thinking: would AMD and Apple go to all of this trouble and then develop drivers for old architectures? It really doesn't make sense now. Which makes me wonder if AMD might debut new FirePro chips about the time this machine ships. As much as Apple has announced this machine, they really haven't said much about it.
Quote:
I guess it depends on where these machines start. I am fully expecting some price updates from AMD by the time these machines ship. It isn't realistic to me to assume that a mac pro is going to cram $6k worth of gpus. That would be a higher hardware cost than they have ever addressed. I'm sure there's a market, just not a sustainable one, as configurations that cost that much are mostly available through cto options from other oems. Even on the W9000s, they're still at launch pricing currently. AMD and NVidia both tend to adjust mid cycle on longer cycles rather than rebrand workstation gpus with adjusted clock rates, and workstation cards show up after gaming cards. That means these have to last at least another year, but I don't think they'll stay at $3000 retail that entire time.
I would expect AMD and Apple to stay with the GCN-based cards to minimize driver issues. This gives them a number of performance levels to choose from. The problem with these "cards" is that actual cards won't be in Apple's machine, so it is very difficult to judge pricing. There is an intrinsic value in the parts on the card, plus a profit. The problem is I'm not certain what those numbers actually are. I have a hard time believing that these $3000 cards have even $1500 worth of parts, if that. I could see the cost to Apple being close to that of an equivalent desktop card.
Quote:
Having used it for a bit now, I like Python's structure, although the way they update it is weird where you have a lot of stuff still on Python 2, and elements of Python 3 backported in the last couple releases.
I really like Python, it fits into my minor role when it comes to programming. You are right about the Python 3 transition, it could have been done better.
Quote:
I know absolutely nothing about Fortran, so there's no way I could put together a valid response on that.
Actually I'm not big on it but rather know that there is a great deal of science and engineering code built upon it. Of course the smarter types have moved on to C++ or even better languages. I just see offering a Fortran compiler as a low traction way to bring more technical people into the fold.
post #488 of 1290
Quote:
Originally Posted by v5v View Post


If that were all there is to it I'd actually enjoy a certain amount of schadenfeude over Adobe suffering, but the truth is that it's not THEIR problem, it's their USERS' problem.
Well it would be Adobe's problem if users rebelled a bit. Adobe has made more than a few bad business decisions of late, and yet they have not been significantly harmed by their users. That is actually pretty odd.
Quote:
Most of us don't get to choose what editing/compositing software we use and/or can't conveniently or cost-effectively switch just because one supports a certain standard and another doesn't.
Believe me, I understand this to an extent. However, at least you have options that compete with various Adobe products.
Quote:
By developing competing systems like this and then forcing users into a situation where they can't even choose one or the other, Apple, AMD, NVidia and Adobe are all screwing the very people they seek to woo.

Again, it is a problem that the users need to place back on the developers. All that is needed is a concerted effort to make sure that they hear about the different issues associated with their policies. If no one stands up and says "hey Adobe, your subscription plan sucks" or "we want OpenCL support," Adobe won't do much to change.
post #489 of 1290
Quote:
Originally Posted by wizard69 View Post

Well I'd prefer solid and glitch-free to parity with PC drivers.

Me too. Just last night I had to reboot my PC twice due to the video driver bringing down the system while gaming. All the optimizations on the PC are nice, but they come in the form of tricky code, which is harder to get all the kinks out of.

 

BTW, has anyone tried any triple-A titles on Mavericks (e.g. Batman, Deus Ex on the Mac App Store)? Better frame rates?

post #490 of 1290
Quote:
Originally Posted by wizard69 View Post



Again, it is a problem that the users need to place back on the developers. All that is needed is a concerted effort to make sure that they hear about the different issues associated with their policies. If no one stands up and says "hey Adobe, your subscription plan sucks" or "we want OpenCL support," Adobe won't do much to change.

I think they went with whatever was available and stable at the time of development. If I remember correctly, the raytracer in After Effects uses technology that is published by NVidia. NVidia also owns the remnants of Mental Images and just a lot of graphics and rendering related technology. I don't think it was a move they made for no reason. Adobe has added OpenCL support in some applications, and they will probably continue. As to the subscription thing, it's still cheaper than Autodesk or the Foundry. Both charge an initial license fee + maintenance. Maintenance alone can be quite expensive depending on the application. I would argue that Adobe's pricing isn't that high compared to some of the others. It's not that I like the subscription thing. It's more expensive for me. I'm just saying they aren't anywhere near the worst. Apple tends to sink the cost of software as it's often used more to market hardware than as a primary source of profits. That is a major distinction. Apple also doesn't make as many updates to their more specialized software products.

 

Quote:
Originally Posted by wizard69 View Post


The problem with the flip-flopping is that it is hard on customers and even developers. Further, it looks like Apple does this for some incalculable reason, to avoid a long linkage to any one company. In a nutshell, the flops don't appear to be performance-based, so I really don't know why we get a flop almost every other year.

 

I'm not sure. They seem to alternate quite a bit. I don't really have much insight to add.

 

Quote:

The good thing for the low end guys is that this problem is effectively gone. They will have Intel integrated GPUs and that is about it. Unless of course for some reason they implement an AMD chip with integrated graphics.
It won't dry up, but if in three years your choices are all AMD-based then you are screwed. Speaking of which, I don't believe that NVidia has a choice with respect to high-performance GPU cards and aggressive marketing into specialty industries. The market for all of those low-cost GPUs will dry up slowly until the high end is about all you have as a GPU manufacturer. Due to this I don't see NVidia's high-end chips getting cheaper, but rather the opposite. Limited markets and low prices won't lead to NVidia being around long.

 

Well, NVidia does support OpenCL on their current cards. I would have to look up the state of OpenCL 1.2 support, but I'm pretty sure it's in place for currently supported cards (a quick way to check is sketched below). On the OSX end, a lot of developers are drifting toward OpenCL over the longer term, but I wouldn't feel uncomfortable being set up with something that could use CUDA today. It's not a huge deal to me: while I use that functionality in a couple of programs, I don't use it that frequently on my own machine. I have a CS6 license, so I have Premiere. I don't use it anywhere near daily. I will probably transition to their subscription model eventually, but there's nothing pressing at the moment. If anything I'm likely to spend less time with it a year from now. Treating licenses as perpetual works better on Windows anyway; with OSX something will break it, as it changes more frequently. Things that annoy me with Adobe are more like last-minute change notices. Late in CS5, they announced CS6 upgrade eligibility would cut off at CS5 rather than the old rule of three versions back. It's stupid to announce such a change nearing the end of an upgrade cycle. They backed off considerably, as I anticipated, but it was still a stupid thing to do.
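(On the OpenCL 1.2 question, checking what a driver actually reports is straightforward; a sketch in Python, assuming the third-party pyopencl bindings are installed:)

```python
import pyopencl as cl

# Each device advertises the OpenCL version its driver supports,
# e.g. "OpenCL 1.2 ..." on a current AMD or NVidia stack.
for platform in cl.get_platforms():
    for device in platform.get_devices():
        print(device.name, "->", device.version)
```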

 

 

 

Quote:
Sometimes I see the possibility of AMD trying to crush NVidia with this partnership with Apple. However, even AMD will have issues selling discrete GPUs and thus will have to try to maintain pricing on high-end cards. This whole issue with respect to Mac Pro pricing is very perplexing.
It is looking like Mavericks irons a lot of this out. I'd be embarrassed if I had to try to sell Apple hardware into a technical market as there is no software "backup" for that hardware. The state of drivers has been laughable.

 

Statements like that are why I think you really hate NVidia. They have their long-term bets. What you're describing is something I mentioned before: the gaming cards essentially subsidize Tesla and Quadro cards. Selling $100 gaming cards means terrible margins, but if they ship enough they can absorb fab costs. I suspect they have some other plan for that going forward. I agree with you on drivers. I see something like dual GPUs optimized for computation as something that would appeal to scientific markets. Heavier workstation sales have been somewhat flat depending on what quarter you examine, but if you take something that used to require cluster time and move it down to a single-user device that requires minimal IT support, that should move units. Obviously I'm talking about the low end of that market and people who merely use time on such a cluster, not tasks where an entire cluster is dedicated.

 

Quote:
Until my heart is crushed by outlandish pricing, I'm pretty bullish on this new Mac Pro. It really could be a corporate success story given Microsoft's fast decline.

Microsoft just keeps missing every turn. Prior to Apple in the phone market, they should have been worried about Blackberry. In both cases they showed up late without something truly interesting. Look at how much money they had to throw at the Xbox just to catch up to Sony. If Sony hadn't weakened overall, I'm not sure MS would have gained as much leverage. They make a lot of stuff that works well enough, and they aren't really doing that badly. It's just that they miss potential markets and throw products out just to have a presence. Even with that, Windows doesn't bother me as much as it does some of you guys.


 

 

Quote:
As to the GPUs, if you can dig through AMD's terrible web site you should be able to find some technical details on the FirePros. The data rates the cards are capable of are limited by DisplayPort; AMD specifically says they can handle data rates much higher than the DisplayPort standard. So I'm wondering what AMD and Apple have been up to the last couple of years. I'm thinking a new video monitor with more than 4K resolution. Apple really has few details about the video capabilities of the machine.
This got me thinking: would AMD and Apple go to all of this trouble and then develop drivers for old architectures? It really doesn't make sense now. Which makes me wonder if AMD might debut new FirePro chips about the time this machine ships. As much as Apple has announced this machine, they really haven't said much about it.

I am not sure. They don't seem to have anything significant coming out in terms of chip architecture that I can find. I haven't followed too closely. I don't see anything wrong with 4K; it's a lot of resolution. As to developing drivers, AMD 7000-series drivers have shown up in prior Mountain Lion builds. AI and MR both published articles on that something like a year ago. The low-level drivers may have been in place for some time.

 

 

Quote:

I would expect AMD and Apple to stay with the GCN-based cards to minimize driver issues. This gives them a number of performance levels to choose from. The problem with these "cards" is that actual cards won't be in Apple's machine, so it is very difficult to judge pricing. There is an intrinsic value in the parts on the card, plus a profit. The problem is I'm not certain what those numbers actually are. I have a hard time believing that these $3000 cards have even $1500 worth of parts, if that. I could see the cost to Apple being close to that of an equivalent desktop card.

Much of the time they're extremely similar in hardware when compared to the gaming cards. I would need to compare specs, but the 7970 should be close to the W9000 in terms of raw specs. Workstation cards aren't always faster in every way. They're generally more stable in OpenGL apps. They often hold up better in double-precision calculations. At the high end you can find them with more RAM. NVidia debuted iray back in 2009 or 2010. At that time a Tesla card was the only way you could possibly render a scene that contained more than a couple of objects, especially if it contained any "hero" geometry, as in maximum level of detail and high-resolution mapped textures. I suspect the RAM is a big thing with other types of computation too.

 

Quote:

I really like Python, it fits into my minor role when it comes to programming. You are right about the Python 3 transition, it could have been done better.
Actually I'm not big on it but rather know that there is a great deal of science and engineering code built upon it. Of course the smarter types have moved on to C++ or even better languages. I just see offering a Fortran compiler as a low traction way to bring more technical people into the fold.

 

 

Yeah, I found out about certain changes in mutability and some other things when scripts wouldn't run. Obviously that filled me with rage. I do most of it in a text editor. I only use an IDE if I want something like pop-up flags, and even then I usually use the embedded ones in certain software, as they auto-load software-specific semantics. I wish I could comment on Fortran, but I try to avoid being a Wikipedia scholar.

post #491 of 1290
Quote:
Originally Posted by hmm View Post

 

Microsoft just keeps missing every turn. Prior to Apple in the phone market, they should have been worried about Blackberry. In both cases they showed up late without something truly interesting. Look at how much money they had to throw at the Xbox just to catch up to Sony. If Sony hadn't weakened overall, I'm not sure MS would have gained as much leverage. They make a lot of stuff that works well enough, and they aren't really doing that badly. It's just that they miss potential markets and throw products out just to have a presence. Even with that, Windows doesn't bother me as much as it does some of you guys.

They thought the PC was going to migrate into the living room, but it migrated to the phone. A lot of money spent building a competitor in the TV gaming space for nothing (strategically speaking).

post #492 of 1290
Quote:
Originally Posted by hmm View Post

As to the subscription thing, it's still cheaper than Autodesk or the Foundry. Both charge an initial license fee + maintenance. Maintenance alone can be quite expensive depending on the application. I would argue that Adobe's pricing isn't that high compared to some of the others. It's not that I like the subscription thing. It's more expensive for me.

 

But if you do not pay maintenance, the app still works. Maintenance covers upgrades & customer support for each year. For The Foundry, you are basically buying the app again every 5 years, cost-wise. And if you are using the apps to make a living, then that is just the cost of doing business (and a tax deduction).

Late 2009 Unibody MacBook (modified)
2.26GHz Core 2 Duo CPU/8GB RAM/60GB SSD/500GB HDD
SuperDrive delete
post #493 of 1290
Quote:
Originally Posted by ascii View Post

They thought the PC was going to migrate into the living room, but it migrated to the phone. A lot of money spent building a competitor in the TV gaming space for nothing (strategically speaking).

I never personally thought that. I mean, what they did with the Xbox isn't bad. It should have been obvious things would go to the phone as far back as the Handspring Treo; Microsoft even had earlier phone products of its own. They just didn't pursue that area hard enough or utilize it to its potential.

 

Quote:
Originally Posted by MacRonin View Post

 

But if you do not pay maintenance, the app still works. Maintenance covers upgrades & customer support for each year. For The Foundry, you are basically buying the app again every 5 years, cost-wise. And if you are using the apps to make a living, then that is just the cost of doing business (and a tax deduction).

 

I do know what it covers. You can deduct the cost of Adobe's rental model too without amortization. I really don't like what Adobe did, but I think the complaints are still somewhat overstated. If adoption lags too much, they will offer better rates.

post #494 of 1290
Quote:
Originally Posted by hmm View Post

There are two cards, but there's no reason to expect linear scaling.

There is for some tasks but not all tasks. Metro 2033 scales nearly perfectly across two GPUs:

http://www.hardware.fr/articles/848-22/amd-radeon-hd-7970-crossfirex-test-28nm-gcn.html

as do some compute tasks:

http://www.tomshardware.com/reviews/geforce-gtx-690-benchmark,3193-12.html
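(For an embarrassingly parallel compute task, near-linear scaling really is just a matter of splitting the work across the devices. A rough sketch of the pattern in Python with the pyopencl bindings, assuming two GPUs show up on one platform; the `square` kernel is purely illustrative:)

```python
import numpy as np
import pyopencl as cl

# Assume one platform exposes two GPUs; one command queue per device.
gpus = cl.get_platforms()[0].get_devices(device_type=cl.device_type.GPU)
ctx = cl.Context(gpus)
queues = [cl.CommandQueue(ctx, device=g) for g in gpus]

data = np.random.rand(2000000).astype(np.float32)
halves = np.array_split(data, len(gpus))

prg = cl.Program(ctx, """
__kernel void square(__global float *x)
{
    int gid = get_global_id(0);
    x[gid] = x[gid] * x[gid];
}
""").build()

mf = cl.mem_flags
pending = []
for q, chunk in zip(queues, halves):
    buf = cl.Buffer(ctx, mf.READ_WRITE | mf.COPY_HOST_PTR, hostbuf=chunk)
    prg.square(q, chunk.shape, None, buf)  # each GPU works on its half
    pending.append((q, buf, np.empty_like(chunk)))

for q, buf, out in pending:
    cl.enqueue_copy(q, out, buf)  # in-order queue: waits for the kernel
    q.finish()
```

Tasks that need heavy communication between the halves (frame-to-frame dependencies in some games, reductions in some compute work) are exactly the ones that fall short of 2x.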
Quote:
Originally Posted by hmm View Post

I don't think Apple mentioned whether USB 3 would be included. It's not on that chipset, but I suspect Apple would test USB 3 chips for the Thunderbolt displays too.

It says USB 3 on the page:

http://www.apple.com/mac-pro/
Quote:
Originally Posted by hmm View Post

Huh? Your link doesn't suggest that. The W5000 is just over the 5870 in DirectX.

That's on the low preset, which won't be using advanced shaders. If you look at the high preset, the 5870 is 10FPS and the W5000 is 22. In Metro 2033, the 5870 is 5.5, the W5000 is 15. In some of the less intensive games they come out fairly even, though, and I see the cumulative performance puts the W5000 only slightly ahead of the 5870, so Apple's setup would be closer to 2x 5870. Still quite a long way from being below a 5770, which is half the performance of a 5870.

You can also see that the cumulative performance is around half of a 7970 or 680 so for things that do scale linearly, Apple's entry Mac Pro would come with that equivalent performance, which isn't really bad at all.
post #495 of 1290
Quote:
Originally Posted by wizard69 View Post

Well it would be Adobe's problem if users rebelled a bit. Adobe has made more than a few bad business decisions of late, and yet they have not been significantly harmed by their users. That is actually pretty odd.

 

Once an app becomes an "industry standard" it's really difficult for someone who makes their living using it to switch to something else. I'm not at all surprised that people have not jumped ship.

 

 


Quote:
Originally Posted by wizard69 View Post

at least you have options that compete with various Adobe products.

 

I could be wrong, but I don't think the alternatives are viable. Based on my admittedly cursory examination, other titles seem to be either not intended for pro users, lacking the depth of features such work requires, or specialty titles used mostly by Hollywood that cost as much as the computer that runs them.

 

My complaint isn't really with any of the specific players anyway. Like so many others in so many situations, I'm just frustrated by the lack of a single, universal standard upon which all can agree.

 

If I'm correctly understanding what I'm reading in this thread, NVidia and AMD each use different approaches to hardware acceleration and certain software titles are optimized for one or the other. Since the software I use (After Effects) uses the NVidia approach, and Apple doesn't offer NVidia hardware, my options are to change software, which is impractical, or change computer supplier, which means using Windows instead of OSX. Neither is a particularly desirable solution.

 

If it's true that Adobe's existing support for CUDA over OpenCL/GL is simply a matter of what made sense at the time and will soon evolve to include the latter, then the issue would seem to be moot.

post #496 of 1290
Quote:
Originally Posted by v5v View Post

Once an app becomes an "industry standard" it's really difficult for someone who makes their living using it to switch to something else. I'm not at all surprised that people have not jumped ship.
"Industry standard" is often used to express the concept that we don't want to try anything new. I've seen this in automation hardware and software, and also in various IT groups that refuse to modernize anything.
Quote:


I could be wrong, but I don't think the alternatives are viable. Based on my admittedly cursory examination, other titles seem to be either not intended for pro users, lacking the depth of features such work requires, or specialty titles used mostly by Hollywood that cost as much as the computer that runs them.
Well, if you need a specific feature you are pretty much screwed. However, if you don't, then you do have a lot of alternatives to choose from.
Quote:
My complaint isn't really with any of the specific players anyway. Like so many others in so many situations, I'm just frustrated by the lack of a single, universal standard upon which all can agree.
Actually, the last thing you want in software is universal standards. Microsoft Office is an example of this, both good and bad. The problem with universal standards is that they often become bloated with options to try to please everybody. Sometimes this isn't too bad; Adobe so far hasn't completely ruined their apps with bloat. MS, on the other hand, has damaged Office pretty badly trying to be all things to all people.

In a nutshell, universal standards lead to stagnation and a lack of innovation.
Quote:
If I'm correctly understanding what I'm reading in this thread, NVidia and AMD each use different approaches to hardware acceleration and certain software titles are optimized for one or the other. Since the software I use (After Effects) uses the NVidia approach, and Apple doesn't offer NVidia hardware, my options are to change software, which is impractical, or change computer supplier, which means using Windows instead of OSX. Neither is a particularly desirable solution.
You should always make your desires known to the likes of Adobe and other software developers. In simplest terms, no one wants to be forced to use the wrong hardware. However, don't expect miracles; even if Adobe went all in with OpenCL acceleration, it could take years to completely port everything involved.
Quote:
If it's true that Adobe's existing support for CUDA over OpenCL/GL is simply a matter of what made sense at the time and will soon evolve to include the latter, then the issue would seem to be moot.
As far as I know Adobe hasn't publicly expressed their intentions with respect to OpenCL. However, from a practical standpoint it would be far easier to move to OpenCL to allow for broader hardware coverage. This isn't just an AMD/NVidia thing; tablets and cell phones run completely different hardware. So if Adobe wants to see their software running on alternative platforms, then they need to have as much portable code as possible. In this regard, I'm expecting OpenCL-compliant hardware in mobile devices and support for that hardware software-wise.

By the way, Adobe is in a position where they can pick and choose which platforms to run on. The big difference here is the leverage smaller developers get by going with OpenCL. OpenCL would save such developers a lot of time when it comes to supporting alternative architectures.
post #497 of 1290
Quote:
Originally Posted by Marvin View Post


There is for some tasks but not all tasks. Metro 2033 scales nearly perfectly across two GPUs:

http://www.hardware.fr/articles/848-22/amd-radeon-hd-7970-crossfirex-test-28nm-gcn.html

as do some compute tasks:

http://www.tomshardware.com/reviews/geforce-gtx-690-benchmark,3193-12.html

Thanks. I was unaware of that.

 

Quote:

 

It says USB 3 on the page:

http://www.apple.com/mac-pro/

Again, I missed that. I'm not aware of any native support from Intel there, but they can probably reuse some of that work for the Thunderbolt display updates.

 

 

Quote:
That's on the low preset, which won't be using advanced shaders. If you look at the high preset, the 5870 is 10FPS and the W5000 is 22. In Metro 2033, the 5870 is 5.5, the W5000 is 15. In some of the less intensive games they come out fairly even, though, and I see the cumulative performance puts the W5000 only slightly ahead of the 5870, so Apple's setup would be closer to 2x 5870. Still quite a long way from being below a 5770, which is half the performance of a 5870.

 

I just looked through each chart. Keep in mind I hate using gaming benchmarks. Typically the prime function of these things is high-precision OpenGL used in 3D apps, engineering applications, and some scientific work. That aside, the difference you speak of is mainly seen on ultra at 2560x1440. High at 1080p shows a much lower percentage of deviation. I suspect that it's a 1GB 5870 at those settings. Some of the higher NVidia cards use 1.5GB in typical configurations. Looking at it in context, it's still way below where even the iMac would be on that chart. The 680MX should be around a 660 desktop card, maybe a bit higher or lower. It's not going to be an extreme deviation. Likewise the 7970 is close to double. Assuming strong scaling it would be around the same with 2x W5000s at a significantly higher cost, again assuming PC-side pricing provides a good reference. The best comparison I could find was on Tom's Hardware. They tested enough different scenarios. As you can see it's all over the place, so in the end it depends on software support. Apple really needs to provide this support in the form of libraries rather than rely on developers to adjust for whatever cards they choose that year. That always leads to delayed certification when hardware and software releases are poorly aligned. On OSX I'm not sure you'll see as much driver differentiation. The biggest thing in using workstation cards there for computation is the sheer amount of memory. At the top levels you can't get that on a Radeon card. It's the same reason NVidia packed the Teslas with RAM. In many cases Fermi gaming cards did well with computation, assuming the problem was small enough to fit within the card's memory.

 

http://www.tomshardware.com/reviews/firepro-w8000-w9000-benchmark,3265.html

 

 

From the last page of that review

 

 

Quote:

A direct comparison between AMD’s FirePro W-series and Nvidia’s Quadro family is difficult. One somehow manages to shine when the other runs into major problems. The tasks these cards are built to perform are varied and complex, leaving us to look at benchmarks for specific tasks, which keep us from generalizing about performance overall.

TSMC's 28 nm process technology, AMD's GCN architecture, and the hardware-accelerated features yet to be enabled by ISVs are all very promising. However, AMD's drivers don't necessarily seem mature enough to gauge what these cards will do in the weeks and months to follow. But potential is there, and once AMD’s software team is done closing some of the gaps seen in today's testing, the new FirePro cards could become well-priced alternatives to Nvidia's more established Quadro line-up. We’re also looking forward to the lower-end offerings in AMD’s FirePro W-series, and we’ll try to get our hands on them as soon as they're available.

If you flip through it, you'll see each card ran into issues. A couple wouldn't run on whatever hardware. I'm just not big on comparing to an obviously outdated card today. The iMac pulled ahead of the 5870 with the 680MX. The 5870 came out in late 2009. It's not quite 4 years old at this point, but I still consider it a poor point of reference. At this point I would expect better performance or something more cost-effective. Given that the overall design doesn't accommodate a lot of extra storage internally or anything in the way of aftermarket parts, I would expect them to provide good value in terms of performance even on the entry to mid-level models. In my opinion the 4,1 and 5,1 were very poor values overall. They could have done better, and it doesn't come down solely to specs. There's a lot of discussion of better framework tuning and support in Mavericks, and that is part of providing a viable platform.

 

Quote:

You can also see that the cumulative performance is around half of a 7970 or 680 so for things that do scale linearly, Apple's entry Mac Pro would come with that equivalent performance, which isn't really bad at all.


Yeah, I noticed that. I guess the extra RAM would be good. I try to compare against Windows benchmarks, but it can be difficult as OSX versions can be all over the place relative to Windows and even Linux.

post #498 of 1290
Quote:
Originally Posted by hmm View Post

Thanks. I was unaware of that.

Again, I missed that. I'm not aware of any native support from Intel there, but they can probably reuse some of that work for the Thunderbolt display updates.
Most likely Intel will be launching a new chipset around the time this hardware ships. The "motherboard" card is rather small, so I can't imagine Apple wasting space on PCIe-to-USB 3 bridge chips. Another possibility is that they used some glue logic to link a desktop northbridge. Then again, Intel and Apple could have partnered on a custom Intel northbridge.

I would lean towards Intel shipping new hardware myself. A custom chip for Apple seems to be unlikely. Further, the Xeon environment could really use an update that supports USB 3 generally.
Quote:


I just looked through each chart. Keep in mind I hate using gaming benchmarks. Typically the prime function of these things is high-precision OpenGL used in 3D apps, engineering applications, and some scientific work. That aside, the difference you speak of is mainly seen on ultra at 2560x1440. High at 1080p shows a much lower percentage of deviation. I suspect that it's a 1GB 5870 at those settings. Some of the higher NVidia cards use 1.5GB in typical configurations. Looking at it in context, it's still way below where even the iMac would be on that chart. The 680MX should be around a 660 desktop card, maybe a bit higher or lower. It's not going to be an extreme deviation. Likewise the 7970 is close to double. Assuming strong scaling it would be around the same with 2x W5000s at a significantly higher cost, again assuming PC-side pricing provides a good reference. The best comparison I could find was on Tom's Hardware. They tested enough different scenarios. As you can see it's all over the place, so in the end it depends on software support. Apple really needs to provide this support in the form of libraries rather than rely on developers to adjust for whatever cards they choose that year.
Apple is a very strong advocate for the use of libraries (Accelerate). They strongly recommend this as opposed to using the vector capability of a processor directly. The problem is I don't believe that Accelerate makes use of OpenCL or GPUs in other ways. It would be interesting to see Accelerate transparently use GPUs. However, the problem with GPU compute is that the programmer often has to know about the GPU's capabilities to really benefit from it.

So the question is: is it even possible to write a worthwhile generic library for a GPU? We have certainly seen such for high-level languages like Python, but how much do you lose performance-wise?
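(Such generic libraries do exist; pyopencl's array module, for one, layers NumPy-style semantics over GPU buffers. A minimal sketch, assuming the pyopencl bindings are installed:)

```python
import numpy as np
import pyopencl as cl
import pyopencl.array as cl_array

ctx = cl.create_some_context()
queue = cl.CommandQueue(ctx)

a = cl_array.to_device(queue, np.random.rand(1000000).astype(np.float32))
b = cl_array.to_device(queue, np.random.rand(1000000).astype(np.float32))

# Reads like NumPy, but each elementwise op launches a GPU kernel.
c = (2 * a + b).get()  # .get() copies the result back to the host
```

The performance cost is roughly what's suspected above: each elementwise operation is its own kernel launch with its own temporaries, where a hand-written kernel would fuse the whole expression into a single pass over the data.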
Quote:
That always leads to delayed certification when hardware and software releases are poorly aligned. On OSX I'm not sure you'll see as much driver differentiation. The biggest thing in using workstation cards there for computation is the sheer amount of memory. At the top levels you can't get that on a Radeon card. It's the same reason NVidia packed the Teslas with RAM. In many cases Fermi gaming cards did well with computation, assuming the problem was small enough to fit within the card's memory.
RAM and GPU compute is a difficult subject to get a handle on. I wouldn't be surprised to find out that one reason Apple went with AMD is that they seem to be of the same mind as to how GPU compute should work in the future: that elusive goal of heterogeneous computing. I'm not sure how AMD will support this on cards sitting on the PCI Express bus, though.

Depending upon the app, though, simply moving data in and out of RAM on the GPU card becomes a bottleneck. Also, as you note, GPU RAM can put limits on what the GPU can realistically do. The Mac Pro throws another wrench into the works by spreading GPU RAM across two cards.

It will be very interesting to see how this all works out. I wouldn't be surprised to find out that getting drivers up to snuff has delayed this new Mac Pro more than hardware issues.
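(The transfer bottleneck mentioned above is easy to put a number on; OpenCL's event profiling timestamps the copy itself. A sketch with the pyopencl bindings, assuming a profiling-enabled queue:)

```python
import numpy as np
import pyopencl as cl

ctx = cl.create_some_context()
queue = cl.CommandQueue(
    ctx, properties=cl.command_queue_properties.PROFILING_ENABLE)

host = np.random.rand(50000000).astype(np.float32)  # ~200MB of data
buf = cl.Buffer(ctx, cl.mem_flags.READ_WRITE, host.nbytes)

evt = cl.enqueue_copy(queue, buf, host)  # host -> GPU over PCIe
evt.wait()

# Driver-side nanosecond timestamps; for simple elementwise work the
# copies in and out often cost more than the kernel itself.
print((evt.profile.end - evt.profile.start) * 1e-6, "ms to copy in")
```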
Quote:



From the last page of that review


If you flip through it, you'll see each card ran into issues. A couple wouldn't run on whatever hardware. I'm just not big on comparing to an obviously outdated card today.
This is why I tend to reject the rantings of people that pop up one URL to declare their GPU the absolute best. The reality is that the unique architectures involved mean that results will be all over the map. A programmer really only has two choices: one is to live with what you have; the other is to test on all platforms and go with what works best.

The problem with the second approach is that there are far too many rapidly changing variables to contend with. Driver updates, compiler updates, library updates, and updates to the physical architectures all impact how a GPU will perform with your code. This is why I see all the posts clamoring for NVidia GPUs as unwarranted. Things change rapidly in this industry; for example, NVidia used to suck at double precision.

By the way, if you are stuck on NVidia because of software being tied to it, you need to either buy the specific hardware or complain loudly to NVidia. From the standpoint of a developer it isn't smart to be tied so closely to hardware anyway.
Quote:
The iMac pulled ahead of the 5870 with the 680MX. The 5870 came out in late 2009. It's not quite 4 years old at this point, but I still consider it a poor point of reference. At this point I would expect better performance or something more cost-effective. Given that the overall design doesn't accommodate a lot of extra storage internally or anything in the way of aftermarket parts, I would expect them to provide good value in terms of performance even on the entry to mid-level models. In my opinion the 4,1 and 5,1 were very poor values overall. They could have done better, and it doesn't come down solely to specs. There's a lot of discussion of better framework tuning and support in Mavericks, and that is part of providing a viable platform.
Mavericks seems to be a bigger update than Apple lets on. Maybe they want to surprise everybody.
Quote:

Yeah, I noticed that. I guess the extra RAM would be good. I try to compare against Windows benchmarks, but it can be difficult as OSX versions can be all over the place relative to Windows and even Linux.

OSX and GPU support have suffered for a long time from neglect. I'm really hoping that Mavericks lives up to its billing as various web posts imply.
post #499 of 1290
Quote:
Originally Posted by wizard69 View Post


Most likely Intel will be launching a new chipset around the time this hardware ships. The "motherboard" card is rather small, so I can't imagine Apple wasting space on PCIe-to-USB 3 bridge chips. Another possibility is that they used some glue logic to link a desktop northbridge. Then again, Intel and Apple could have partnered on a custom Intel northbridge.

New computers still have a real northbridge?


 

Quote:

 

 

I would lean towards Intel shipping new hardware myself. A custom chip for Apple seems to be unlikely. Further, the Xeon environment could really use an update that supports USB 3 generally.

Every other OEM shipped it as a discrete component. My point was that they would have to test third-party ones anyway if an updated Thunderbolt display is to include it.

 

D'oh, the forum chopped my post again and I'm not rewriting all of that. That was the one time I forgot to copy before submitting.

post #500 of 1290
I still wonder if the Mac mini (or even the Apple TV) will get this design look, obviously still smaller.
post #501 of 1290
Quote:
Originally Posted by Curtis Hannah View Post

I still wonder if the Mac mini (or even the Apple TV) will get this design look, obviously still smaller.

There's no reason to. The Mac Pro uses a triangular heatsink because it has to cool 3 high-power chips (about 650W combined) and house a very large power supply. The Mini has 1 low-power chip (45W) and a very small power supply. The Apple TV is passively cooled (~2W) so it doesn't even need a fan.

If the Mini changes to PCIe SSD for one of the drives, I could see it getting shorter but there's no reason for it to be cylindrical.
post #502 of 1290
Quote:
Quote:

Yeah, I noticed that. I guess the extra RAM would be good. I try to compare against Windows benchmarks, but it can be difficult as OSX versions can be all over the place relative to Windows and even Linux.

OSX and GPU support have suffered for a long time from neglect. I'm really hoping that Mavericks lives up to its billing as various web posts imply.

 

Neglect.  Wasn't it Jobs who once said that Apple lost sight of its crown jewels once upon a time?

 

I hope Mavericks reverses this apparent trend.  GPU support and OpenGL have seemed weak to me over the years compared to Windows GPUs and drivers, be they GL or DirectX.  Certainly a well-founded perception of 'lagging' GL development shared by many, many commentators and critics...

 

Lemon Bon Bon.

post #503 of 1290

Mavericks looks to be shaping up to be a 'Snow Leopard' style release?

 

Lemon Bon Bon.

post #504 of 1290
Quote:
Originally Posted by Lemon Bon Bon. View Post

Mavericks looks to be shaping up to be a 'Snow Leopard' style release?

 

Lemon Bon Bon.


That would be great. Snow Leopard had its problems early on, but it shaped up nicely. Apart from the slipping OpenGL support that really started prior to that, it was probably the best overall release of OSX. It's not practical to stay with it forever, simply because that locks you in time in terms of features and framework updates. I definitely tend to stay longer on really stable releases whenever possible, just not typically more than a couple of versions back.

post #505 of 1290
Quote:
Originally Posted by Marvin View Post

If the Mini changes to PCIe SSD for one of the drives, I could see it getting shorter but there's no reason for it to be cylindrical.

 

My fear is that the mini will adopt the form factor of the new Airports. Ick.

post #506 of 1290

The MM will always stay the same shape because people are accustomed to this form already.
 

post #507 of 1290
Quote:
Originally Posted by marvfox View Post

The MM will always stay the same shape because people are accustomed to this form already.

 

Wishful thinking. Another generation or two of process shrinks and the Mini itself will shrink again. It is almost a certainty.
post #508 of 1290
Strange because the latest release really outclasses everything before it. I really like Mountain Lion. It works well on my old hardware.
Quote:
Originally Posted by hmm View Post


That would be great. Snow Leopard had its problems early on, but it shaped up nicely. Apart from the slipping OpenGL support that really started prior to that, it was probably the best overall release of OSX. It's not practical to stay with it forever, simply because that locks you in time in terms of features and framework updates. I definitely tend to stay longer on really stable releases whenever possible, just not typically more than a couple of versions back.
post #509 of 1290
Quote:
Originally Posted by wizard69 View Post

Strange because the latest release really outclasses everything before it. I really like Mountain Lion. It works well on my old hardware.


Due to a couple of applications that I don't feel like upgrading at the moment, I'm not on Mountain Lion. I plan to change software packages on a couple of things in the near future.

 

 

Quote:
Originally Posted by wizard69 View Post


Wishful thinking. Another generation or two of process shrinks and the Mini itself will shrink again. It is almost a certainty.

SSD capacities may be an additional factor. For all the people who say to plug in additional storage (necessary for backups anyway), I would prefer one small box that includes storage to two small boxes. It should have enough internally for the bottom 80% or so of users who purchase Minis.

post #510 of 1290
Quote:
Originally Posted by hmm View Post


Due to a couple of applications that I don't feel like upgrading at the moment, I'm not on Mountain Lion. I plan to change software packages on a couple of things in the near future.
Sounds like you might want to wait for Mavericks. I will probably jump into Mavericks right after it comes out, even though a major release like this can be troublesome.
Quote:

SSD capacities may be an additional factor. For all the people who say to plug in additional storage (necessary for backups anyway), I would prefer one small box that includes storage to two small boxes. It should have enough internally for the bottom 80% or so of users who purchase Minis.

I could see Apple saying "screw it" with respect to magnetic drives in the Mini, which would allow for a thinner machine. The industry is right on the edge of a permanent transition for base machines. Well, at least Apple is; the Airs already pack a fairly healthy SSD for the price of the machine. Ideally they would implement two slots for SSDs.

I can't really say that I'd be happy with that, but it is why I was so hot on the XMac concept. Easy-access drive bays were one reason for the XMac. Which brings up a thought: pull one of the video cards from the new Mac Pro and you have room for a magnetic drive, possibly two. In many ways the new Mac Pro comes really close to providing the capabilities I was hoping for in an XMac.

To take a side step here, over on the MS Channel 9 site there is a really good video by Eric Brumer covering performance computing and the importance of memory bandwidth. The video is very approachable even for people not obsessed with performance. Unfortunately I don't have the title off the top of my head. The gist is that extraordinarily simple code adjustments can have a huge impact on performance. This can be the case with either CPUs or GPUs.

Edit:

I found the title of that video: Native Code Performance and Memory: The Elephant in the CPU. The URL is: http://channel9.msdn.com/Events/Build/2013/4-329. Highly recommended.
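To illustrate the kind of trivially simple adjustment he's talking about (my own sketch, not an example taken from the video): C stores 2-D arrays row-major, so swapping two loop headers turns a cache-hostile strided walk into a sequential scan over the same data.

```c
/* Same arithmetic, different memory order: the column-major loop
 * strides N floats per access and defeats the cache, while the
 * row-major loop reads memory sequentially. */
#include <stdio.h>
#include <time.h>

#define N 4096
static float a[N][N];   /* 64MB, zero-initialized */

int main(void) {
    double sum;
    clock_t t;

    t = clock(); sum = 0.0;               /* cache-hostile order */
    for (int j = 0; j < N; j++)
        for (int i = 0; i < N; i++)
            sum += a[i][j];
    printf("column-major: %.2fs\n", (double)(clock() - t) / CLOCKS_PER_SEC);

    t = clock(); sum = 0.0;               /* cache-friendly order */
    for (int i = 0; i < N; i++)
        for (int j = 0; j < N; j++)
            sum += a[i][j];
    printf("row-major:    %.2fs\n", (double)(clock() - t) / CLOCKS_PER_SEC);

    return (int)sum; /* keep the compiler from discarding the loops */
}
```

On typical hardware the row-major pass runs several times faster even though both loops do identical work; that is the memory-bandwidth point in miniature.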
Edited by wizard69 - 7/1/13 at 4:18pm
post #511 of 1290
Quote:
Originally Posted by v5v View Post

My fear is that the mini will adopt the form factor of the new Airports. Ick.

What is wrong with the new Airports? If you look at the mechanical layout, they are almost a mini Mac Pro. The hardware is an interesting approach to solving a common problem with hardware installations.
post #512 of 1290
I guess my problem is that Apple has had the resources to do Mac OS right for a long time now. Apparently they haven't, and thus the comment about neglect.

I do hope that Mavericks is the turnaround we are all hoping for. I've been working my way through the WWDC developer videos and frankly it does look like a major overhaul. Even AppleScript is getting a major update, which for a long time appeared to me to be a dead technology. So yeah, I'm very hopeful that Mavericks shores up the OS and eliminates the nagging weak points.

Quote:
Originally Posted by Lemon Bon Bon. View Post

Neglect.  Wasn't it Jobs who once said that Apple lost sight of its crown jewels once upon a time?

I hope Mavericks reverses this apparent trend.  GPU support and OpenGL have seemed weak to me over the years compared to Windows GPUs and drivers, be they GL or DirectX.  Certainly a well-founded perception of 'lagging' GL development shared by many, many commentators and critics...

Lemon Bon Bon.
post #513 of 1290
Quote:
Originally Posted by wizard69 View Post


Sounds like you might want to wait for Mavericks. I will probably jump into Mavericks right after it comes out, even though a major release like this can be troublesome.
 

I plan to wait until it stabilizes. It's not like they're all major applications. I'm working on an alternative to Quickbooks, as the Mac version is atrocious. The others are just basic utility stuff. It could be new Mac Pro + Mavericks + migrate apps once everything looks somewhat stable. At this point my notebook is faster than the old Mac Pro. It's not great. Having used newer Mac Pros as well, I can say they offer a smoother experience. The notebook is a Sandy Bridge 15" 2.3, so on the faster side of notebooks, going by my personal experience with the hex-core Mac Pros. The extra memory might be part of it. I also don't have to listen to fans blasting on one of those, but I don't own one.

 

 

Quote:
I could see Apple saying "screw it" with respect to magnetic drives in the Mini, which would allow for a thinner machine. The industry is right on the edge of a permanent transition for base machines. Well, at least Apple is; the Airs already pack a fairly healthy SSD for the price of the machine. Ideally they would implement two slots for SSDs.

Ideal to me would be a return to non-proprietary formats. The Mini is in some cases used as a server, so they may not wish to drop below a certain available storage level there.


 

 

Quote:
I can't really say that I'd be happy with that, but it is why I was so hot on the XMac concept. Easy-access drive bays were one reason for the XMac. Which brings up a thought: pull one of the video cards from the new Mac Pro and you have room for a magnetic drive, possibly two. In many ways the new Mac Pro comes really close to providing the capabilities I was hoping for in an XMac.

I thought it was a cool concept. Some of the harsher critics on here may not have stumbled on the earlier threads where you described it in full detail. I prefer internal storage because a lot of the cheap external solutions are terrible. USB 3 and Thunderbolt alleviate the need for things like eSATA host cards, which is good; it's one less after-market hardware layer. You still have the firmware and drivers of a port-multiplier box in there, as opposed to something supported at the chipset level. I've used a pretty wide range of hardware in this area, so I'm speaking from experience. When people just link random things they found on Newegg, I'm not sure they realize that not all of them work as stated. Lion and Mountain Lion also saw the end of OSX development for some of the older chipsets; with eSATA especially, nothing with a Silicon Image chipset is officially supported. Some of them can be made to work, but they aren't supported. I warned at least one person about that when they were buying a DAS solution around the time Lion debuted.

 

Quote:

To take a side step here, over on the MS Channel 9 site there is a really good video by Eric Brumer covering performance computing and the importance of memory bandwidth. The video is very approachable even for people not obsessed with performance. Unfortunately I don't have the title off the top of my head. The gist is that extraordinarily simple code adjustments can have a huge impact on performance. This can be the case with either CPUs or GPUs.

Edit:

I found the title of that video: Native Code Performance and Memory: The Elephant in the CPU. The URL is: http://channel9.msdn.com/Events/Build/2013/4-329. Highly recommended.

I'm watching it at the moment. He's talking about cache hits.

 

 

 

Quote:
Originally Posted by wizard69 View Post


What is wrong with the new Airports? If you look at the mechanical layout, they are almost a mini Mac Pro. The hardware is an interesting approach to solving a common problem with hardware installations.


I never placed a real priority on aesthetics. Non-Apple displays are often ugly, yet I don't care. I only care how the displayed image looks, not the case. The exception would be if it was distracting. That would be annoying.

post #514 of 1290
Quote:
Originally Posted by marvfox View Post

The MM will always stay the same shape because people are accustomed to this form already.
 

 

 

And what about the old cheese-grater Mac Pro?  The "pros" out there tend to be far less adaptive/receptive to change than the ordinary Joe, as evidenced by the incessant bashing of the new MP design by said "pros".

 

post #515 of 1290
Quote:
Originally Posted by Bergermeister View Post

And what about the old cheese-grater Mac Pro?  The "pros" out there tend to be far less adaptive/receptive to change than the ordinary Joe, as evidenced by the incessant bashing of the new MP design by said "pros".

I think you're right if you mean the innards. The looks, however, don't seem to be an issue, judging by a few friends of mine. They simply bought a new MP when they needed it; it didn't matter that the model was outdated or overpriced or whatever. They just need to get the job done, whatever the box looks like.

Losing the internal HDDs and expandability might change their views; I don't know yet. I haven't heard of anyone planning to buy the new Mac Pro yet.
post #516 of 1290
Quote:
Originally Posted by Bergermeister View Post

And what about the old cheese-grater Mac Pro?  The "pros" out there tend to be far less adaptive/receptive to change than the ordinary Joe, as evidenced by the incessant bashing of the new MP design by said "pros".

 

The bashing from pros I've seen has nothing to do with how it looks, it's about changes to the hardware paradigm. It could look like a monkey scratching its armpit and pros would still buy it as long as it performs well.

 

The fickle consumer market is MUCH more concerned with outward appearance than the pros who shove it under a desk and never think about it again until it farts.

post #517 of 1290
Quote:
Originally Posted by Bergermeister View Post


And what about the old cheese-grater Mac Pro?  The "pros" out there tend to be far less adaptive/receptive to change than the ordinary Joe, as evidenced by the incessant bashing of the new MP design by said "pros".

Much of the whining from the so-called "pros" is due to ignorance of the electronics industry. If you follow my posts, I was pretty much convinced that the Mac Pro would end up smaller, simply because there is no advantage to a big box and frankly a few disadvantages. I didn't honestly expect what we got with this update, though, as I was expecting more of a cube of somewhat conventional design.

Another way to look at this: I've been around a long time, and every major change in the industry has been met with the same bashing. It is really an issue of people not liking change more than anything.
post #518 of 1290
Quote:
Originally Posted by v5v View Post

 

The bashing from pros I've seen has nothing to do with how it looks, it's about changes to the hardware paradigm. It could look like a monkey scratching its armpit and pros would still buy it as long as it performs well.

 

The fickle consumer market is MUCH more concerned with outward appearance than the pros who shove it under a desk and never think about it again until it farts.

 

 

I wasn't talking just about looks.

 

And regarding shoving it under a desk, yeah, most PCs look so bad that they need to be shoved under a desk.  Others were too big to fit on the desk (my old Mac Pro).  

 

This new machine is a thing of beauty that belongs on many desks.  Some will still want or need to place it under the desk (or on a shelf to the side) but . . .


Edited by Bergermeister - 7/2/13 at 10:04am

 

post #519 of 1290
Quote:
Originally Posted by PhilBoogie View Post

Losing the internal HDDs and expandability might change their views; I don't know yet.

One thing that people often forget is that these things are going away eventually. Hard drives are legacy technology. Some would suggest that even if SSDs get cheaper, HDDs will get cheaper still, but that's not necessarily true. The bulk of the volume is on the consumer side. If you look at Seagate, for example, they ship around 52m HDDs per quarter to consumers and 6m to enterprise, with an average selling price of $63; that's 58m out of 140m total drives sold. Western Digital sells around 56m to consumers and 6m to enterprise, totalling 62m units at an average selling price of $62. Between those two companies, that's pretty much the whole industry.

These numbers include drives shipped by the likes of Apple, and those sales in fact make up the bulk of the volume, about 40m each for Seagate and WD. The whole PC industry ships about 85m units per quarter. With over 70% of shipments going to laptops, that segment is going to change to SSD entirely, so Seagate and WD will be left with just enterprise and externals. This might explain why Seagate bought LaCie last year.

Computer manufacturers can try bundling 2-4TB drives with machines, but the vast majority of people have no need for that amount of space (evident from the fact that WD and Seagate sell only ~10m external drives per quarter), so gradually they will lose the consumer volume. The point where I think HDDs will collapse completely is $0.10/GB, because 2TB for $200 is affordable. Right now, the best price is around $0.60/GB.
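To put numbers on that crossover (my arithmetic, not the post's): at $0.10/GB a 2,000GB SSD works out to 2,000 × $0.10 = $200, while at today's roughly $0.60/GB the same capacity costs about $1,200, so flash pricing needs to fall roughly six-fold to hit that threshold.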

Physical drive size isn't as much of an issue with SSDs because there's no mechanical design and tolerances to work around, and they can already fit 16TB into a single PCIe drive. It's really just price. You can imagine that one day they'll have single 16TB 3.5" external SSDs. They had 4TB last year:

http://www.anandtech.com/show/5322/oczs-4tb-35-chiron-ssd

There's not much point in designing a machine with 4x 6Gbps SATA bays when a 20Gbps TB port will run an SSD faster, and many people will eventually need just one external drive rather than a RAID box.
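For rough context on that bandwidth claim (my figures, offered as ballpark assumptions): SATA's 6Gbps link uses 8b/10b encoding, so a single bay tops out around 600MB/s, and one drive can't aggregate the four ports, while a 20Gbps Thunderbolt 2 channel is about 2.5GB/s on paper, which is why a fast PCIe SSD behind a TB port can outrun any individual SATA bay.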

Flash prices have clearly dropped and this can be seen in the Air, which moved from 64GB to 128GB on the entry model. This should give Apple the freedom to cut $100 off the Retina MBPs rather than increase the storage. If they squeeze the margins a bit, they can get the 13" rMBP down to $1299 and drop the old model. It wouldn't be good to start the 15" at $1999 but I'm sure they'll figure out a way to hit $1799, even if that means using Iris graphics or just a dual-core CPU.

This move would eliminate hard drives from their laptop lineup, which is 75% of Apple's machines.

I reckon they'll start to transition the iMacs to SSD too, with a 256GB entry point. The Mini would probably retain an HDD in the base model, but it could move to an SSD + HDD design. The new Mac Pro has the same physical SSD size as the Air, but 10nm NAND will hopefully allow them to double their previous max of 768GB to 1.5TB. With just another fab generation, they'll manage 3TB internally. They have plenty of room to make a wider SSD, though, if they wanted more.

As for the timeframe, I'd say it'll go below $0.20/GB in 4 years. That would allow 3TB internally for $600. HDDs would still have a cost advantage but, as I say, I think by that point most if not all manufacturers will be using SSDs in shipped machines, which will cut Seagate's and WD's HDD volumes dramatically. Seagate even makes an SSD now, so they know where things are heading.
post #520 of 1290
Quote:
Flash prices have clearly dropped and this can be seen in the Air, which moved from 64GB to 128GB on the entry model. This should give Apple the freedom to cut $100 off the Retina MBPs rather than increase the storage. If they squeeze the margins a bit, they can get the 13" rMBP down to $1299 and drop the old model. It wouldn't be good to start the 15" at $1999 but I'm sure they'll figure out a way to hit $1799, even if that means using Iris graphics or just a dual-core CPU.

It can also be an issue of yields. They have to be able to produce the number they would sell at $1799. I don't think they will go regressive on core counts. The flow in terms of bundled specs has been pretty consistent, and the high-end dual cores aren't that cheap either. Integrated graphics actually sounds more likely to me, as the notebook lines have always reduced GPU configuration first as a budgeting measure.


Edited by hmm - 7/2/13 at 1:00pm