Are the GPU boards soldered onto the motherboard? It would be really interesting to yank one out and see how well it performs when put up against an ATI counterpart. The more I look at the $4,000 model, the more I want it. I've recently been offered $2,800 for my custom HP Z from a friend of my husband's, and I could easily sell my iMac for another $1,500, so I think I'm going to do it. I'll take the extra money and buy another 30" NEC monitor for a total of three, plus bump the memory to 64GB using third-party RAM of course; Apple charges way too much for theirs. When does the pre-order site come online?
What do you guys think? Better than having just one 4K monitor, and cheaper too. Not to mention the coolness factor will be off the charts. Finding a wallpaper is going to be a bitch though.
The 27" displays are decent too, and much cheaper in the US (sub-$1,000). IIRC the 30" variants are from the prior 90s series. As for wallpaper, make your own! Paint it, rent digital medium format, or get a pano kit. I tried the Really Right Stuff pano kit a couple of years ago. It's expensive, and if you use a heavier camera (old 1Ds Mk II with L lenses, so roughly 5 pounds total) get the gimbal version; the locks are too small on the older one. For lighter gear there are probably cheaper options, but none are really cheap. It would make for some awesome untiled wallpaper. I'm somewhat disappointed by the pricing, but there's a chance I'll grab a refurbished model once those start to show up, though it will be some time before that happens. The pano thing isn't that hard if you center the pivot correctly and shoot a couple of reference frames for distortion and lens falloff; it's a little bit of work. Even without that, the newer generations of DSLRs would still cover 7680 pixels across.
10-bit panels do exist, meaning 2^10 levels per channel rather than 2^8, so the total number of possible output colors is (2^10)^3 (about 1.07 billion) rather than (2^8)^3 (about 16.7 million). What you may not realize is that these things will not expand much in gamut without drastic changes. As of right now we view in gamma-corrected spaces, which are partly inherent to the electronics, yet overall provide a decent method of viewing quantized images over a low dynamic range. The biggest current benefits are better shadow detail and the elimination of banding that can otherwise crop up as displays age, as well as reduced reliance on hardware-applied dithering (not sure if that's the right term, but I'm referring to internal display calculations). Apple has repeatedly stated they have no plans to add driver support for third-party 10-bit displays, but I'm sure when they come out with one, they'll make sure to advertise it everywhere.
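Just to spell out the arithmetic: the levels per channel multiply rather than add, so going from 8 to 10 bits grows the total color count by 64x, not by a quarter. A quick sketch (the function name is my own, just for illustration):

```python
# Total displayable colors for an RGB panel: the per-channel level
# count raised to the number of channels, i.e. (2**bits)**3.
def total_colors(bits_per_channel: int, channels: int = 3) -> int:
    levels = 2 ** bits_per_channel
    return levels ** channels

print(total_colors(8))   # 16,777,216 for an 8-bit panel
print(total_colors(10))  # 1,073,741,824 for a 10-bit panel
```

Note this says nothing about gamut: the extra colors are finer steps inside the same color volume, which is exactly why the practical win is shadow detail and banding, not "more vivid" color.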
I know these higher-color displays exist, but I'm talking about filtering that technology down to the masses in average computers and mobile devices. Right now manufacturers are focusing mostly on pixel count, and at some point our eyes can only resolve so much at a given screen size and viewing distance. Research labs are playing around with 8K video to eventually replace 4K in movie theaters, but that won't hit the consumer market for a VERY long time. Heck, most people are STILL using 720p and only now moving toward 1080p and 4K for the home.
One thing I know is when Apple or someone else makes a statement that they will "never do this or that", chances are they might at a later date, when it's feasible, cost effective, etc. etc. and it makes sense to do it.
I read an article by a prominent expert in the field of digital converters and psychoacoustics who said that 16-bit audio was just fine back in the beginning of digital audio. Years later, he reversed himself and went public saying that people can hear vast differences with 24-bit, and that we are STILL learning how to build better equipment to capture audio at 16-bit, 24-bit, 32-bit and higher levels. DSD vs. PCM, etc. These arguments over better and higher-resolution audio and video will continue as different and better measurement techniques are found.
I have a friend that used to sell the expensive Minolta color measurement systems many years ago and he used to consult with computer companies. He actually did some work helping NeXT with their monitors and recently with Apple. For all I know, he could have helped Apple with their calibration measurements they are using now to make their monitors.
I haven't talked to him in years, so I don't know his thoughts on the subject; I'm just reading various articles by various people on the possibility of more colors becoming the next trend.
By 8K for the home, I'm referring to Kipnis, because I'm sure that guy would get 8K to replace his 4K setup. Why? He's the only person so far with a $6 million home theater. http://www.kipnis-studios.com/The_Kipnis_Studio_Standard/Kipnis_Home_Theaters.html
and there are other wealthy video geeks that will try to outdo him at some point in time. :-)
The GPU boards are plugged into the PCI slot on the main board, but they are affixed to the heat sink and are MOST LIKELY not user-replaceable, since thermal paste is probably involved.
Those GPU cards on Mac Pros are not user-replaceable; they're custom-designed by Apple, not a standard off-the-shelf card design. I don't think Apple is going to offer replacement GPU cards after the initial sale. I know this rubs some people the wrong way, as some people buy a computer one day and then replace the GPU cards it came with a year or two later.
But the MacPro GPU cards are plugged into the main "backplane" board that sits on the bottom of the unit just like the CPU cards do.
I think it's 6GB per GPU for a total of 12GB. Isn't that what the original sneak preview said? I wish they were able to pump out the 12-core model in December. I would love to see all of the speed-test demos from each model side by side.
Are you referring to Minolta's color analyzers and the accompanying software? Good factory calibration really helps, but displays drift so much that these companies also need a solution for maintenance on the customer end, as well as correlation between instruments. I really liked Eizo's CG211s. I wish they still made them. Used isn't very practical, as these things all eventually stop receiving software updates and colorimeter support.
I get that. For what it's worth, I've owned several generations of them (Sony Artisan --> NEC SpectraView 2190 --> CG243W), so I do have some direct experience, and I know which parts are rebranded hardware. There are many points that would need to be updated for optimal results, including manufacturing tolerance. It would be ideal if they could get away from technology that requires a backlight, given the variation in manufacturing. And a 10-bit panel alone doesn't guarantee a better display; some of them use just as much obvious dithering. The reason I wish Apple would support 10-bit DisplayPort really relates to shadow detail: gamma 2.2 allocates inherently few values to deep shadow regions, and the extra bits really do help there.
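To make the shadow-detail point concrete, here's a rough sketch (my own illustration, assuming an idealized pure gamma-2.2 transfer with no black offset) counting how many integer codes land below 1% of peak luminance:

```python
# With luminance = (code / max_code) ** gamma, count the codes whose
# output falls below a given fraction of peak luminance.
def codes_below(luminance_frac: float, bits: int, gamma: float = 2.2) -> int:
    max_code = 2 ** bits - 1
    # Invert the transfer: code/max_code = luminance_frac ** (1/gamma)
    threshold = luminance_frac ** (1.0 / gamma) * max_code
    return int(threshold) + 1  # codes 0 .. floor(threshold)

print(codes_below(0.01, 8))   # 32 codes cover the deepest shadows in 8-bit
print(codes_below(0.01, 10))  # 127 codes cover the same range in 10-bit
```

So the bottom 1% of the luminance range gets roughly four times as many steps at 10 bits, which is where the reduced shadow banding comes from.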
He was selling color measurement tools during the early '90s. He used to go to Stanford when a professor would ask him to give presentations on color measurement systems. I don't remember exactly what tools they had, but he sold the expensive models, plus other brands of expensive test and measurement equipment, to various companies: some were computer/monitor manufacturers, some were companies that had to ensure accurate and consistent color in the products they sold, etc.
I never asked him about all of the different products he sold, but one of the professors would go to my friend whenever the professor didn't know the answer.
Yeah, in ANY discipline, new ways of doing things, new ways to measure, etc. emerge from various companies, and what people THOUGHT was the correct way of doing things one year can change 10 years later.
If my memory serves me, the equipment he sold was around $10,000 or so, used for color measurement in monitor factories, but I'm sure different and better stuff has come out since. The measurement tools they sell to the masses for a couple of hundred dollars are probably pretty crude in comparison.
We would have to talk to my friend to get his opinion or someone that has a high level of understanding, so that's all I can say about that.
I'm not current on what's out there, nor have I played around with it. It's just that there is further development in monitor technology, and some researchers see displaying more colors as the next technology jump aside from ppi.
Oh, I remembered an article I read regarding high-end reference-level projection systems. There's a manufacturer called Meridian that bought a projector company called Faroudja, and they have this $200K+ projector that actually scales to 4096 x 2400. Film production companies use these in their theaters to evaluate films before they release them. I read that they send someone out to calibrate the system with expensive test equipment (brand/model I don't know), and I think the entire calibration process takes 3 days. I think it's included in the price of the projector (I would hope so, for that kind of money).
But yeah, backlit monitors will eventually change. That's why I'm a little excited about the IGZO technology and am waiting to see what it actually is and how good it looks. I've heard rumblings that the unreleased Apple displays might be using it. We'll see what happens when Apple releases the 4K displays I'm sure they have coming.
There is a technology called non-volatile memory, and a company called Intersil makes these E²POTs, which were used a LONG time ago to replace traditional potentiometers in monitors because they were more reliable. I think that's what Apple is still using, since they adopted them shortly after the parts came out. It apparently helps keep the monitors consistent and helps with calibration. Again, I haven't sat through Apple's calibration process or learned every chip inside, but I do know they started using those pots a LONG time ago, back when they were still using CRTs.
I really wish they would have just upgraded the old model.
Kept the card slots, drive bays, dual CPU option, upgradable graphics cards... everything in a nice enclosure.
Would have been easy to add SSD, TB2, and USB3.. would still have ports on the front... could stay on the floor...
Here's the problem with doing it the old way, as you suggest: they would have to charge more for that configuration, since it would be a LOT more expensive to make. It wouldn't perform as well if they gave you only one GPU standard, and they would probably have to add fans, and potentially water cooling, which is even MORE expensive for the 12-core model with the higher-end GPUs.
Sorry, I understand your frustration because you're used to a certain paradigm, but the fact is it would be more expensive for the same performance. I wish they had used a rack-mountable/tower configuration myself, since a lot of people would like to put these in racks in studios or for mobile location use, and a rack mount works great for that, but it would add a lot of cost for the end user.
This way, you pay for what you need. I know lots of people who never add PCI cards, so why should they pay for them, or for internal drive cages, when they use external drives and SAN storage? Bottom line: considering the pros and cons of the new case design, it's probably the best overall at this point in time.
It truly doesn't make much of a difference.
Chances are, within the first year someone's going to come up with an add-on that sits below the ProCan, and connects to card slots, drive bays and a Blu-Ray player.
Actually, the Blu-Ray burner in this form factor already exists.
It's only a matter of time.
No need to do that. I have used macs in this way for years.
Plug in all of the I/O cables to the Mac, applying power last. The keyboard should wake the Mac so you can log in. Reverse the order to disconnect: power first, followed by all other I/O, then take your Mac with you. Whether it goes to sleep is controlled by the Energy Saver panel, as you mentioned. You can have it go to sleep and wake it with the keyboard, etc., using this connect/disconnect pattern.
Actually, Apple engineers could easily create a new motherboard for the existing platform. There are many Xeon boards that currently support dual, triple, or quad GPUs; heck, there are even ones that support seven for GPU servers. Just because the older motherboard didn't support multiple linked GPUs doesn't mean Apple couldn't do it now; SLI has been around for more than a decade, and Apple could have easily added it to the last generation of Mac Pros. I'm actually dumbfounded that this new Mac Pro is their first attempt at it. The new Xeon processors also run cooler and have lower wattage requirements than the previous generation. I miss the days when companies like Sonnet made upgrade boards for Apple products. This would have been a perfect platform to upgrade: just swap the motherboard for a newer one, and it would support 2 Xeon processors, possibly more than 2 graphics cards, and more memory, and still have a few PCI slots for those who need them. It would also be less expensive; the smaller a computer gets, the more expensive it is to manufacture. The starting price for the last generation is lower than for the current one, so why do you assume it would cost more if they stuck with the current case?
Can I ask why you keep saying that these types of machines need water cooling? Outside of the modding community, very few companies use or need it. I'm running 4 graphics cards now: 2 Nvidia Quadro 4000s in an SLI configuration (meaning they're linked together as one) and two Tesla M1060s (these are for rendering). My machine is very quiet; you wouldn't even know it was on unless you put your head under the desk. My CPUs also draw more wattage and run much hotter than the new 12-core CPUs, plus I'm pushing 4 GPUs, and the system is always in the green heat-wise using only fans. You really only need water cooling for computers that have been overclocked, which again is the modding community.
Here is the motherboard for my HP Z800; as you can see, it supports 4 PCIe x16 GPUs.
Here is the case opened; as you can see, no liquid cooling, even with 4 GPUs and 2 Xeon processors.
Here is the reason why it's quiet:
This is what the Quadro 4000 looks like
....and the Tesla M1060s. The Tesla cards are passively cooled, so that's two fewer fans to make noise.
I bought the GPUs and 64GB of RAM separately, but they are all HP-stamped products that were designed for the Z800; nothing in the machine is mismatched. The GPU cards are also single-slot, which means more space between them, allowing for a cooler setup.
Before you start in on me, DRblank: I just posted my computer as an example of a Xeon machine that has a lot more going on than the current Mac Pro yet doesn't require fancy cooling, and to show that Xeon machines can easily handle multiple GPUs. Plus, as of yesterday I no longer have it; I sold it for $2,800 to a friend of my husband's. If I get out of the hospital, I'm going to buy the new Mac Pro with D500 graphics.
Is that your rig with x2 Tesla cards?
Lemon Bon Bon.
Yes, it was an eBay purchase. I first bought the Z800, then upgraded to two Xeon X5660s, then one Quadro 4000 card, then one Tesla, and so on and so forth. I slowly updated the machine over the course of a year. I hunted down and used nothing but HP parts to build it, except for the CPUs, which I got out of an IBM server. The last upgrade was putting in 3 HP-stamped 600GB 15K SAS HDs, though I think they're Fujitsu drives. Nonetheless, I paid a total of $2,800 for it when all was said and done. The machine can handle anything thrown at it; I have yet to see it stutter. The main OS is CentOS, but I have Windows Server 2012 R2 and Red Hat in a triple boot. Mars is incredible on it. It was a fun machine to build, but now that it's finished I will start all over again and build an even faster machine.
This reminds me of a conversation I had with someone the other day. They mentioned installing Ubuntu. I replied, "Ubuntu is the diet pepsi of Linux. It's one calorie Linux, just not Linux enough." I still like Fedora.
$2,800 total cost for parts, but 1,000 hours of labor at $75/hr equals another $75,000. Yeah, cost savings. I love how tech people COMPLETELY miss the point: they never calculate the hours they spend, what that time is worth per hour, and add it to the cost of the total system.
And to do what? What apps are being used? To play games?
You paid $2,800, sold it for $2,800, and wasted a LOT of time, energy, and opportunity to do something productive. To do what? To go off topic? That system STILL doesn't have Thunderbolt, and it can't legally run OS X or Final Cut Pro. WHAT A FREAKING waste of time.
Relic, you really need to go to another site and waste someone else's time.
SERIOUSLY. It's too bad I don't operate this site, as you would have been booted off YEARS ago.
Having the technical knowledge to do this isn't that big of a deal; what's a little more difficult is using common sense and business sense, of which you have NONE.