Originally Posted by Marvin
It helps avoid banding on higher resolution displays, which wouldn't be as noticeable on lower resolution displays:
"After seeing native 4k presentation on my 96" 21:9 screen viewed from 10' away, I can truly say that I welcome 4K wholeheartedly, but the colour banding and motion artifacts remain extremely visible"
The colour range advantages are important for HDR photography and other high colour range media as you were saying. That'll likely be the next marketing term: HDR.
I haven't viewed 4K displays, so I can't really comment there. HDR is interesting, although it's often used in really bad ways. Given that I have a reasonable background in that area, I've been shopping for a decent pano kit. The old one won't support a Canon 1-series body, although I could switch to something lighter. Spherical HDR panos are very useful for producing realistic reflections in renders, especially if you want to place something at a specific location. It's just annoying finding something with locks and clamps sturdy enough to support a few pounds of camera.
Anyway, back to color gamut. The thing is that a display has a certain matrix of values that can be addressed. It's not one solid shape, as it's portrayed if you bring up a gamut preview in ColorSync; it's better to think of the device behavior as a point cloud. Given that a larger volumetric gamut means those points are spread thinner, it's natural to migrate toward more points to at least partially alleviate the need for dithering. Buying a 10-bit display currently isn't an all-encompassing solution to the dithering problem; people have noticed it on DreamColor displays and other very expensive ones in the past few years. It's just that higher bit depth does need to be adopted if the trend is toward expanding gamuts. Right now they basically cap out at Adobe RGB. This has been possible for at least a decade, but such displays are much more common now. Apple has kind of ignored this and stuck with sRGB. It's their choice, and it may have been partially motivated by their focus on Thunderbolt, as it doesn't fully support DisplayPort 1.2.
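The arithmetic behind the point-cloud picture is simple enough to sketch. Here's a rough illustration (the ~1.5x gamut-area figure for Adobe RGB vs sRGB is a ballpark assumption for illustration, not a measured value):

```python
# Rough sketch: spreading the same number of code values over a
# wider gamut makes each quantization step larger, which is what
# shows up as banding.

def levels_per_channel(bits):
    """Addressable levels per channel at a given bit depth."""
    return 2 ** bits

def total_points(bits):
    """Total addressable RGB triplets -- the size of the 'point cloud'."""
    return levels_per_channel(bits) ** 3

print(f"8-bit:  {total_points(8):,} points")    # 16,777,216
print(f"10-bit: {total_points(10):,} points")   # 1,073,741,824

# If a wider gamut stretches those same levels over roughly 1.5x
# the area (hypothetical Adobe RGB vs sRGB scale factor), the
# average step between adjacent colors grows by that factor at a
# fixed bit depth. Moving to 10 bits gives 4x the levels per
# channel, shrinking the step by 4x -- more than compensating.
gamut_scale = 1.5                                   # assumed, not measured
extra_levels = levels_per_channel(10) / levels_per_channel(8)  # 4.0
print(f"10-bit wide-gamut step vs 8-bit sRGB step: "
      f"{gamut_scale / extra_levels:.2f}x")         # 0.38x
```

So a wider gamut at 8 bits makes banding worse, while 10 bits in the same gamut makes it better than 8-bit sRGB ever was, which is why the two changes really need to travel together.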
In what way? Cinebench is double the old model's score, and the quad i7's GPU score is more than 9% higher than the base model's.
I just looked at the links again, and I see what you were comparing now. The Cinebench CPU test on the mid-range model came up considerably, as it was bumped to a quad i7. The others aren't showing quite that advantage, although the Mathematica gains are quite impressive.
2012 Mac mini: Individual application scores (chart legend; the chart itself didn't carry over)
- Mac mini/2.3GHz Core i7 (Late 2012)
- Mac mini/2.5GHz Core i5 (Late 2012)
- Mac mini/2.5GHz Core i5 (Mid 2011)
- Mac mini/2.3GHz Core i5 (Mid 2011)
- Mac mini/2.4GHz Core 2 Duo (Mid 2010)
If it had a 640M and a quad i7, it would have been an instant buy. I played Battlefield 3 on my current one with no problem, and I reckon it would just be too slow with the HD 4000.
BF3 on the HD4000 looks like this (choppy on low):
On the 6630M, it looks like this (steady 30FPS+):
I'll probably just wait out Haswell. I have a feeling it will arrive around June, so not a long wait. I don't want the headache of an OS upgrade right now either. I'm getting concerned that Apple doesn't seem to want to bring feature parity to QuickTime X and seems happy to break things along the way, so I'm overly cautious about adopting the new OSes.
That makes sense. They did reallocate costs somewhat. I thought the CPU pricing was closer together, but I just looked it up. I checked Intel's site and Wikipedia, specifically because they list launch pricing as opposed to current pricing, and it has always been accurate on this.
Recommended customer pricing on the former was $150 cheaper on the 2011. I have no idea what the cost of a 640M looks like, but these listings are the same as they were at launch. It would have been nice. This is a downside to using mobile innards: components with equivalent performance come with a pricing premium. If 77W were doable, there are a couple of decent upper-range i5s around $225, which lines up with the CPU cost they were using last year. Unfortunately that wattage is still too high. In my opinion they made it too restrictive in favor of size. Given that Apple maintains a very lean lineup, it would have been nice to see a slightly better range of use cases.