The advantage of USB 3 moving to 10Gbps is that between faster USB 3 and a Thunderbolt breakout box, 95% of users will be fine.
I really do think that at least one Pro option will lack any slots at all.
Apple has been waiting years to do something like this, and now they have the chance. They will take it.
I expect one of these ports to end up in a form that can be used in notebooks, phones, and tablets. That is really what is needed for consumer devices going forward. Regardless of what the port can handle, they still have to feed it PCIe lanes, so these things still rely on sufficient IO chip bandwidth.
I think dual processor is just too expensive. The entry point is around $4000 now.
I want it to get back to this sort of thing with a single CPU:
I'm not thrilled with the aesthetics there but that's the general idea. It could have a side access panel again, but that creates more superfluous design, and I like that this doesn't need additional parts to lift it up. The insides can be lifted out from the base so it's more seamless. The space at the bottom is for carrying it and for airflow. When it's flipped upside down, it would just have a handle that twists 180 degrees and you pull the insides out. This means you can't open it while it's on.
The current Mac Pro doesn't have hot-swappable drives but they could do this from the back, similar to LaCie drives, if they wanted. I'd rather see dual Fusion drives though, and hot swapping isn't so good with that.
When I see ads like this, I like that better than where the Pro is now:
It just feels more modern and now it doesn't have to sacrifice any functionality. The ports would be on the back and it would have a proper cooling system. Quad 20Gbps TB ports with special connectivity would be nice, but a single PCIe 3 expansion slot will offer more bandwidth and broader hardware support.
The bundled GPU would be a midrange card and you'd have the option to go to the highest-end card.
It would start with 6-core Ivy Bridge EP, 8GB RAM, 1GB 8770 for $2499 (£2049)
It would go up to 10-core, 16GB RAM, 2GB 8970 for $3999 (£3299)
If there is a 12-core, it would go up to that but 10-core IB should be a bit faster than the 12-core from 2010.
That's sex talk, Marv'. You know how I feel about the Cube...
I don't agree with your pricing. Far too pricey to get 'mini tower' units moving. I think we take the monitor off the iMac. Drop the price £800 from the top-end iMac (i.e. minus the studio display) and you're left with a consumer tower at £1200 with a 680MX GPU and an i7. My top-end iMac with 680MX and Fusion, minus display? £1200-1400 for a Cube. That covers an entry quad-core, with a 6-core as the next single-CPU model up. 10-core? 2k and over.
Time to add a value proposition across the consumer, prosumer and pro workstation Cubes to get unit sales going.
Part of me still feels the iMac has eaten the concept's lunch to a degree. The other part feels it's still valid IF they can make it more of a value proposition.
2k and up? No. Too pricey and out of time. Too rigid. 2k for a cube? That's worse value than the original Cube. Original Cube pricing or lower with i7 consumer processors. One quad. One six core. Rational £1200-1400 pricing. More than fair when you think you don't get the studio monitor. Add it and you're looking at 2k. Pricier than an iMac.
10 core and higher? 2k and higher.
We'll see what Apple do with the Mac Pro. They've made the iMac pricier over time. But the 'Pro' pricing needs a brain check.

Lemon Bon Bon.
2k and up? No. Too pricey and out of time. Too rigid. 2k for a cube? That's worse value than the original Cube. Original Cube pricing or lower with i7 consumer processors. One quad. One six core. Rational £1200-1400 pricing. More than fair when you think you don't get the studio monitor. Add it and you're looking at 2k. Pricier than an iMac.
10 core and higher? 2k and higher.
From the buyer's point of view you're right, that lower price point would be much better with the i7 and 680MX. But Apple knows what would happen. People just end up getting a 27" Dell:
http://www.amazon.com/Dell-927M9-IPS-LED-27-Inch-LED-lit-Monitor/dp/B009H0XQPA
so they just lose the margin on the display. Selling both together means you have to take Apple's display in that price range. I'd like to see them be a bit cheaper with the Cinema Displays now. You can see from the following list, a lot of companies use very similar panels even between NEC and Eizo:
Their $999 display would be better off at $699 or $799 and they can make up some lost margin in increased volume. A more comparable Dell would be the following as the above is 1080p:
but we know Dell's margins on things are far too low so maybe the same goes for the displays.
If we assume they have 40% gross margins on the iMac, that means the $2200 model costs $1320. The $999 Cinema Display would be $599 but we know that includes the chassis and parts so the panel itself would only be a part of that. I'd say $400 is a fair estimate for the panel cost.
That would mean costs for the mid-range headless i7 machine would be $920 and to make the same margin, sold at $1533 or rounded to $1599. That's what they used to sell the G4 towers at.
The big problem there for Apple is that it undercuts the entry 27". While they'd be making the same margins, it probably won't generate many more sales than the equivalent iMac, so Apple loses the extra amount from the sale of the panel.
The best they could do is sell it at $1799 like the original Cube because that way there is the choice between a more powerful machine for $1799 or a slower iMac with a 27" display for $1799 and they make an even higher margin on the headless model.
I'd imagine the Mac Pro parts won't be too far off that $920 price but maybe add another $300 for the motherboard and PSU. That would make the gross margin 50%. If it was 40% like the other models, it would be closer to $2099. The cheapest they went was $2199 (~45%) so really they might have only increased their gross margin 5% putting the entry level at $2499.
I think it's a good idea for them to use the Xeon as it gives them the option to go to 12-core chips with the same motherboard. If they go with a $2499 6-core, they could maybe have an entry quad-core at the old $2199 price. It does seem like poor value next to the $2199 iMac but you'd be getting the option for more RAM, possibly a half-length PCIe 3 expansion slot or two, a replaceable GPU and easy access to storage.
If we assume they have 40% gross margins on the iMac, that means the $2200 model costs $1320. The $999 Cinema Display would be $599 but we know that includes the chassis and parts so the panel itself would only be a part of that. I'd say $400 is a fair estimate for the panel cost.
That would mean costs for the mid-range headless i7 machine would be $920 and to make the same margin, sold at $1533 or rounded to $1599. That's what they used to sell the G4 towers at.
The problem here goes beyond what you mentioned. You have a greater level of conflict with the iMacs on price point, and you've isolated the dual-socket version on components. The margins on that are likely huge for little extra R&D; it's the same on the Windows side, where the return on the dual models is much higher. Apple is always trying to reuse components, so the current strategy seems better aligned with that. You might remember they used to ship dual-socket versions at $2300-2500. Unless their costs skyrocketed or margins there were pathetic, margins must be extremely high now.

If they wanted to placate the xMac crowd, they could have solved that problem when they redesigned the iMac by making it a little less hostile to customization and using the extra space to contain storage rather than thinning out the edges. That would make more sense to me than trying to morph the Mac Pro into a consumer box that bisects the iMac pricing tiers.

On displays, there's a lot Apple could do. Since you mentioned NEC and Eizo, they engineer around certain panel deficiencies whenever possible. They also seem to bin things based on performance. That is likely not feasible for Apple as they don't have the tiered markets or corporate display sales to sink cheaper units with panels that didn't make the specs necessary for their most expensive lines. Intelligent compensation for drift based on panel hours and methods of blocking to improve edge-to-edge uniformity (see ColorComp and DUE) are things that would be practical. Apple likes things that are done behind the scenes, and these two vendors are largely that way: their software involves setting targets and pressing a couple of buttons. Uniformity compensation on Eizo is handled without user input, and I think NEC went this way as of 2010 as well (not 100% on that).

Anyway, my point is that there's a weird desire to shoehorn one thing into another due to the lack of an updated model.
I don't see them homogenizing to that degree, especially when it represents one of their slower growth points.
Quote:
I think it's a good idea for them to use the Xeon as it gives them the option to go to 12-core chips with the same motherboard. If they go with a $2499 6-core, they could maybe have an entry quad-core at the old $2199 price. It does seem like poor value next to the $2199 iMac but you'd be getting the option for more RAM, possibly a half-length PCIe 3 expansion slot or two, a replaceable GPU and easy access to storage.
This seems really silly to me. If they stuck to the same margins, you'd still see a quad core at $2500. If you split off an X79/i7 route, you'd lose the ability to reuse parts with the current daughterboard setup. Half-length makes even less sense unless a lot of what is currently used fits into that form factor. It's a smaller market, so manufacturers aren't that likely to heavily refactor things, and it's also not what they've used for GPUs.
I'm just going to add that if the imac was better aligned for me in a few areas, I would probably go that route (as in the top one).
Since you mentioned NEC and Eizo, they engineer around certain panel deficiencies whenever possible. They also seem to bin things based on performance. That is likely not feasible for Apple as they don't have the tiered markets or corporate display sales to sink cheaper units with panels that didn't make the specs necessary for their most expensive lines.
Once again facts are better than imagined problems. Apple does reject panels that don't meet an A grade. Cheaper brand manufacturers use their rejected A- panels:
"there was already a cult following for these Korean monitors on various forums throughout the internet. Quickly we found out that these monitors were using the "A-" stock of LG LM270WQ1 27" panels. While the current Dell Ultrasharp 27" U2711 displays use a different revision of the LG panel (LM270WQ2) this is in fact the same panel Apple uses in their 27" iMac and Cinema Display/Thunderbolt Display."
Intelligent compensation for drift based on panel hours and methods of blocking to improve edge-to-edge uniformity (see ColorComp and DUE) are things that would be practical.
They would be practical but you can always buy another brand for that level of control. Apple uses high grade, calibrated panels so when you buy them, you're getting a high quality display. I'd prefer that they provided 3 year warranties on their displays but manufacturers don't make money giving customers hardware that lasts a long time. If Apple's displays go bad after a period shorter than 3-4 years, it's a real problem, otherwise it's another imagined problem.
This seems really silly to me. If they stuck to the same margins, you'd still see a quad core at $2500.
Ultimately their aim is to maintain net margins. They can reduce material costs and shipping costs, hit a higher volume of customers, and so on; they can adjust gross margins slightly while still maintaining net margins.
I'm just going to add that if the imac was better aligned for me in a few areas, I would probably go that route
You mean if it had dual Xeons, a top of the line Quadro with 4GB of video memory, 4 full length PCIe 3 slots and manual display calibration? They might have just a little trouble fitting that all into an attractive AIO design. Even if they did, I'm sure you'd find something wrong with it like 'thermal issues' causing severe unusable throttling or a fractional brightness shift they haven't accounted for.
Apple has never really offered that many PCI slots. You'd either end up with multiple machines, a different brand, or a different solution somewhere in there. I'm not questioning your requirements, just whether you'd be able to fulfill them that way with a Mac Pro.
My Mac IIx had 5 slots. I usually used a Magma expander.
It doesn't matter because all you need is a breakout box for full-length cards. Most expansion cards seem to be half-length, full-height.
You mean if it had dual Xeons, a top of the line Quadro with 4GB of video memory, 4 full length PCIe 3 slots and manual display calibration? They might have just a little trouble fitting that all into an attractive AIO design. Even if they did, I'm sure you'd find something wrong with it like 'thermal issues' causing severe unusable throttling or a fractional brightness shift they haven't accounted for.
I didn't mean anything of the sort, and the hostility is truly unnecessary. I enjoy these discussions when they remain discussions and don't disregard previously established points in their entirety. I suggested that the iMac could have picked up some of the "xMac" features, as opposed to a desire to morph the Mac Pro into one.

What I meant was possibly an extra storage bay. 2.5" storage seems to become better each year if 3.5" is too large. Some kind of hard drive access. We can't assume everyone will buy AppleCare, and swapping out a flaky hard drive shouldn't be such a big deal. You probably recall iFixit's removal of the adhesive. I think they were after a certain look and wanted to avoid seams, but that in itself is a choice, and in my opinion it's one of form over function. I didn't suggest morphing the iMac into a Mac Pro.

As for VRAM, it's possible to get that in the typical gaming variant too. Interestingly, Nvidia cut back on the optimizations for computation in their gaming GPUs this round, but the RAM would still help. If you're locked to one OEM, you do give up a certain amount of flexibility.

My point here was that if some of the smaller details regarding storage and service were a big deal to them, they could have designed these things into the iMac. Since you've referenced a few editing facilities, my guess is they would probably hook up a Thunderbolt external boot drive in the case of a dead drive if it's time critical. That is what I would do. Sound reasonable?
Edit: I didn't mean they should use 4GB cards in the iMac, although it should be mostly 2GB in the 27", and 512MB is kind of ridiculous in the current year. My primary objection there is using video memory allocation to fine-tune costs.
Quote:
Originally Posted by Marvin
Once again facts are better than imagined problems. Apple does reject panels that don't meet an A grade. Cheaper brand manufacturers use their rejected A- panels:
"there was already a cult following for these Korean monitors on various forums throughout the internet. Quickly we found out that these monitors were using the "A-" stock of LG LM270WQ1 27" panels. While the current Dell Ultrasharp 27" U2711 displays use a different revision of the LG panel (LM270WQ2) this is in fact the same panel Apple uses in their 27" iMac and Cinema Display/Thunderbolt Display."
That isn't what I meant at all. Dell and the major brands don't use A- panels. The Dell version is CCFL backlit, which was typically preferred, especially as colorimeters hadn't been updated a couple of years ago and LED hadn't been fully tested. Because of that, other brands were a little more conservative. I don't think it was cost related, as many of the cheaper ones adopted LED early on. The cult following exists because, for general use, they aren't always that bad; LG has come a long way. The reason you don't really see other panel brands now is shrinking margins. Hitachi doesn't even make panels these days, and they pioneered IPS.
Quote:
They would be practical but you can always buy another brand for that level of control. Apple uses high grade, calibrated panels so when you buy them, you're getting a high quality display. I'd prefer that they provided 3 year warranties on their displays but manufacturers don't make money giving customers hardware that lasts a long time. If Apple's displays go bad after a period shorter than 3-4 years, it's a real problem, otherwise it's another imagined problem.
All of the companies that we've discussed, including Dell, do basically the same thing: they calibrate them at the factory. This might not take place in $100-300 LCDs, but if you're buying in that price range, it is calibrated. That doesn't tell you what the pass/fail specs are, and it doesn't tell you how many points are validated; it's a very vague description where they didn't really tell you anything.

It's actually important that the hardware values are set really well there, in spite of long term drift. Part of the reason is that consumer devices don't have the same level of accuracy. It's kind of silly when people marvel at low Delta E values that are beyond the capability of the measuring device. It's like expecting a cheap ruler to be accurate to 1/100th of an inch; even the really expensive ones can't match factory-grade equipment. You know we've been over this before. I thought certain points were understood.

As far as longevity goes, from an anecdotal standpoint I haven't seen the kind of problems with recent Apple displays that I did with the older generations. If someone asks about one, I check things like Apple discussions as well as some of the reviews. When complaints come up, the severity isn't the same as with some of the older generations. I haven't seen the edges turn purple or colored flashing vertical lines.
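For reference on the Delta E point: the simplest formulation, CIE76, is just Euclidean distance in CIELAB space. A minimal sketch (CIE76 specifically; later formulas like CIEDE2000 add weighting terms, and the post doesn't say which variant is meant):

```cpp
#include <cmath>

// A colour expressed in CIELAB coordinates.
struct Lab { double L, a, b; };

// CIE76 colour difference: Euclidean distance in CIELAB.
// A dE around 1 is often cited as roughly a just-noticeable difference,
// which is why quoting dE far below a colorimeter's own accuracy is moot.
double deltaE76(const Lab& x, const Lab& y) {
    double dL = x.L - y.L;
    double da = x.a - y.a;
    double db = x.b - y.b;
    return std::sqrt(dL * dL + da * da + db * db);
}
```

So a claimed dE of 0.3 only means something if the measuring device itself is accurate to better than that, which is the ruler analogy above.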
Quote:
Originally Posted by scottglasel
My Mac IIx had 5 slots. I usually used a Magma expander.
I couldn't think of anything since the early 2000s with more than 3. The Magma expander makes sense; that's obviously a solution right there, assuming it meets bandwidth requirements. Aside from the beefier power supply, I'm a little puzzled that it's much more expensive than equivalent Sonnet gear. I've never seen a review comparison.
What I meant was possibly an extra storage bay. 2.5" storage seems to become better each year if 3.5" is too large. Some kind of hard drive access.
I'd have preferred the storage to be easily accessible or at least a smaller SSD-only option. I can understand why they did it because they'd want the storage to be secured and quiet as well as only bought from them. Regular external backups should be made though so that it's not such a big deal. It would be quite hard to transplant a Fusion drive anyway because the files are split between the SSD and HDD.
If you're buying in that price range, it is calibrated. That doesn't tell you what the pass/fail specs are, and it doesn't tell you how many points are validated; it's a very vague description where they didn't really tell you anything.
If they did that you'd be in the store for days looking through the test sheets to pick the best one though. Do any other manufacturers provide calibration data? You have to judge by the expected quality assurance of any brand.
It's actually important that the hardware values are set really well there, in spite of long term drift. Part of the reason is that consumer devices don't have the same level of accuracy.
What makes a $999 27" IPS LED-backlit Cinema Display a consumer display and a $1200 NEC PA271W a professional display? If they do the same job out of the box, you save money buying from Apple. Also, there's theoretical accuracy and practical accuracy. You can measure specs as much as you want; you're never going to match colors on a display to every possible surface an image is reproduced on. It'll always be an approximation.
It's good to go for the best available but people get into a mindset that the best available suddenly becomes the only option and anything else is unusable or just a toy for consumers.
As far as longevity goes, from an anecdotal standpoint I haven't seen the kind of problems with recent Apple displays that I did with the older generations. If someone asks about one, I check things like Apple discussions as well as some of the reviews.
People have issues with all sorts of displays though, you can rarely judge by a handful of reviews:
Designers, photographers and other graphics artists shouldn't be expected to be display technicians. The designers at Apple use their own displays so you'd expect they are calibrated properly. As you say, factory grade calibration is far better than user calibration.
Do any other manufacturers provide calibration data? You have to judge by the expected quality assurance of any brand.
What makes a $999 27" IPS LED-backlit Cinema Display a consumer display and a $1200 NEC PA271W a professional display? If they do the same job out of the box, you save money buying from Apple. Also, there's theoretical accuracy and practical accuracy. You can measure specs as much as you want; you're never going to match colors on a display to every possible surface an image is reproduced on. It'll always be an approximation.
NEC doesn't provide such a sheet. I haven't used them regularly in a while (my primary one isn't NEC at the moment), but I've worked with both at a couple of shops, so I have a bit of experience with each. CRTs were still better in a few ways even compared to the newest LCDs, but I hated the flicker, and this way most of the time it's one 16:10 display (similar height) for me as opposed to two 4:5 displays with one being a bit smaller. Some Eizos do actually come with a calibration sheet showing, per region, how far off it measured relative to their test target. I can probably find and scan the one from mine.

All of these things are approximate at some level. Most of the displays on the market were originally designed for prepress workflow requirements and leveraged their marketing into video, web design, and whatever else from there. My point is they're not designed as broadcast displays; those tend to be extremely expensive even today.

Anyway, if we're talking about photographers, it would usually be about being able to match a print sufficiently under fixed lighting conditions. Beyond that, things like uniformity, reproduction of difficult colors, and how well the display matches a typical gamma 2.2 at all brightness levels come to mind. Some displays start to look muddy at lower brightness levels, clip highlight detail, or crush shadows, and those are all bad things. Uniformity helps photographers and designers because subtle changes can really lie to them: obvious backlight bleed is obvious, but subtle bleed tends to look like part of the image and definitely throws you off when viewing two versions side by side. Personally I think use matters more than specs, as no test is really perfect.
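On the gamma 2.2 point, here is a minimal sketch of why shadow tracking matters, assuming an idealized pure power-law display response (real displays deviate from this, which is exactly what calibration checks for):

```cpp
#include <cmath>

// Relative luminance a display produces for a normalized input V in [0,1]
// when its transfer function is a pure power law ("gamma").
// A display whose actual response runs darker than its nominal gamma
// renders the same shadow values dimmer, i.e. "crushed shadows".
double displayLuminance(double v, double gamma) {
    return std::pow(v, gamma);
}
```

At an input of 0.1, a gamma of 2.2 yields roughly 0.63% of peak luminance while a gamma of 2.6 yields roughly 0.25%, so a display tracking darker than 2.2 shows the same shadow detail at well under half the brightness.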
Quote:
It's good to go for the best available but people get into a mindset that the best available suddenly becomes the only option and anything else is unusable or just a toy for consumers.
I've never suggested that. In some cases (I'd emphasize some) I'd suggest one over a quad Mac Pro + Cinema Display. It depends on the requirements. I want to test an external Thunderbolt boot drive, as I feel one would be essential there as a backup boot. Obviously that adds slightly to the cost, but anyone rational totals up a complete estimate for each solution before deciding. I still abhor the fixed height due to the lack of VESA mounting on the current one. A 27" might be just tall enough that I wouldn't want to raise it, but their design changes didn't bring any new functionality.
Quote:
People have issues with all sorts of displays though, you can rarely judge by a handful of reviews:
I had a strange string of issues with the first NEC I owned, which is how I know they are fast with replacements. Their engineers wanted to look at the problem unit. I'd also mention the difference in pricing: whenever I've suggested them more recently, it has always been cheaper than $1200, and a couple of shops have them for $950. If I suggested one, I'd link one of those, though prices like that might mean new ones are on the way this year.
Quote:
Designers, photographers and other graphics artists shouldn't be expected to be display technicians. The designers at Apple use their own displays so you'd expect they are calibrated properly. As you say, factory grade calibration is far better than user calibration.
That's the thing: the software doesn't treat you that way. We're not talking about X-Rite's i1 Proof or something of that sort. These things are basically automated and come with presets. They allow you to track display drift to a degree, keep the display as stable as possible, and make adjustments to hit a particular set of target values. NEC and Eizo both have hour counters built in, and both talk about some hardware-level drift compensation, yet everything drifts at some point. It's important that Apple and the rest set the values properly at the factory, but these are still unstable devices by their nature. They do not retain the same color, though the change is gradual; if you're looking at a display regularly you become somewhat accustomed to it unless it's way off, and as they age the changes tend to accelerate.

I don't just go through technical tests. The things should be easily viewable, and that doesn't mean as bright and high in contrast as possible. With fixed 8-bit data paths that would be annoying, as the smallest change you could make on screen would still be quite large, relatively speaking.

Even without adequate controls, people try to tweak these things. ColorSync, for example, has a built-in calibration utility that tweaks the output values per channel in a basic matrix profile. Whenever anyone mentions it, I tell them to pretend it doesn't exist. It's the worst designed thing ever, as its default is to overwrite the default profile, and most people don't know you can just delete that to restore the old one. The other problem is that profiles break really easily, and it provides extremely coarse controls with only visual judgement.
Anyway, I'd personally say it takes both for maximum usability. On the end-user side, the software should be accompanied by well-written documentation and hide anything that shouldn't be touched by the end user. For example, I prefer how Color Navigator handles uniformity completely behind the scenes, as opposed to the ColorComp approach of on/off and levels in an advanced user mode. Supposedly they made a lot of improvements to it with the PA series, but I haven't seen a really old one. My 90 series display has an insane number of hours on it. It was really good when it was newer; now it's just too old, which is why I replaced it. I still calibrate the thing and use it to hold references and things when I'm building textures.
Seeing as you're a wealth of references, what is your favorite book on C++, ideally one where the author doesn't have an infatuation with nested code?
Edit again: By the way, I personally buy from local resellers who agree to deal with the manufacturers if the need for warranty service arises.
I want to test an external Thunderbolt boot drive, as I feel one would be essential there as a backup boot. Obviously that adds slightly to the cost, but anyone rational totals up a complete estimate for each solution before deciding.
A USB 3 drive would do the job. A 1TB would be about $80.
Seeing as you're a wealth of references, what is your favorite book on C++, ideally one where the author doesn't have an infatuation with nested code?
I don't think books are the best way to learn that. You'd be best getting a small program that works in Xcode and does something useful, modifying parts of it, and then just using a book as a backup reference when you run into trouble. As for which book, I don't know which one might help best. It often works out quicker to Google any specific compilation errors.
At this point it is hard to recommend C++ books, as the new standard changes things enough that you have to search the net for contemporary answers. If somebody is looking for a C++ intro, Accelerated C++ is a worthwhile investment, keeping the new emerging standard in mind.
Beyond that, I don't see the net as the best place to develop C++ skills. Books like the Effective C++ series are a great help in developing a budding programmer's skills. The problem is that not all of these accepted works have been updated for C++11, so it just isn't a good time to be recommending C++ books. Stack Overflow used to have a thread on recommended C++ texts: http://stackoverflow.com/questions/388242/the-definitive-c-book-guide-and-list
There is nothing wrong with turning to the net for information on C++, but you often get mixed messages or poor advice. The net's usefulness is tied directly to one's ability to filter the crap from the gold. When you turn to widely accepted references, at least you are talking a common language, or maybe better said, a common style.
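Following the advice above about modifying a small working program rather than reading cover to cover, here is a minimal sketch of a few C++11 idioms (auto, range-based for, lambdas with standard algorithms) that pre-2011 books won't cover:

```cpp
#include <algorithm>
#include <vector>

// Sum of squares using C++11 auto type deduction and range-based for.
int sumOfSquares(const std::vector<int>& xs) {
    int total = 0;
    for (auto x : xs)   // range-based for loop (new in C++11)
        total += x * x;
    return total;
}

// A lambda passed to a standard algorithm (also new in C++11).
bool anyNegative(const std::vector<int>& xs) {
    return std::any_of(xs.begin(), xs.end(),
                       [](int x) { return x < 0; });
}
```

Snippets like these are the kind of thing worth pulling into Xcode and tweaking, then checking a reference only when the compiler complains.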
Quote:
Originally Posted by Marvin
A USB 3 drive would do the job. A 1TB would be about $80.
I don't think books are the best way to learn that. You'd be best getting a small program that works in Xcode and does something useful, modifying parts of it, and then just using a book as a backup reference when you run into trouble. As for which book, I don't know which one might help best. It often works out quicker to Google any specific compilation errors.
There is nothing wrong with turning to the net for information on C++, but you often get mixed messages or poor advice. The net's usefulness is tied directly to one's ability to filter the crap from the gold. When you turn to widely accepted references, at least you are talking a common language, or maybe better said, a common style.
Thanks. That is a good idea. I had randomly come across one of Marvin's old threads on something C related (not C++, specifically C), which was why I tagged that in there. At the moment I'm sort of curious more than anything if a real product strategy is there. Numbers must be dismal at the moment.
Comments
Quote:
Originally Posted by Frank777
The advantage to USB3 moving to 10gbps is that between having faster USB3 and a Thunderbolt breakout box, 95% of users will be fine.
I really do think that at least one Pro option will lack any slots at all.
Apple has been waiting years to do something like this, and now they have the chance. They will take it.
I expect one of these ports to end up in a form that can be used in notebooks, phones, and tablets. That is really what is needed when it comes to something appropriate for consumer devices going forward. Regardless of what the port can handle, they still have to feed it PCI lanes, so these things still rely on sufficient IO chip bandwidth.
Quote:
Originally Posted by Marvin
I think dual processor is just too expensive. The entry point is around $4000 now.
I want it to get back to this sort of thing with a single CPU:
I'm not thrilled with the aesthetics there but that's the general idea. It could have a side access panel again but it creates more superflous design and I like that this doesn't need additional parts to lift it up. The insides can be lifted out from the base so it's more seamless. The space at the bottom is for carrying it and for air flow. When it's flipped upside down, it would just have a 180 twist handle and you pull the insides out. This means you can't open it while it's on.
The current Mac Pro doesn't have hot swappable drives but they could do this from the back similar to Lacie drives if they wanted. I'd rather see dual Fusion drives though and hot swapping isn't so good with that.
When I see ads like this, I like that better than where the Pro is now:
It just feels more modern and now it doesn't have to sacrifice any functionality. The ports would be on the back and it would have a proper cooling system. I'd rather see quad 20Gbps TB ports with special connectivity but a single PCIe 3 expansion slot will offer more bandwidth and more hardware support.
The bundled GPU would be a midrange card and you'd have the option to go to the highest-end card.
It would start with 6-core Ivy Bridge EP, 8GB RAM, 1GB 8770 for $2499 (£2049)
It would go up to 10-core, 16GB RAM, 2GB 8970 for $3999 (£3299)
If there is a 12-core, it would go up to that, but a 10-core Ivy Bridge should be a bit faster than the 12-core from 2010.
That's sex talk, Marv'. You know how I feel about the Cube...
I don't agree with your pricing. Far too pricey to get 'mini tower' units moving. I think we take the monitor off the iMac. Drop the price £800 from the top-end iMac (i.e. minus the Studio Display) and you're left with a consumer tower at £1200 with a 680MX GPU and an i7. My top-end iMac with 680MX and Fusion, minus the display? £1200-1400 for a Cube. An entry quad-core, with a six-core as the next single-CPU model up. 10-core? £2k and over.
Time to add a value proposition to the consumer, prosumer and pro workstation cube to get unit sales going.
Part of me still feels the iMac has eaten the concept's lunch to a degree. The other part feels it's still valid IF they can make it more of a value proposition.
£2k and up? No. Too pricey and out of time. Too rigid. £2k for a Cube? That's worse value than the original Cube. Original Cube pricing or lower, with i7 consumer processors. One quad-core. One six-core. Rational £1200-1400 pricing. More than fair when you consider you don't get the Studio monitor. Add it and you're looking at £2k. Pricier than an iMac.
10 core and higher? 2k and higher.
We'll see what Apple do with the Mac Pro. They've made the iMac pricier over time.
But the 'Pro' pricing needs a brain check.
Lemon Bon Bon.
From the buyer's point of view you're right, that lower price point would be much better with the i7 and 680MX. But Apple knows what would happen. People just end up getting a 27" Dell:
http://www.amazon.com/Dell-927M9-IPS-LED-27-Inch-LED-lit-Monitor/dp/B009H0XQPA
so they just lose the margin on the display. Selling both together means you have to take Apple's display in that price range. I'd like to see them be a bit cheaper with the Cinema Displays now. You can see from the following list, a lot of companies use very similar panels even between NEC and Eizo:
http://www.tftcentral.co.uk/search.php?query=ips&select=panel
Their $999 display would be better off at $699 or $799, and they could make up some lost margin in increased volume. A more comparable Dell would be the following, as the one above is 1080p:
http://www.amazon.com/Dell-U2713HM-IPS-LED-CVN85-27-Inch-LED-lit/dp/B009H0XQQY
but we know Dell's margins on things are far too low so maybe the same goes for the displays.
If we assume they have 40% gross margins on the iMac, that means the $2200 model costs $1320. The $999 Cinema Display would be $599 but we know that includes the chassis and parts so the panel itself would only be a part of that. I'd say $400 is a fair estimate for the panel cost.
That would mean costs for the mid-range headless i7 machine would be $920 and to make the same margin, sold at $1533 or rounded to $1599. That's what they used to sell the G4 towers at.
The big problem there for Apple is that it undercuts the entry 27". While they'd be making the same margins, it probably won't make many more sales than the equivalent iMac, so Apple is losing the extra amount from the sale of the panel.
The best they could do is sell it at $1799 like the original Cube because that way there is the choice between a more powerful machine for $1799 or a slower iMac with a 27" display for $1799 and they make an even higher margin on the headless model.
I'd imagine the Mac Pro parts won't be too far off that $920 price but maybe add another $300 for the motherboard and PSU. That would make the gross margin 50%. If it was 40% like the other models, it would be closer to $2099. The cheapest they went was $2199 (~45%) so really they might have only increased their gross margin 5% putting the entry level at $2499.
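The back-of-envelope margin arithmetic above can be reproduced in a few lines. All the figures are the same guesses used in the discussion, not Apple's actual cost data:

```python
# Gross-margin estimates from the discussion above.
# All inputs are guesses, not Apple's real costs.

def cost_from_price(price, margin):
    """Estimated build cost given retail price and gross margin."""
    return price * (1 - margin)

def price_from_cost(cost, margin):
    """Retail price needed to hit a target gross margin."""
    return cost / (1 - margin)

imac_cost = cost_from_price(2200, 0.40)     # $1320 for the $2200 iMac
display_cost = cost_from_price(999, 0.40)   # ~$599 for the Cinema Display
panel_estimate = 400                        # guessed panel-only cost

headless_cost = imac_cost - panel_estimate               # $920
headless_price = price_from_cost(headless_cost, 0.40)    # ~$1533

print(f"Headless i7 cost:    ${headless_cost:.0f}")
print(f"Price at 40% margin: ${headless_price:.0f}")     # rounds up to $1599
```

The same `price_from_cost` helper gives the $2099-2499 Mac Pro figures when you add ~$300 for the motherboard and PSU and vary the target margin between 40% and 50%.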
I think it's a good idea for them to use the Xeon as it gives them the option to go to 12-core chips with the same motherboard. If they go with a $2499 6-core, they could maybe have an entry quad-core at the old $2199 price. It does seem like poor value next to the $2199 iMac but you'd be getting the option for more RAM, possibly a half-length PCIe 3 expansion slot or two, a replaceable GPU and easy access to storage.
Quote:
Originally Posted by Marvin
If we assume they have 40% gross margins on the iMac, that means the $2200 model costs $1320. The $999 Cinema Display would be $599 but we know that includes the chassis and parts so the panel itself would only be a part of that. I'd say $400 is a fair estimate for the panel cost.
That would mean costs for the mid-range headless i7 machine would be $920 and to make the same margin, sold at $1533 or rounded to $1599. That's what they used to sell the G4 towers at.
The problem here is beyond what you mentioned. You have a greater level of conflict with the iMacs in price point, and you've isolated the dual-socket version on components. The margins on that are likely huge for little extra R&D. On the Windows side it's the same: the return on the dual models is much higher. Apple is always trying to reuse components, so the current strategy seems better aligned with that. You might remember they used to ship dual-socket versions at $2300-2500. Unless their costs skyrocketed or margins there were pathetic, they must be extremely high now.
If they wanted to placate the xMac crowd, they could have solved that problem when they redesigned the iMac by making it a little less hostile to customization and using the extra space to contain storage rather than thinning out the edges. This would make more sense to me than trying to morph the Mac Pro into a consumer box that bisects the iMac pricing tiers.
On displays, there's a lot Apple could do. Since you mentioned NEC and Eizo: they engineer around certain panel deficiencies whenever possible, and they also seem to bin things based on performance. That is likely not feasible for Apple, as they don't have the tiered markets or corporate display sales to sink cheaper units with panels that didn't make the specs necessary for their most expensive lines. Intelligent compensation for drift based on panel hours and methods of blocking to improve edge-to-edge uniformity (see ColorComp and DUE) are things that would be practical. Apple likes things that are done behind the scenes, and these two vendors are largely that way: their software involves setting targets and pressing a couple of buttons. Uniformity compensation on Eizo is handled without user input, and I think NEC went this way as of 2010 as well (not 100% on that).
Anyway, my point is that there's a weird desire to shoehorn one thing into another due to the lack of an updated model.
I don't see them homogenizing to that degree, especially when it represents one of their slower growth points.
Quote:
I think it's a good idea for them to use the Xeon as it gives them the option to go to 12-core chips with the same motherboard. If they go with a $2499 6-core, they could maybe have an entry quad-core at the old $2199 price. It does seem like poor value next to the $2199 iMac but you'd be getting the option for more RAM, possibly a half-length PCIe 3 expansion slot or two, a replaceable GPU and easy access to storage.
This seems really silly to me. If they stuck to the same margins, you'd still see a quad-core at $2500. If you split it into the X79/i7 route, you'd lose the ability to reuse parts with the current daughterboard setup. Half-length makes even less sense unless a lot of what is currently used fits into that form factor. It's a smaller market, so manufacturers aren't that likely to heavily refactor things. It's also not what they've used in GPUs.
I'm just going to add that if the iMac was better aligned for me in a few areas, I would probably go that route (as in the top one).
Once again facts are better than imagined problems. Apple does reject panels that don't meet an A grade. Cheaper brand manufacturers use their rejected A- panels:
http://www.pcper.com/reviews/Displays/Achieva-Shimian-27-Monitor-Review-1440p-IPS-Display-Under-350
"there was already a cult following for these Korean monitors on various forums throughout the internet. Quickly we found out that these monitors were using the "A-" stock of LG LM270WQ1 27" panels. While the current Dell Ultrasharp 27" U2711 displays use a different revision of the LG panel (LM270WQ2) this is in fact the same panel Apple uses in their 27" iMac and Cinema Display/Thunderbolt Display."
They would be practical but you can always buy another brand for that level of control. Apple uses high grade, calibrated panels so when you buy them, you're getting a high quality display. I'd prefer that they provided 3 year warranties on their displays but manufacturers don't make money giving customers hardware that lasts a long time. If Apple's displays go bad after a period shorter than 3-4 years, it's a real problem, otherwise it's another imagined problem.
Ultimately their aim is to maintain the net margins. They can reduce material costs, shipping costs, hit a higher volume of customers etc. They can adjust gross margins slightly and maintain net margins.
It doesn't matter because all you need is a breakout box for full length cards. Most expansion cards seem to be half-length full height.
You mean if it had dual Xeons, a top of the line Quadro with 4GB of video memory, 4 full length PCIe 3 slots and manual display calibration? They might have just a little trouble fitting that all into an attractive AIO design. Even if they did, I'm sure you'd find something wrong with it like 'thermal issues' causing severe unusable throttling or a fractional brightness shift they haven't accounted for.
Quote:
Originally Posted by hmm
Apple has never really offered that many PCI slots. You'd either end up with multiple machines, a different brand, or a different solution somewhere in there. I'm not questioning your requirements, just whether you'd be able to fulfill them that way with a Mac Pro.
My Mac IIx had 5 slots. I usually used a Magma expander.
Quote:
Originally Posted by Marvin
It doesn't matter because all you need is a breakout box for full length cards. Most expansion cards seem to be half-length full height.
You mean if it had dual Xeons, a top of the line Quadro with 4GB of video memory, 4 full length PCIe 3 slots and manual display calibration? They might have just a little trouble fitting that all into an attractive AIO design. Even if they did, I'm sure you'd find something wrong with it like 'thermal issues' causing severe unusable throttling or a fractional brightness shift they haven't accounted for.
I didn't mean anything of the sort. The hostility is truly unnecessary. I enjoy these discussions when they remain discussions and don't disregard previously established points in their entirety. I suggested that the iMac could have picked up some of the "xMac" features, as opposed to a desire to morph the Mac Pro into one.
What I meant was possibly an extra storage bay. 2.5" storage seems to get better each year if 3.5" is too large. Some kind of hard drive access. We can't assume everyone will buy AppleCare, and swapping out a flaky hard drive shouldn't be such a big deal. You probably recall iFixit's removal of the adhesive. I think they were after a certain look and wanted to avoid seams, but that in itself is a choice, and in my opinion it's one of form over function. I didn't suggest morphing the iMac into a Mac Pro.
As for the topic of VRAM, it's possible to get that in the typical gaming variant too. Interestingly, Nvidia cut back on the optimizations for computation in their gaming GPUs this round, but the RAM would still help. If you're locked to one OEM, you do give up a certain amount of flexibility.
My point here was that if some of the smaller details regarding storage and service were a big deal to them, they could have designed these things into the iMac. Since you've referenced a few editing facilities, my guess is they would probably hook up a Thunderbolt external boot drive in the case of a dead drive if it's time-critical. That is what I would do. Sound reasonable?
Edit: I didn't mean they should use 4GB cards in the iMac, although it should be mostly 2GB in the 27", and 512MB is kind of ridiculous in the current year. My primary objection there is using video memory allocation to fine-tune costs.
Quote:
Originally Posted by Marvin
Once again facts are better than imagined problems. Apple does reject panels that don't meet an A grade. Cheaper brand manufacturers use their rejected A- panels:
http://www.pcper.com/reviews/Displays/Achieva-Shimian-27-Monitor-Review-1440p-IPS-Display-Under-350
"there was already a cult following for these Korean monitors on various forums throughout the internet. Quickly we found out that these monitors were using the "A-" stock of LG LM270WQ1 27" panels. While the current Dell Ultrasharp 27" U2711 displays use a different revision of the LG panel (LM270WQ2) this is in fact the same panel Apple uses in their 27" iMac and Cinema Display/Thunderbolt Display."
That isn't what I meant at all. Dell and the major brands don't use A- panels. The Dell version is CCFL-backlit. This was typically preferred, especially as colorimeters hadn't been updated a couple of years ago and LED hadn't been fully tested. Because of that, other brands were a little more conservative. I don't think it was cost-related, as many of the cheaper ones adopted LED early on. The cult following exists because, for general use, they aren't always that bad. LG has come a long way. The reason you don't really see other panel brands now is shrinking margins; Hitachi doesn't even make panels these days, and they pioneered IPS.
Quote:
They would be practical but you can always buy another brand for that level of control. Apple uses high grade, calibrated panels so when you buy them, you're getting a high quality display. I'd prefer that they provided 3 year warranties on their displays but manufacturers don't make money giving customers hardware that lasts a long time. If Apple's displays go bad after a period shorter than 3-4 years, it's a real problem, otherwise it's another imagined problem.
All of the companies we've discussed, including Dell, do basically the same thing: they calibrate them at the factory. This might not take place in $100-300 LCDs, but if you're buying in the price range we're discussing, it is calibrated. That doesn't tell you what the pass/fail specs are like, and it doesn't tell you how many points are validated; it's a very vague description where they didn't really tell you anything. It's actually important that the hardware values are set really well there, in spite of long-term drift. Part of the reason is that consumer devices don't have the same level of accuracy. It's kind of silly when people marvel at low Delta E values that are beyond the capability of the measuring device. It's like expecting a cheap ruler to be accurate to 1/100th of an inch; even the really expensive ones can't match factory-grade equipment. You know we've been over this before. I thought certain points were understood.
As far as longevity goes, from an anecdotal standpoint I haven't seen the kind of problems with recent Apple displays that I did with the older generations. If someone asks about one, I check places like Apple Discussions as well as some of the reviews. When complaints come up, the severity isn't the same as some of the older generations. I haven't seen the edges turn purple or colored flashing vertical lines.
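The ruler analogy maps directly onto how the simplest Delta E metric works: CIE76 is just Euclidean distance in Lab color space, so claiming a value well below the instrument's own error is meaningless. A minimal sketch (real workflows typically use the more elaborate CIEDE2000 formula):

```python
import math

def delta_e_76(lab1, lab2):
    """CIE76 Delta E: Euclidean distance between two Lab colors.
    A Delta E around 1 is roughly a just-noticeable difference,
    so figures far below the colorimeter's own error bars are noise."""
    return math.sqrt(sum((a - b) ** 2 for a, b in zip(lab1, lab2)))

# Hypothetical measurement: a near-neutral gray vs. its target value
measured = (50.2, 0.3, -0.1)   # L*, a*, b* as read by the instrument
target = (50.0, 0.0, 0.0)      # the calibration target
print(f"Delta E (CIE76): {delta_e_76(measured, target):.2f}")
```

If the instrument's repeatability is itself around 0.3-0.5 Delta E, a reading like the one above is indistinguishable from a perfect match.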
Quote:
Originally Posted by scottglasel
My Mac IIx had 5 slots. I usually used a Magma expander.
I couldn't think of anything since the early 2000s with more than 3. The Magma expander makes sense; that's obviously a solution right there, assuming it meets bandwidth requirements. Aside from the beefier power supply, I'm a little puzzled that it's much more expensive than equivalent Sonnet gear. I've never seen a review comparison.
I'd have preferred the storage to be easily accessible, or at least a smaller SSD-only option. I can understand why they did it: they'd want the storage to be secured and quiet, as well as only bought from them. Regular external backups should be made though, so it's not such a big deal. It would be quite hard to transplant a Fusion drive anyway, because the files are split between the SSD and HDD.
If they did that you'd be in the store for days looking through the test sheets to pick the best one though. Do any other manufacturers provide calibration data? You have to judge by the expected quality assurance of any brand.
What makes a $999 27" IPS LED-backlit Cinema Display a consumer display and a $1200 NEC PA271W a professional display? If they do the same job out of the box, you save money buying from Apple. Also, there's theoretical accuracy and practical accuracy: you can measure specs as much as you want, but you're never going to match colors on a display to every single possible surface an image is reproduced on. It'll always be an approximation.
It's good to go for the best available but people get into a mindset that the best available suddenly becomes the only option and anything else is unusable or just a toy for consumers.
People have issues with all sorts of displays though, you can rarely judge by a handful of reviews:
http://forums.dpreview.com/forums/post/33687891
http://forums.dpreview.com/forums/thread/3034160
Designers, photographers and other graphics artists shouldn't be expected to be display technicians. The designers at Apple use their own displays so you'd expect they are calibrated properly. As you say, factory grade calibration is far better than user calibration.
Quote:
Originally Posted by Marvin
Do any other manufacturers provide calibration data? You have to judge by the expected quality assurance of any brand.
What makes a $999 27" IPS LED backit Cinema display a consumer display and a $1200 NEC PA271W a professional display? If they do the same job out of the box, you save money buying from Apple. Also, there's theoretical accuracy and practical accuracy. You can measure specs as much as you want, you're never going to match colors on a display to every single possible surface an image is reproduced on, it'll always be an approximation.
NEC doesn't provide such a sheet. I haven't used them regularly in a while (my primary one isn't NEC at the moment), but I've worked with both at a couple of shops, so I have a bit of experience with each. CRTs were still better in a few ways even compared to the newest LCDs, but I hated the flicker, and this way most of the time it's one 16:10 display (similar height) for me as opposed to two 4:5 displays with one being a bit smaller. Some Eizos do actually come with a calibration sheet showing, per region, how far off it measured relative to their test target. I can probably find and scan the one from mine.
All of these things are approximate at some level. Most of the displays on the market started out designed for prepress workflow requirements and leveraged their marketing into video, web design, and whatever else from there. My point is they're not designed as broadcast displays; those tend to be extremely expensive even today.
Anyway, if we're talking about photographers, it would usually be about being able to match a print sufficiently under fixed lighting conditions. Beyond that, things like uniformity, reproduction of difficult colors, and how well it matches a typical gamma 2.2 at all brightness levels come to mind. Some displays start to look muddy at lower brightness levels, clip highlight detail, or crush shadows. They're bad things. Uniformity helps photographers and designers, as subtle changes can really lie to them. Obvious backlight bleed might be obvious, but subtle bleed tends to look like part of the image and definitely throws you off when viewing two versions side by side. Personally, I think use matters more than specs, as no test is really perfect.
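The gamma-tracking point is easy to illustrate numerically. A display whose tone response runs darker than the intended gamma 2.2 outputs much less luminance than it should at low signal levels, which is exactly the "crushed shadows" effect. A sketch with illustrative, not measured, values:

```python
# Illustrative comparison of an ideal gamma 2.2 tone response
# against a hypothetical display tracking too dark (gamma 2.6).

def encode(signal, gamma):
    """Map a normalized input signal (0..1) to relative luminance."""
    return signal ** gamma

ideal_gamma = 2.2
actual_gamma = 2.6   # hypothetical display, not measured data

for s in (0.1, 0.25, 0.5, 0.75):
    ideal = encode(s, ideal_gamma)
    actual = encode(s, actual_gamma)
    print(f"signal {s:.2f}: ideal {ideal:.4f}  actual {actual:.4f}")
# Near the bottom of the range the darker-tracking display emits a
# fraction of the intended luminance, so shadow detail gets crushed,
# while the error shrinks toward the top of the range.
```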
Quote:
It's good to go for the best available but people get into a mindset that the best available suddenly becomes the only option and anything else is unusable or just a toy for consumers.
I've never suggested that. In some cases (I'd emphasize some) I'd suggest one over a quad Mac Pro + Cinema Display; it depends on the requirements. I want to test an external Thunderbolt boot drive, as I feel one would be essential there as a backup boot. Obviously that adds slightly to the cost, but anyone rational totals up a complete estimate for each solution before making a decision. I still abhor the fixed height due to the lack of VESA mounting on the current one. A 27" might be just tall enough that I wouldn't want to raise it, but their design changes didn't bring any new functionality.
Quote:
People have issues with all sorts of displays though, you can rarely judge by a handful of reviews:
I had a strange string of issues with the first NEC I owned, which is how I know they are fast with replacements; their engineers wanted to look at the problem unit. I'd also mention the difference in pricing. Whenever I've suggested them more recently, it has always been cheaper than $1200; a couple of shops have them for $950. If I suggested one, I'd link one of those, though it might mean new ones are on the way this year.
Quote:
Designers, photographers and other graphics artists shouldn't be expected to be display technicians. The designers at Apple use their own displays so you'd expect they are calibrated properly. As you say, factory grade calibration is far better than user calibration.
That's the thing: the software doesn't treat you that way. We're not talking about X-Rite's i1 Proof or something of that sort. These things are basically automated and come with presets. They allow you to track display drift to a degree, keep the display as stable as possible, and make adjustments to hit a particular set of target values. NEC and Eizo both have hour counters built in, and both talk about some hardware-level drift compensation, yet everything drifts at some point. It's important that Apple and the rest set the values properly at the factory, yet these are still unstable devices by nature. They do not retain the same color, but the change is gradual; if you're looking at it regularly, you become somewhat accustomed unless it's way off. As they age, the changes tend to accelerate.
I don't just go through technical tests. The things should be easily viewable, and that doesn't mean as bright and high in contrast as possible. With fixed 8-bit data paths that would be annoying, as the smallest change you could make on screen would still be quite large (relatively speaking). Even without adequate controls, people try to tweak these things. ColorSync, for example, has a built-in calibration utility that tweaks the output values in a basic matrix profile per channel. Whenever anyone mentions it, I tell them to pretend it doesn't exist. It's the worst-designed thing ever, as its default is to overwrite the default profile, and most people don't know you can just delete that to restore the old one. The other problem is that profiles break really easily, and it provides extremely coarse controls with only visual judgement.
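To make the "coarse controls" complaint concrete, here is a sketch of what a basic per-channel calibration tweak amounts to: each output channel gets its own gamma-style adjustment, and with an 8-bit path the smallest possible step is a whole 1/255. The tweak values below are hypothetical, chosen only to illustrate the mechanism:

```python
# Sketch of a per-channel calibration tweak on an 8-bit data path.
# Gamma values are hypothetical, not from any real profile.

def apply_channel_gamma(value_8bit, gamma):
    """Apply a gamma tweak to one 8-bit channel value and requantize."""
    normalized = value_8bit / 255
    return round((normalized ** gamma) * 255)

# Hypothetical per-channel tweaks from an eyeball calibration
gammas = {"r": 1.0, "g": 1.05, "b": 0.95}

pixel = {"r": 128, "g": 128, "b": 128}  # mid gray in
out = {ch: apply_channel_gamma(v, gammas[ch]) for ch, v in pixel.items()}
print(out)  # channels shift by several whole 8-bit steps at once
```

Because the result is rounded back to 8 bits, even a tiny gamma change moves mid-gray by multiple counts per channel, which is why visual-judgement tweaking at this granularity is so crude.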
Anyway, I'd personally say it takes both for maximum usability. On the end-user side, the software should be accompanied by well-written documentation and hide anything that shouldn't be touched by the end user. For example, I prefer how ColorNavigator handles uniformity completely behind the scenes, as opposed to the ColorComp solution of on/off and levels in advanced user mode. Supposedly they made a lot of improvements to it with the PA series, but I haven't seen a really old one. My 90s-series has an insane number of hours on it. It was really good when it was newer; now it's just too old, which is why I replaced it. I still calibrate the thing and use it to hold references when I'm building textures.
Seeing as you're a wealth of references, what is your favorite book on C++, ideally one where the author doesn't have an infatuation with nested code?
Edit again: By the way, I personally buy from local resellers who agree to deal with the manufacturers if the need for warranty service arises.
A USB 3 drive would do the job. A 1TB would be about $80.
I don't think books are the best way to learn that. You'd be best getting a small program that works in Xcode and does something useful, modifying parts, and then just using a book as a backup reference when you run into trouble. As for which book, I don't know which might help the best. It often works out quicker Googling any specific compilation errors.
At this point it is hard to recommend C++ books, as the new standard changes things enough that you have to search the net for contemporary answers. If somebody is looking for a C++ intro, Accelerated C++ is a worthwhile investment, keeping the new emerging standard in mind.
Beyond that, I don't see the net as the best place to develop C++ skills. Books like the Effective C++ series are a great help in developing a budding programmer's skills. The problem is that not all of these accepted works have been updated for C++11, so it just isn't a good time to be recommending C++ books. Stack Overflow used to have a thread on recommended C++ texts, see http://stackoverflow.com/questions/388242/the-definitive-c-book-guide-and-list
There is nothing wrong with turning to the net for information on C++, but you often get mixed messages or poor advice. The net's usefulness is tied directly to one's ability to filter the crap from the gold. When you turn to widely accepted references, at least you are speaking a common language, or maybe better said, style.
Quote:
Originally Posted by Marvin
A USB 3 drive would do the job. A 1TB would be about $80.
I don't think books are the best way to learn that. You'd be best getting a small program that works in Xcode and does something useful, modifying parts, and then just using a book as a backup reference when you run into trouble. As for which book, I don't know which might help the best. It often works out quicker Googling any specific compilation errors.
Quote:
Originally Posted by wizard69
There is nothing wrong with turning to the net for information on C++, but you often get mixed messages or poor advice. The net's usefulness is tied directly to one's ability to filter the crap from the gold. When you turn to widely accepted references, at least you are speaking a common language, or maybe better said, style.
Thanks. That is a good idea. I had randomly come across one of Marvin's old threads on something C related (not C++, specifically C), which was why I tagged that in there. At the moment I'm sort of curious more than anything if a real product strategy is there. Numbers must be dismal at the moment.