Originally posted by mooseman
I'm sure the whole point of future hardware is speculation. I might be simple and stupid, but I'm damn sure I know that future hardware is about speculation.
Obviously I'm not taking TS's rumors as gospel. I mean, that's just silly.
But keep apologizing for Apple's decision to under-deliver to their customers so they can maintain forced turnover.
Provide supporting evidence for this assertion.
It's also sad that you think it's OK for a thousand-dollar machine to have no optical drive AND a three-year-old GPU. If that seems reasonable to you, you need to get out more.
blah blah blah
You think it's reasonable for Apple to try and hit the "sweet spot" Fred Anderson talked about them missing by offering a $999 machine without an optical drive and with a three-year-old GPU?
blah blah blah
You think that is reasonable? That sounds "good"? It might be reasonable in a $699 box with a CD-R/W, PCI slots, and a GPU sitting in an AGP slot. It's not reasonable in a non-upgradeable AIO. Unless you think trashing perfectly good computers every year or two is reasonable just because you can't upgrade the GPU.
And... you're back to this non-argument. You seem to have this unsubstantiated belief that low-end terminals for labs and enterprise need to have the latest kickin'-ass graphics hardware, and that anything over a year old is just oh so useless. Been to a large company lately? How about an educational institution? Machines are regularly *several* years old. There is no 'forced upgrade cycle' here, except in your own mind, I'm sorry. Secretaries and accountants do not need Quake-capable GPUs. Neither do students using lab machines for general-purpose work.

There is a *definite* market for low-end machines with fast networking, a good CPU, a chunk of RAM, and a reasonable local hard drive. In such environments, an optical drive is a *liability*. Do you know how hard UNC had to fight to get a contract with IBM that would let them *remove* the optical drives from lab machines without voiding the warranty? See, they were ordered en masse via a state contract, so the BTO was... non-existent. No optical drive is a *plus* in these situations... provided you have strong infrastructure support. Which Mac OS X Server provides in spades. It's a solid solution, period.

The 5200 is *fine* in these environments. Hell, in 99% of cases it's overkill, regardless of how old it may be. In fact, the very point that a good deal of previously CPU-intensive work is being offloaded to a capable GPU means that these machines are now relatively more powerful than they would have been before... they can do more with a lesser CPU. Previously, if you wanted to boost image editing, you simply had to boost the CPU; now there's a choice. A reasonable CPU + reasonable GPU = low-end machine.
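To make that offload concrete, here's a minimal sketch of the kind of GPU-accelerated filtering Core Image does (written against the modern Swift API for illustration; the input path is made up). On a Core Image-capable GPU, the filter kernel runs on the graphics chip, not the CPU:

```swift
import CoreImage
import CoreImage.CIFilterBuiltins

// Core Image picks the renderer; on a capable GPU the filter kernel
// executes on the graphics hardware, leaving the CPU free.
let context = CIContext()  // GPU-backed by default where available

let url = URL(fileURLWithPath: "/tmp/input.png")  // hypothetical path
guard let input = CIImage(contentsOf: url) else { fatalError("no image") }

let blur = CIFilter.gaussianBlur()
blur.inputImage = input
blur.radius = 8

if let output = blur.outputImage,
   let rendered = context.createCGImage(output, from: input.extent) {
    print("Rendered \(rendered.width)x\(rendered.height)")
}
```

Same CPU, noticeably more throughput once the filter chain moves onto the GPU. That's the whole point.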
For this market, yes, a flat panel (reduced footprint), reasonable CPU, reasonable GPU, optical-*LESS* machine at $999 (less in bulk) with serious infrastructure support is a rather good deal, when you look at the entire package and stop focusing on spec A or spec B to the exclusion of the larger picture.
I'm sure you'll shift the focus of the argument once again, since you obviously can't respond with any logical defense. Ad homs are the last resort of the intellectually impotent.
I wasn't the one who brought up this supposition that Apple will just screw over the users with some heretofore-unknown technology that will, at a stroke, make their machines stop working and turn them into dead paperweights. That's just paranoid hand-waving, and you know it. As I stated before, please indicate, with any degree of factual basis, what technology you can see coming down the pipe that would make a CoreImage/Video-capable GPU suddenly useless. Come on, give it a whirl. CoreI/V was a gimme three years ago; it's nice to see them getting it out there... and the fact that it works on a chip as low-end as a 5200 is a stroke of genius. The technology itself wasn't unexpected. What *was* unexpected was how low-end the GPU could be and still take advantage of it. Nice bonus.
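And note: even if some future kernel did outgrow the 5200, Core Image is built to fall back to a CPU path rather than brick the machine. A quick illustrative sketch (the software-renderer option is a real CIContext option; which specific filters would actually trigger the fallback on a 5200 is my assumption):

```swift
import CoreImage

// Core Image can always fall back to its software (CPU) renderer; a GPU
// that can't run a given kernel degrades to "slower", not "dead paperweight".
// Forcing the CPU fallback explicitly, for comparison:
let cpuContext = CIContext(options: [.useSoftwareRenderer: true])

// Default behavior: use the GPU whenever the hardware can handle the kernel.
let gpuContext = CIContext(options: [.useSoftwareRenderer: false])
```

Graceful degradation is the opposite of a forced upgrade cycle.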
So give us a clue from your crystal ball: what specific technology do you see coming down the line, and on roughly what timeframe? See, without that, your argument regarding a 'forced upgrade cycle' rather dries up. If you *have* an answer, I'd love to hear it. I mean, what the hell, maybe you have years of graphics research under your belt too, and can see some neat trick that I'm missing. If not, well... just admit it, and we can end this idiotic pissing match.
In other words, please show the logic and facts behind your so-far-unsupported claims. You have mine; either reciprocate, or concede that your argument is lacking in substance.
See, this is what I get for peeking into a post of someone on my ignore list... learn from me kids! Don't peek! It's not worth it!