Originally Posted by addabox
-- The "no performance increases due to heat constraints" thing is patently absurd. Performance per watt, which is practically a base metric for CPUs, is constantly improving. Are you saying we've reached the end of the road for that improvement? Because that's obviously and foolishly wrong. I mean, so wrong it's startling.
Yeah? Then why are there no quad-core laptops? Too hot.
-- Which leads you to the "no substantial software increases", which doesn't actually mean anything. Apple just released a new phone that adds video chat, HD video recording, on phone video editing, etc. Are you saying that that's it? No further advances? I don't even know how to respond to that.
Let's see you record a few hours of footage in HD, do a few conversions, add a few dissolves, wipes, fades, etc., and bundle the whole shebang to burn to a Blu-ray disc for distribution, all on the iPhone. Not.
Those features on the iPhone are good in name only. Heck, with enough money I could get a C-64 to do the same thing, if I wanted to wait 30 years for the result.
-- You're casually dismissing Apple's stuff as "niche" devices; sales numbers would indicate otherwise.
You misinterpreted "niche" as meaning low market share. By "niche" I mean a device with a set purpose, not a general-purpose tool like a computer. A PlayStation 3 is a "niche" 3D gaming device, for instance, despite Sony selling millions of them.
Meanwhile, back in the real world it looks like this: devices like the iPad and iPhone will continually get more powerful, adding functionality that used to require "real" computers. Such devices will be much easier to use and much more ubiquitous than desktops ever were, and it's the desktop devices that will be relegated to niche status, with only the most computationally intensive tasks requiring the big iron (just as mainframes continue to have a role in backend number crunching). This is just the movement of technology at this time, and it's not just Apple that knows it and is beginning the transition.
Not likely, not until the processor industry solves the heat-versus-performance issue, which so far it hasn't been able to do, which is why IBM left the market. People like you are just accepting lower-quality video editing on an iPhone as a substitute for the real thing.
Our entire idea of "computing" will gradually change as persistently connected mobile devices continue to proliferate. The idea that one must drag around a large and heavy device in order to "use a computer" will seem a quaint anachronism, if it doesn't already. And this transformation will in no way compromise, reduce, debase or cripple the average person's experience of or relationship to "computing", an idea you seem to be sort of fixated on. If I wish, I will still be able to purchase desktop-class hardware (which will also continue to get more powerful) in order to do things like high-end video editing, scientific visualization, 3D modeling, etc. Everybody wins!
Desktops don't have the heat issues of portables. That's why one can buy 8-core Mac Pros, and versions with 20, 40 and even 100 cores are on the way.
But try to get a quad-core MacBook Pro. None to be seen. Even the new 13" MacBook Pro uses last-generation processors compared to the new 15" and 17" models.
Brick wall. End of the road. Splat. Done. Dual-core A4s coming soon.