I always thought we would hit a point, for most consumers and most applications, where speed would be satisfactory and the speed bump would stop driving the industry. A majority of the PC people who upgrade now do so thinking they need more speed, even though they are doing nothing more than surfing the internet with the same browser and OS they had in the beginning. For these nimrods, the computer's age is what is slowing it down. Something like senility. Which in a way is true, and it's caused by viruses and spyware. Maybe MS saw this coming and is counting on viruses to move new computers. Apple saw it coming and started heading in the direction of Core Image and Core Video. Change is exciting. Screw Moore and his law. Concerning the future (doing my dubya impression): "Bring it on." Apple probably has a better chance in a shaken industry than it does with the status quo.
Comments
Originally posted by Powerdoc
Yes.
The problem for manufacturers is that the turnover rate of computers will slow down. The number of units sold per year worldwide will decrease, and the market will slow down.
In order to survive and prosper, chip designers must make progress at any cost.
I disagree. Unit sales are already beginning to slow, and in the end, I don't think consumers will notice that clock speeds are not getting higher. When someone buys a machine with an AMD processor, chances are they don't even know what it clocks at.
I think the problem is that the average consumer doesn't need a 4+ GHz computer, and the 2 GHz machine they bought 2 years ago will do them fine for the next 3.
Originally posted by ipodandimac
If you take a look back at car ads, they started out saying "we have the biggest, fastest car," then "our car is more efficient," then "we're safer," and now "we have style, safety, and efficiency." I think in the computer world we're closing out the phase of "we're bigger and faster" (at least in terms of processors).
And if we are moving into the "we have style, safety, and efficiency" phase, we will see Apple win that battle hands down, unless some major changes take place at Microsoft and Dell.
Originally posted by the cool gut
I disagree. Unit sales are already beginning to slow, and in the end, I don't think consumers will notice that clock speeds are not getting higher. When someone buys a machine with an AMD processor, chances are they don't even know what it clocks at.
I think the problem is that the average consumer doesn't need a 4+ GHz computer, and the 2 GHz machine they bought 2 years ago will do them fine for the next 3.
For home use, you are certainly right, but for industry it will hurt. I felt the need to replace the Macs at my office 5 years after I bought them because, due to lazy programming, my medical software had become damn slow (the original version was dazzlingly fast). Even now, with a gigabit network and G5s all around, it's not so snappy. I hope that this software, built on 4D, will become snappier when they finally release the Mac OS X version.
I don't think it's normal to wait between 5 and 10 seconds to access a patient folder.
Moore's law has been popularly interpreted differently than as originally penned.
To most non-techies, Moore's law simply gets at the fact that computers continue to get faster at an astonishing pace. Sure, we know it actually speaks to transistor count, but both are actually quite interesting phenomena. To me, the popular interpretation seems the more pertinent of the two.
While growth in transistor count (and even CPU speed) has slowed a bit recently, overall system performance and capability are still improving quite rapidly. Multiprocessing, Quartz Extreme, faster buses, cheaper RAM, faster broadband, bigger screens, and distributed computing have all translated into tremendous speedups for computer users despite Moore's law faltering.
Fascinating times...
Originally posted by dfiler
Moore's law has been popularly interpreted differently than as originally penned.
To most non-techies, Moore's law simply gets at the fact that computers continue to get faster at an astonishing pace. Sure, we know it actually speaks to transistor count, but both are actually quite interesting phenomena. To me, the popular interpretation seems the more pertinent of the two.
While growth in transistor count (and even CPU speed) has slowed a bit recently, overall system performance and capability are still improving quite rapidly. Multiprocessing, Quartz Extreme, faster buses, cheaper RAM, faster broadband, bigger screens, and distributed computing have all translated into tremendous speedups for computer users despite Moore's law faltering.
Fascinating times...
That's why I wrote "extended Moore's law" in my quote: I was speaking of the popular interpretation of the law.
If my memory is correct, there was a second Moore's law: the cost of a fab plant will also increase in the same fashion.
Originally posted by Powerdoc
If my memory is correct, there was a second Moore's law: the cost of a fab plant will also increase in the same fashion.
That's Moore's original law. Moore's law is an observation about the advancing economics of fabrication processes. He basically found that a process shrink every year was the most profitable in terms of cost per integrated circuit, even though overall costs increased over time. Moore's paper is really the paper of an economist, not so much an engineer's.
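For what it's worth, the doubling arithmetic behind the popular reading is simple enough to sketch. The starting point below (the Intel 4004's roughly 2,300 transistors in 1971) is real; the two-year cadence is the commonly quoted approximation, not Moore's original one-year figure, so treat this as a back-of-the-envelope illustration rather than a quote from the paper:

```python
# Rough sketch of the popular reading of Moore's law:
# transistor counts doubling on a fixed cadence.
def projected_transistors(year, base_year=1971, base_count=2300, doubling_years=2):
    """Project a transistor count assuming one doubling per `doubling_years`."""
    doublings = (year - base_year) / doubling_years
    return base_count * 2 ** doublings

# Print projections for a few decades out from the 4004.
for y in (1971, 1981, 1991, 2001):
    print(y, round(projected_transistors(y)))
```

Even with the cadence fudged, the exponential is the point: a fixed doubling interval turns thousands of transistors into tens of millions within three decades.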
Originally posted by Telomar
That's Moore's original law. Moore's law is an observation about the advancing economics of fabrication processes. He basically found that a process shrink every year was the most profitable in terms of cost per integrated circuit, even though overall costs increased over time. Moore's paper is really the paper of an economist, not so much an engineer's.
Thanks