People's understanding of Moore's law... five minutes on Wikipedia would have already given you the general idea of it.
I'm not surprised most of you people use Macs. It "just works" so no point frying your brains trying to understand something you fail to even get the basics of.
So how about enlightening us where we've gone wrong.
Check these out.
Quote:
Originally Posted by Commodification
If this were a car, would it be equivalent to installing two separate motors to increase performance, because the law of diminishing returns makes it cost-prohibitive to get a significant performance increase out of a single motor?
Quote:
Originally Posted by Commodification
If Moore's Law were still increasing at the same rate it did from 1990-2000, shouldn't we be near 16 GHz, or at least 8 cores (2 GHz each)?
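A quick back-of-the-envelope sketch of what that kind of extrapolation implies. The 1990 and 2000 clock speeds below are purely illustrative assumptions, not actual figures; the point is only that sustained doubling of clock speed would have put us far past anything shipping today.

```python
import math

# Back-of-the-envelope extrapolation of clock speed under sustained
# exponential growth. The starting figures below are illustrative
# assumptions, not measured data.

def doubling_period_years(f_start, f_end, years):
    """Years per doubling implied by growing from f_start to f_end over `years`."""
    return years / math.log2(f_end / f_start)

def extrapolate(f_now, years_ahead, period):
    """Project a frequency forward, assuming one doubling every `period` years."""
    return f_now * 2 ** (years_ahead / period)

# Assumed figures (GHz): roughly 0.033 GHz in 1990 and 1 GHz in 2000.
period = doubling_period_years(0.033, 1.0, 10)
print(f"Implied doubling period: {period:.1f} years")                      # ~2 years
print(f"Clock a decade later at that pace: {extrapolate(1.0, 10, period):.0f} GHz")
```

Clock speeds obviously did not keep doubling on that schedule, which is exactly why the gains moved to more cores instead.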
I often think the future of the net is some form of grid computing, where the internet itself becomes one giant wireless computer as all connected devices share a portion of their processing power. As America has gone from an 'ownership economy' to a 'credit economy', and is now becoming an 'access economy', I think this will significantly impact the type of computers we use and how we use them in the future.
Quote:
Originally Posted by Phong
Moore's law is more an economic law than a physical or computational one. It's about computational power at a given price. It may be that typical consumer-level computers are not getting better as quickly, but we're not spending as much on computers as we used to. That computing power is also being distributed, not just across multiple cores but on GPUs.
Anyway, integrated circuits aren't going to be the preferred method of computing forever. Once we reach their limit, we'll find something else. That's a long ways away, regardless.
2020 will be about the time 3D integrated circuits become standard and that will carry us into whatever new paradigm awaits in 2030 (optical or quantum computing).
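On the "computational power at a given price" reading, the trend is in the cost of a fixed amount of compute rather than in raw clock speed. A minimal sketch, with every number made up purely for illustration:

```python
# Moore's law read as an economic trend: the cost of a fixed amount of
# compute halves each period, rather than the clock speed doubling.
# Every figure here is an assumption chosen only for illustration.

def cost_per_gflop(initial_cost, years, halving_period=2.0):
    """Projected price of one GFLOP after `years`, assuming the price
    halves every `halving_period` years (an assumed rate)."""
    return initial_cost * 0.5 ** (years / halving_period)

# Starting from an assumed $100 per GFLOP:
for year in (0, 4, 8, 12):
    print(f"year {year:2d}: ${cost_per_gflop(100.0, year):8.2f} per GFLOP")
```

Framed that way, the law can keep holding even while single-core clock speeds stall, since the cheaper transistors show up as extra cores and GPU throughput instead.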
But don't clump us ALL into that. 90% of people couldn't care less about how computing works (and therefore don't know how it works), and that goes for both sides, Windows and Mac.
Xactly! Please do include the quoted moronic posts up front so that those of us fighting the good fight against the wave of lazy thought don't think you are with ... them.