Originally Posted by sunilraman
The key to Intel's success in regaining the CPU crown is going down to 65nm and 45nm. That gave them the jump on everyone. "Hitting the wall at 90nm" was a pretty catastrophic scenario in CPU-land. IBM/Moto couldn't hack it; AMD managed it, and still produces some nice stuff at 90nm with decent clocks and thermal envelopes in its current range.
None of them "hacked" it, AMD least of all. AMD was so far behind everyone else in moving to 90nm (they only just finished the transition a little while ago) that they were able to take advantage of the solutions both IBM and Intel had already found.
AMD's thermals are pretty bad right now: up to 125 watts, right there with the old Intel chips and well above any of the new Core and Core 2 designs.
It was clear for a few years that IBM/Freescale would not be able to pull off 65nm in any reasonable amount of time to save Apple.
Both companies could have, if they wanted to. But neither did.
Aside from the Core microarchitecture and other chip-designer-y stuff, didn't they say that the way to keep Moore's Law going is to go down to 65nm and onwards to 45nm?
Beyond 45nm, I wonder what's on the horizon. And WTF happened to the promise of optical computing? Shuffling photons around could be much cooler (literally and figuratively).
Now we're getting into some VERY interesting stuff.
Each time they move to a smaller process node, they are going to encounter even greater thermal problems. It's a matter of physics. Intel is working on vertical transistors as a way of getting the same (or nearly the same) number of atoms into the gates. This is a tough road to travel. Other technologies are being worked on as well.
45nm will be attainable, but after that it's a crapshoot. The best figuring at this time is that the smallest they can go, with current thinking, is somewhere between 32 and 20nm, and that's even with better materials and designs.
They used to think it was about 15 to 10nm, but those thermal problems hadn't been considered. It was thought that the difficulty would be confined to making masks at that size, and that this would be the smallest they could go, even with the theoretical x-ray beam equipment that no one had any idea how to produce.
But now they know otherwise.
The leakage, and other problems that are even more daunting, increase geometrically, because at these sizes the cross-section of the lines (height x width) matters more than the width alone.
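To see why the cross-section is the killer, here's a back-of-envelope sketch (my own illustrative numbers, not from the post: bulk copper resistivity and an assumed 2:1 wire aspect ratio; real nanoscale wires are even worse due to surface scattering). A linear shrink by factor s cuts the wire cross-section by s^2, so resistance per unit length, and resistive heating for a given current, rises by 1/s^2:

```python
# Back-of-envelope scaling sketch -- illustrative assumptions, not a device model.
RHO_CU = 1.7e-8  # resistivity of bulk copper, ohm*m (nanoscale wires are worse)

def r_per_um(width_nm: float, height_nm: float) -> float:
    """Resistance per micrometer of a rectangular wire, in ohms."""
    area_m2 = (width_nm * 1e-9) * (height_nm * 1e-9)  # cross-section: width x height
    return RHO_CU * 1e-6 / area_m2                    # R = rho * L / A, with L = 1 um

# Assume wire height tracks width at a ~2:1 aspect ratio as nodes shrink.
for node in (90, 65, 45, 32):
    print(f"{node:>2}nm-class wire: ~{r_per_um(node, 2 * node):5.1f} ohm/um")
```

Halving the width roughly quadruples resistance per unit length, which is exactly the "square" effect: the area, not just the width, is what bites you.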
There have been some major breakthroughs in optical computing. Just recently, a chip was produced in the lab that contains hundreds of lasers. This has been done before, but it's the first time it has been done on silicon. The holy grail.
But it will take some time before this can be used for actual computing purposes, and it is more useful for transmitting information between chips than for doing actual computations.
For example, it will decrease the cost of bringing optical fiber the last mile, and even the last hundred feet.
It will also be instrumental in allowing supercomputers to increase their performance, and in providing extremely high-speed links between them.
Shades of Colossus: The Forbin Project.
[Side Geek Note] Apparently somewhere in Star Trek: The Next Generation they talk about the ship's computers: imagine that instead of electrons flying about you have photons or subatomic particles moving about, not only in real space (not fast enough) but in "subspace" (the standard term for anything faster-than-light in the Star Trek universe).
Star Trek was cute, wasn't it?