... and so Moto/Freescale looks prescient for de-emphasizing clock speed and working almost exclusively on power consumption for the last couple of years. All of a sudden they're sitting pretty at 1.5 GHz, with room to grow.
If communication has anything to do with impact, or with stirring up a response, then Nr9 wins hands down.
As to validity, well, his terse statements have gone from "so wrong" to "so right they're trivial". To quote Telomar: "you really haven't said much here that hasn't been pretty well known for quite some time"; that's after 4 pages of "Nr9 is soo wrong...".
Excellent subject tho, & some excellent info/responses from, among others, Telomar and Programmer. Thanx all; I find the "wall hitting" fascinating & think that IBM's inability to meet Jobs' 3 GHz prediction speaks volumes.
But then I've often been bemused by the vehemence of attacks on posters, & of course Motorola, who are perhaps now looking slightly clever?
Can I just bring something back to square one, for a sec... We are talking about Nr9's credibility, right? So let me ask Nr9 this: tell us one small nugget that you know to be forthcoming and true. Like, for example, when the new eMac is coming out and what its specs are... when an iPod rev is coming and what its specs are... let's see you get something right. The problem with your assertions is that you haven't yet proved you're an insider. If you wish to claim insider knowledge that the Power Mac G5 has peaked at 2.5 GHz and won't go any higher, you have to have a track record to run on, and right now you are the laughing stock of this board because your claim lacks credibility.
Comments
Originally posted by Henriok
In more news... Transmeta is preparing to launch a 90 nm Efficeon @ 2 GHz in 2005, manufactured by Fujitsu. They already deliver a 90 nm TM8800 at 1.6 GHz, so that's a 25% increase on the current fab, and a 54% increase from its former 1.3 GHz processors at 130 nm.
So... a rundown. What gains are we seeing when moving from 130 nm to 90 nm?
My figures might not be entirely correct, though.
Intel: 3.4 -> 3.6 = 6%
AMD: 2.4 -> 2.6 = 8%
IBM: 2 -> 2.5 = 25%
Freescale: 1.42 -> 1.5+ (1.8?) = 27%
Sun/TI: 1.2 -> 1.8 = 50%
Transmeta/Fujitsu: 1.3 -> 2 = 54%
Granted... Freescale and TI haven't delivered anything yet, but Fujitsu have, and they seem pretty optimistic. Someone asked for current evidence that the industry still sees some hope from any 90 nm fab. So far... I've given you three examples: Freescale, TI and Fujitsu.
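For anyone who wants to check the percentages, here's a quick back-of-the-envelope script with the figures taken straight from the rundown above (I've used the optimistic 1.8 GHz number for Freescale, matching the 27% quoted):

```python
# Clock-speed gains for the 130 nm -> 90 nm transition, as listed above.
transitions = {
    "Intel": (3.4, 3.6),
    "AMD": (2.4, 2.6),
    "IBM": (2.0, 2.5),
    "Freescale": (1.42, 1.8),
    "Sun/TI": (1.2, 1.8),
    "Transmeta/Fujitsu": (1.3, 2.0),
}

for vendor, (old_ghz, new_ghz) in transitions.items():
    gain = (new_ghz / old_ghz - 1) * 100
    print(f"{vendor}: {old_ghz} -> {new_ghz} GHz = {gain:.0f}%")
```

The rounded results match the list: 6%, 8%, 25%, 27%, 50%, 54%.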
Yeah, but look at the target speeds for the big jumps ... 2 GHz or less. What this says to me is that, depending on the processor design, trouble starts in the mid-2 GHz range and gets very rapidly worse. This is not a linear problem, and there is no single magical number that is a "wall" or "barrier". Different design tradeoffs result in different speed/power characteristics, but starting in the 2.5 - 3 GHz range there is a dramatic increase in the power/frequency curve. In a sense this is good news for the slow guys because they'll catch up now, but the fast guys are already looking for a way to stay at the head of the pack.
Several years ago the CPU designers were predicting the speeds they would be at a couple of years down the road. For a long time these predictions and plans have been quite accurate. In some cases they just haven't been public knowledge. The most notable exception I can think of is Motorola and the original G4 -- that's why they were such a laughing stock. In the last year, however, these plans and predictions have fallen flat on their faces, and now everybody is scrambling around trying to find alternatives and solutions.
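A rough way to see why the curve turns ugly: dynamic power scales roughly as C·V²·f, and since pushing frequency up generally requires raising voltage, power grows closer to the cube of frequency, with fast-rising leakage (a notorious problem at 90 nm) stacked on top. A toy model in Python; the constants here are invented purely for illustration, not vendor data:

```python
# Toy power model: dynamic power ~ C * V^2 * f, with voltage assumed to
# scale linearly with frequency, plus a rapidly growing leakage term.
# All constants are made up to show the shape of the curve, nothing more.

def power_watts(freq_ghz, c=10.0, v_per_ghz=0.4, leak0=2.0, leak_k=0.8):
    v = v_per_ghz * freq_ghz                     # V must rise with f
    dynamic = c * v**2 * freq_ghz                # ~ f^3 once V tracks f
    leakage = leak0 * 2 ** (leak_k * freq_ghz)   # leakage compounds too
    return dynamic + leakage

for f in (1.5, 2.0, 2.5, 3.0, 3.5):
    print(f"{f:.1f} GHz -> {power_watts(f):6.1f} W")
```

Even in this crude sketch, doubling the clock from 1.5 to 3 GHz more than quadruples the power, which is roughly the kind of superlinear blow-up being described above.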
Whodathunkit?
Nr9: quit, so we can get faster Macs!!!!!
Originally posted by Nr9
computers can't decide theories
im sorry you have to work in that kind of environment
i work in a research group and everyone around me knows what im talking about
Um... wrong.
Otter from Argonne National Labs is a top notch automated theorem prover that has successfully demonstrated proofs for no less than three previously unproven and *unknown* relationships in group theory.
The ATP research area is starting to really take off. Can they prove *anything*? No. Are they really freaking amazing little tools that can greatly assist human theoreticians? Oh heck yes. I use Otter almost daily.
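For the curious: Otter is a resolution-style prover. A minimal sketch of propositional resolution in Python (nothing like Otter's actual input language or first-order power, just the core idea of deriving the empty clause to show a set of clauses is unsatisfiable):

```python
# Toy propositional resolution refutation. Clauses are frozensets of
# literals: "p" is a positive literal, "-p" its negation. We saturate
# the clause set with the resolution rule; deriving the empty clause
# means the original set is unsatisfiable.

def negate(lit):
    return lit[1:] if lit.startswith("-") else "-" + lit

def resolve(c1, c2):
    """All resolvents of two clauses on complementary literals."""
    out = []
    for lit in c1:
        if negate(lit) in c2:
            out.append((c1 - {lit}) | (c2 - {negate(lit)}))
    return out

def unsatisfiable(clauses):
    clauses = {frozenset(c) for c in clauses}
    while True:
        new = set()
        for a in clauses:
            for b in clauses:
                if a == b:
                    continue
                for r in resolve(a, b):
                    if not r:
                        return True   # empty clause: contradiction found
                    new.add(frozenset(r))
        if new <= clauses:
            return False              # saturated without contradiction
        clauses |= new

# {p, p -> q, not q} is contradictory; drop the last clause and it isn't.
print(unsatisfiable([{"p"}, {"-p", "q"}, {"-q"}]))  # -> True
print(unsatisfiable([{"p"}, {"-p", "q"}]))          # -> False
```

Real provers like Otter add first-order unification, paramodulation for equality, and aggressive clause-selection heuristics on top of this basic loop.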
Originally posted by Kickaha
Um... wrong.
Otter from Argonne National Labs is a top notch automated theorem prover that has successfully demonstrated proofs for no less than three previously unproven and *unknown* relationships in group theory.
The ATP research area is starting to really take off. Can they prove *anything*? No. Are they really freaking amazing little tools that can greatly assist human theoreticians? Oh heck yes. I use Otter almost daily.
i am talking about the general case. of course computers can prove some theorems (including trivial ones). there is some stuff that computers can't do; number theory is undecidable by reduction from the halting problem (unless maybe there is hope for quantum computing to break from the Turing machine model)
if you say we don't need technical expertise anymore because of ATPs, i can also say we don't need english writers because there are programs that can write shit.
Chill.
Saying that communication skills are unneeded or less than important in the engineering fields is ludicrous. Everyone communicates, whether it is with your teammates, your managers, your students, or your colleagues. Everyone.
Maybe it's not *your* strength, I'd hazard, but some of us find it absolutely essential to the technical processes, including research and education.
Bottom line: without strong communication skills, both spoken and written, your career and your contributions will be severely curtailed. Period.
Only a highly narrow-minded engineer would claim that communication skills are redundant or a waste in technical fields. In actuality, they are absolutely essential... otherwise, you're just wasting your time doing mental masturbation on someone else's dollar.
And just to drive the point home... ever hear of Fred Brooks? If you're at IBM, try asking around. He was the lead designer on the System/360 project that turned IBM into the powerhouse of the 60s and 70s. (They *STILL* sell the bloody thing.)
He also founded the CS dept here at UNC.
Guess what one of his personal additions to the PhD requirements was? A little class called "Technical Communications", which he still personally teaches. Guess what you do? Learn to write essays, reports, whitepapers, articles, letters, and grants, and also give speeches, from notetaking to presentation styles, to what to do when the projector bulb burns out halfway through. He's thorough. He's also one of the most successful engineers, as measured by contributions to the state of the art, education, and the general field. Don't take my word for it, though: he won the Turing Award.
Clear communication is as essential in the technical fields as in literary ones, if not more so. There, you can fudge it and people think it's artistically deep. You don't get that leeway in technical communication.
Originally posted by Nr9
worrying about english gets in the way of doing the real technical stuff with the most efficiency possible. there is no free lunch. energy you spend constructing your sentences can be used for your technical engineering shit.
You are so wrong. I tried it and I learned the hard way. Anyone around February 2, 2004 would know.
Originally posted by Kickaha
Bottom line: without strong communication skills, both spoken and written, your career and your contributions will be severely curtailed. Period.
I can't stress enough how true this is in ANY engineering field. Without communication, engineering would cease to exist.
Nr9, if you graduated from ANY reputable school / program you would know this (if you truly are an engineer). The fact that you just said
worrying about english gets in the way of doing the real technical stuff with the most efficiency possible. there is no free lunch. energy you spend constructing your sentences can be used for your technical engineering shit.
made me lose ALL SHREDS of respect for, or belief in, any point you may have. If you admit you aren't in the engineering field and don't have an engineering-related degree, I'll start paying attention to you again. That statement alone shows you have been lying.
Originally posted by Nr9
computers can't decide theories
im sorry you have to work in that kind of environment
i work in a research group and everyone around me knows what im talking about
No, but if you're deciding theories for a computer you must spell out the assumptions and processes in a report so the work can be checked. Reproducibility is a must in research, as I'm sure you'd know, working in a research group. As an aside, though, I've been there and done that too, albeit for the worst six months of my life, and I'm guessing you don't get a great deal of funding.
Nobody with any engineering experience is going to side with you on this one. People who can't communicate never sell their ideas, and they certainly don't move out of the lower rungs. Engineering universities expect that you can get the technical merits right; when awarding marks they differentiate their students, in most cases, on how well they present their arguments and their methods of working to the final solution.
Technically minded engineers really are a dime a dozen, and most companies, especially IBM, look for better. As an aside, I've been with IBM, so I also know they value communication skills very highly. Whether it is publishing or reports, they are demanding. Excuse me if I scoff a bit at your claim that you're working for them.
I will give you one thing: the topics of processors no longer scaling and of shifts in the ideas driving performance are interesting ones, so I'm not sorry you posted. Anybody who thinks you have inside knowledge is kidding themselves, though. I actually really look forward to what the next year or two will bring, particularly technologies like Cell, which you must know all about, being at IBM.
my school is a prestigious university where a lot of the people in electrical engineering and computer science are from eastern europe, india, or far east asia, and everyone has piss-poor english communication skills.
there is only one technical communication course requirement out of a 4 year curriculum. anyone can fake having good english for one semester, or fake good english for an essay on an aptitude test.
design documents have very little prose and are mostly pseudocode and equations.
if you go to a top ranked graduate school and look at the computer science department, you can't understand what half of the people are saying. Most of the time it's the european accent.
Go read cutting edge papers about computer science. Chances are they are not in clear, coherent english. Hell, even read a graduate level textbook. It won't be in good english.
Look at: http://citeseer.ist.psu.edu/access.html
Most are in pretty bad english.
I think he knows nothing about the eMac or the iPod.
Let's talk about the "CELL".
Oh wait, Mr. 007 has signed an NDA, right?
I see!
What a bummer!
Come on! I'll make the first step:
C = third letter of the alphabet
Now it's your turn.
Surprise me with some inside information!
Originally posted by kwikfx
& of course Motorola. who are perhaps now looking slightly clever?
That is stretching it a bit, I think.
This isn't the right thread for a discussion on communications skills, so let's try to stay on topic.
One comment about Cell: one advantage of taking a (relatively) clean-slate approach is that the designer can simply omit things that conventional design has demonstrated are expensive and/or limiting. This is especially true for specialized cores like the vector cores likely to be found in the Cell chip. Perhaps the main core will run at a slower speed because it has full conventional functionality, but the vector cores will be able to run much faster because they carefully avoid bottlenecks and performance-gating design features. They've already decided to break software compatibility, so why not take maximum advantage of it? Intel can run Prescott's adder unit at double the base clock rate because it doesn't include certain elements that slow things down -- Cell could do something similar but on an even larger scale. In some sense IBM/Motorola/Apple designed AltiVec with the same concept in mind: those vector instructions were heavily influenced by what could be implemented with 1 instruction/clock throughput. The x86 ISA, on the other hand, is filled with overly complex instructions, so the first thing Intel and AMD do is bust them apart into pieces designed to be rapidly executable. Apparently one of the things omitted in the Cell is cache... its vector cores won't have any. As a result of this from-scratch approach, Cell may be able to deliver clock rates well in excess of what conventional designs have achieved (much like Prescott's 7+ GHz adder unit).
The notion that the speed limit is defined in part by the design seems to be lost on some. Consider, as an analogy, the top speed of ground cars. The fastest one in existence achieved just over Mach 1. Is it useful? Could they sell it to the mass market consumer? No. For practical applications the maximum speed of virtually all automobiles is around 150-200 mph. There are lots of factors (aerodynamics, horsepower, gearing, weight, etc.) that go into what a particular car's maximum speed is, but aside from the legal limit there is no hard-and-fast maximum. The fastest car isn't the "best" car for 99.9999+% of the drivers out there -- for most people other factors are far more important (gas mileage, carrying capacity, looks, cost, maintenance, reliability, brand name, comfort, etc.). If your definition of "fastest" is which vehicle can deliver the largest number of people to a destination in a fixed amount of time, then your 2-seater sports coupe is not going to be the fastest. How processors (and many other things) are engineered has many parallels to this.
Originally posted by Nr9
The latest news.
Power Mac will be stuck at roughly 2.5GHz for the next 3 years.
This is a known fact.
It's not a known fact, at least not here. IBM has yet to introduce any one of a number of low-k dielectrics in the 970. IBM has yet to introduce Strained Silicon Directly on Insulator. Both of these will allow significant speed increases in the current 970 architecture if they so desire.
What you're saying would mean that these technologies are not, and never will be, applicable to a 0.09µm design. Maybe, but I'm betting they will appear at 0.09µm and smaller.
Originally posted by Nr9
The next power mac are 2.5GHz multicore designs.
I say the next G5 will not be multicore, but will include a low-k dielectric and will either run cooler and/or faster at the same power consumption as the current 2.5 GHz G5.
Originally posted by Nr9
powerbook will use motorola embedded devices because they have a better future (all chip designers now are shifting from raw speed to power consumption as their main focus)
For laptops this is a safe bet; however, I'm not willing to categorically state that the PowerBook will retain a Motorola CPU. I flatly don't know.
Yes, eventually a wall will be reached, but not at 0.09µm. When it is reached hopefully other technologies will have been developed to continue increasing clock speeds.
Originally posted by rickag
. . . I say the next G5 will not be multicore, but will include a low-k dielectric and will either run cooler and/or faster at the same power consumption as the current 2.5 GHz G5. . .
That may be sticking your neck out, unless you have inside information. There is some evidence that IBM has a dual core 970MP in the works.