
Will Intel ever slow down their MHz race a bit?

post #1 of 43
Thread Starter 
First, I have to blame AMD for starting the MHz war: they were the first to break the GHz barrier, and that sent Intel crazy trying to catch up. And now Intel has won the MHz/GHz war back from AMD by quite a significant margin.

But on the other hand, Motorola and Apple have become the sorry victims.

Question is: will Intel slow down a bit after the intro of the 2.2GHz P4, so the perceived performance gap between the P4/K7/G4 stops widening as rapidly as it is now?

P.S. If I were going for a PC I would still go for AMD instead of that sickening Intel.

[ 01-07-2002: Message edited by: Leonis ]
Mac Pro 2.66, 5GB RAM, 250+120 HD, 23" Cinema Display
MacBook 1.83GHz, 2GB RAM
post #2 of 43
[quote]Originally posted by Leonis:
First, I have to blame AMD for starting the MHz war: they were the first to break the GHz barrier, and that sent Intel crazy trying to catch up. And now Intel has won the MHz/GHz war back from AMD by quite a significant margin.

But on the other hand, Motorola and Apple have become the sorry victims.

Question is: will Intel slow down a bit after the intro of the 2.2GHz P4, so the perceived performance gap between the P4/K7/G4 stops widening as rapidly as it is now?

P.S. If I were going for a PC I would still go for AMD instead of that sickening Intel.

[ 01-07-2002: Message edited by: Leonis ][/quote]

At the rate it's going, the G4/5 will be like Jonathan's M3 in my Dodge Caravan's rear-view mirror... a small dot...
Nov 98 - Earliest Registered User on record
Jan 02 - Earliest iPad prediction
post #3 of 43
I have this feeling that Apple is watching AMD's QuantiSpeed very carefully... and, being good friends, if it works (unlike the old PR-rating disaster) then Apple may license it.
post #4 of 43
AMD is facing some serious trouble now... the Northwood is a fantastic scaler and can beat the XP 2000+ in almost all tests.

The best feature of the Northwood is its fantastic overclocking: people can clock a 2GHz Northwood at 2.5GHz... with little temperature increase on the stock HSF.

A lot of the scalability is credited to the .13µ process on which the Northwood is based. AMD is moving off their .18µ process soon, so we should be seeing faster .13µ Athlons by April or so.
*Registered March 1, 1999*
Member #14
post #5 of 43
[quote]Originally posted by MacAddict:
The best feature of the Northwood is its fantastic overclocking: people can clock a 2GHz Northwood at 2.5GHz... with little temperature increase on the stock HSF.[/quote]

And with a liquid cooling setup people are already overclocking these to over 3GHz with ease!!!

Incredible.

And next quarter they're introducing it with a 533MHz system bus, at speeds around 2.5GHz...

The Power Mac G4 has ~1.1GB/sec of memory bandwidth. The Pentium 4 has 3.2GB/sec now, and soon 4.3GB/sec.
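Those numbers fall straight out of the 64-bit (8-byte-wide) front-side buses, if I have the specs right (back-of-the-envelope):

\[
\begin{aligned}
\text{G4, 133MHz MaxBus:} &\quad 133 \times 10^{6} \times 8\,\text{B} \approx 1.06\ \text{GB/s} \\
\text{P4, 400MT/s FSB:} &\quad 400 \times 10^{6} \times 8\,\text{B} = 3.2\ \text{GB/s} \\
\text{P4, 533MT/s FSB:} &\quad 533 \times 10^{6} \times 8\,\text{B} \approx 4.26\ \text{GB/s}
\end{aligned}
\]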
post #6 of 43
Some recent noises out of Intel imply that they are trying to back off the MHz focus as well. And well they should -- it prevents them from coming out with other designs at lower clock rates. Their IA-64 is a case in point. Even the P4 is increasing too fast for its memory subsystem.

Hopefully Apple's G5 will include either HyperTransport or RapidIO. That would let them ratchet up their memory speeds massively. Right now they are stuck at about 860MB/sec (realized), whereas HT can deliver 12GB/sec and RapidIO about 10GB/sec. Then the bottleneck is truly memory, which can be addressed separately from the bus bandwidth. Let's hope Epson is right, eh?
Providing grist for the rumour mill since 2001.
post #7 of 43
I don't see them stopping. I see them being limited more by other components. If there were more room for GHz, Intel and AMD would be fighting for it.

And, how DARE those companies try to make good products to show up their competitors! I "blame" them too... assholes..
art may imitate life, but life imitates tv.
post #8 of 43
Well, if you guys have read any of the recent reviews, it seems that with the arrival of the Northwood (512K L2 cache, .13 micron process) Pentium 4s, Intel is pulling slightly ahead of AMD in the performance race.

What's even more important is that, even though Intel hasn't officially announced them yet, Dell is already selling 2.2GHz Prestonia Xeons. For those of you who haven't been following, Xeon is the name Intel uses for the SMP-capable workstation version of the Pentium 4, and Prestonia is the codename for the 512K/.13 micron version of the Xeon. That's not all, though: the new Xeons also come with hyperthreading enabled, which can give them up to a 30% performance boost over regular Northwood P4s according to recent benchmarks.

In store for quarter two are Xeons up to 2.5GHz with a 533MHz front-side bus and 1066MHz RDRAM.

So no, I don't think Intel is slowing down the MHz race. In fact, not only is the P4 getting faster in MHz, it's also performing much better at the same MHz, thanks to more cache and hyperthreading, and more improvements to the bus speed and memory subsystem.
post #9 of 43
They will eventually have to slow down. The P4 is scalable, sure, but it also has its limits, and once these limits are reached, Intel will have to move over to a new chip, which will almost certainly also mean a new platform, i.e. IA-64. And so far IA-64 has been clocking only fairly low, and Intel has had serious trouble getting the speeds up. Also, the current chips (Merced/Itanium) perform really poorly compared to all the other 64-bit chips on the market.

The slower Intel advances, the more time they have to get a new platform up to speed to take over from the P4. Same goes for AMD.

One day they'll find they've shot themselves in the foot with the MHz race; it's already starting.

G-News
Matyoroy!
post #10 of 43
You never know, maybe Intel will bust out with a chip that scales immensely AND has the performance of a G4. If that happens, Apple (and AMD for that matter) are dead, although I doubt it would be Intel that designs such a chip; I'd sooner see it from AMD. What's RDRAM? Is that like Rambus? If so, 1066MHz! That's inane! (And insane.)
orange you just glad?
post #11 of 43
[quote]Originally posted by MacAddict:
The best feature of the Northwood is its fantastic overclocking: people can clock a 2GHz Northwood at 2.5GHz... with little temperature increase on the stock HSF.[/quote]

That is a 25% increase, not incredible. I run my G3 350 at 450 with no change to the heatsink, a 29% increase, and it will run faster; the external L2 cache won't, though.

Michael
Sorry, I can't talk now.
post #12 of 43
[quote]Originally posted by Nebrie:
I have this feeling that Apple is watching AMD's QuantiSpeed very carefully... and, being good friends, if it works (unlike the old PR-rating disaster) then Apple may license it.[/quote]

Apple and AMD are good friends? What makes you say this?

[quote]First, I have to blame AMD for starting the MHz war: they were the first to break the GHz barrier, and that sent Intel crazy trying to catch up. And now Intel has won the MHz/GHz war back from AMD by quite a significant margin.[/quote]

The war is for performance, not clock speed. In terms of overall performance, neither company is really ahead of the Moore's Law curve, the curve that all processor manufacturers (save Moto) have followed for the past 25 years. Clockspeed/performance will roughly double every 18-24 months; it always has, and it will continue to.
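Put as a formula, with \(P_0\) today's performance and \(T\) the doubling period (call it 18-24 months):

\[ P(t) = P_0 \cdot 2^{t/T} \]

At \(T = 18\) months that compounds to roughly 100x per decade.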

[quote]You never know, maybe Intel will bust out with a chip that scales immensely AND has the performance of a G4. If that happens, Apple (and AMD for that matter) are dead, although I doubt it would be Intel that designs such a chip; I'd sooner see it from AMD. What's RDRAM? Is that like Rambus? If so, 1066MHz! That's inane! (And insane.)[/quote]

Yes, RDRAM is Rambus, and 1066MHz is better but not fantastic. Their current RAM runs at 800MHz and yet performs equivalently to 266MHz DDR memory in most real-world applications. If Intel does release a chip that scales immensely and has better performance, odds are that competitors like AMD will also have a new design on the burner.

[quote]That is a 25% increase, not incredible. I run my G3 350 at 450 with no change to the heatsink, a 29% increase, and it will run faster; the external L2 cache won't, though.[/quote]

Which would you rather have, 25% of 2000MHz or 29% of 350MHz? Yeah, that's what I thought. Statistics can be presented to say almost anything you want, but a 2500MHz P4 is much more impressive than a 450MHz G3.
post #13 of 43
About the only bad thing about the Northwoods is that they are insanely expensive: about $630 for the 2.2GHz Northwood, and I think around $500 for a 2GHz.

Nonetheless, the performance is great. The Northwoods get over 300fps in Q3... :eek:
*Registered March 1, 1999*
Member #14
post #14 of 43
[quote]Originally posted by MacAddict:
Nonetheless, the performance is great. The Northwoods get over 300fps in Q3... :eek:[/quote]

Excuse me for being naive, but doesn't the video card have more effect on fps than the CPU? That 300fps figure sounds more like a bunch of baloney.

Besides, the Intel could be 20GHz for all I care, it still only runs Windows.
post #15 of 43
[quote]Originally posted by Kevin Hayes:
Excuse me for being naive, but doesn't the video card have more effect on fps than the CPU? That 300fps figure sounds more like a bunch of baloney.[/quote]

At low resolutions the CPU dominates the ability of a game like Quake to display FPS. At high resolutions framerate is determined by the video card. That 300FPS is actually quite real.
post #16 of 43
[quote]Originally posted by Eskimo:
At low resolutions the CPU dominates the ability of a game like Quake to display FPS. At high resolutions framerate is determined by the video card. That 300FPS is actually quite real.[/quote]

FPS that high are ridiculous. The human eye can't even tell the difference.
"It's not like Windows users don't have any power; I think they are happy with Windows, and that's an incredibly depressing thought." -Steve Jobs
post #17 of 43
[quote]Originally posted by Eskimo:
The war is for performance, not clock speed. In terms of overall performance, neither company is really ahead of the Moore's Law curve, the curve that all processor manufacturers (save Moto) have followed for the past 25 years. Clockspeed/performance will roughly double every 18-24 months; it always has, and it will continue to.[/quote]

"all processor manufacturers" should really be "AMD and Intel" here, no? I mean, it's not that only Motorola can't keep up the pace, I can't see anyone keeping up with those two at all (VIA, Transmeta, Sun, ...), nobody save for Intel and AMD is shipping any CPUs that are significantly above 1GHz, or am I missing something here?

Bye,
RazzFazz
post #18 of 43
True, but then again Moore's Law is in fact by no means a law, just a rule of thumb, extracted from an observation by a guy called Moore, who worked for Intel and had looked at the past evolution of THEIR CPU speeds and transistor counts. Moore's Law will be broken sooner rather than later. Our current chip tech isn't the last word; we'll see quantum chips, asynchronous designs and such, where Moore's Law will no longer be applicable without radical changes.

G-News
Matyoroy!
post #19 of 43
[quote]Originally posted by MacAddict:
Nonetheless, the performance is great. The Northwoods get over 300fps in Q3... :eek:[/quote]

Already when the first P4 came out (1.5GHz?), I read a review where they tweaked Q3 to ABOVE 400 fps... Try that with current PMs.
post #20 of 43
[quote]FPS that high are ridiculous. The human eye can't even tell the difference.[/quote]

<sigh> Some people just don't get it.

The human eye can detect differences up to around 70 fps or so. That's why you can tell the difference between a monitor with a 60 Hz refresh and one with an 85 Hz refresh: one gives you a headache, the other doesn't, because it's faster than the eye can detect.

Now, about FPS: those FPS measurements are averages. Depending on the amount of action on screen, the FPS may be much higher or much lower. Unfortunately, when the action is greatest is exactly when a gamer needs the display to be smoothest, so they can aim, shoot, and run accurately. But this is when the fps drop to their lowest.
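Here's a quick back-of-the-envelope sketch (the frame times are completely made up, purely to illustrate) of how a healthy average can hide ugly dips:

[code]
# Toy example: why the average fps hides the dips (numbers are invented).
frame_times_ms = [3, 3, 3, 3, 50, 3, 3, 50, 3, 3]  # two heavy-action frames

total_s = sum(frame_times_ms) / 1000.0
avg_fps = len(frame_times_ms) / total_s    # frames divided by elapsed time
worst_fps = 1000.0 / max(frame_times_ms)   # instantaneous rate of the slowest frame

print(f"average: {avg_fps:.0f} fps, worst case: {worst_fps:.0f} fps")
# -> average: 81 fps, worst case: 20 fps
[/code]

An 81 fps average sounds great, but those 20 fps moments land exactly in the firefights.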

For example, on my PMG4, OS X Quake 3 v1.31b3 gives me about 62 fps average using demo4. But demo4 is not a very challenging demo, because there aren't many wide open spaces. On some maps, even with NO action at all, the average frame rate drops to around 25-35, at which point the game becomes unplayable, because with action the rate drops to 10 fps or even lower.

But even on a map similar to demo4, where the average fps is still around 60, there are times when so much action is taking place that the fps drop to around 20. This drop gives gameplay a "choppy" feel, and player control becomes difficult.

I can change the game settings to get greater fps at the expense of less beautiful graphics, and almost every map is playable.

Hopefully I've illustrated the need for gaming computers to get insane average framerates. Up around 300 fps, one could have all the eye candy on, and even with lots of action, gameplay would be silky smooth.

Of course, the other reason to have such high framerates is that the computer would last longer. If you can get 300 fps average in Quake 3, then in Doom 3 you'll probably still get good performance. However, I think most people who buy such intense gaming rigs are the sort that don't keep the same hardware for more than a year at most... there is a phallic element to all of that fps stuff as well.

Anyways, a 60 fps average is fine for me, but I only play Quake occasionally. Someone who played it every day would need more power.

So Apple needs to get their butts in gear so they don't lose more sales to people who want to be able to play games. Apple will never get the hardcore gamers without a revolution, but they should at least make sure that they keep the Mac users who use their computers for work but also like to play games sometimes (the "casual" gamers).
post #21 of 43
[quote]Originally posted by RazzFazz:
"All processor manufacturers" should really be "AMD and Intel" here, no? I mean, it's not that only Motorola can't keep up the pace; I can't see anyone keeping up with those two at all (VIA, Transmeta, Sun, ...). Nobody save Intel and AMD is shipping any CPUs significantly above 1GHz, or am I missing something here?

Bye,
RazzFazz[/quote]

Notice I said clockspeed/performance. Sun, IBM, and others have been able to scale their performance quite well in the past and will continue to in the future.

[quote]True, but then again Moore's Law is in fact by no means a law, just a rule of thumb, extracted from an observation by a guy called Moore, who worked for Intel and had looked at the past evolution of THEIR CPU speeds and transistor counts. Moore's Law will be broken sooner rather than later. Our current chip tech isn't the last word; we'll see quantum chips, asynchronous designs and such, where Moore's Law will no longer be applicable without radical changes.[/quote]

Actually, the clockspeed/performance bit is a corollary to Moore's original law, which says that transistor density doubles every 18-24 months. But there is something to be said for a trendline that the entire industry has followed for basically the entirety of its existence. Perhaps in the future the technologies you mention will take on a greater role, but silicon technology will be with us for some years to come yet. People have been heralding the demise of silicon-based circuits for nearly as long as the industry has existed.
post #22 of 43
post #23 of 43
[quote]...t all the technical reasons are pure mumbo jumbo, the technical eye stuff is incorrect too.[/quote]

The technical reasons are "mumbo jumbo"? Maybe you can fill me in on what I said that was mumbo jumbo. After looking over my post, I see nothing inaccurate about the need for high framerates. More action = lower framerates. Average framerates do not tell the whole story.

I think it would be much more informative if along with average framerates, the min and max fps were given.

The "technical eye stuff" is NOT BS. A monitor refresh rate is measured in Hz, which is the number of times per second it is updated. At 60 Hz, most people get headaches and the displays seems to "flicker" to them. That's because the eye/brain can detect the updates. But over 70 Hz or so, the eye/brain begins to see the display as a continuous image, thus, no flicker, no headache.

There is also the argument about movies being displayed at a much lower fps, something like 24, but I can't remember for sure. However, the refresh-rate analogy is more applicable. I can say for certain that anyone with even moderate gaming experience can tell the difference between 30 fps and 70 fps. However, the difference between 70 fps and 110 fps is not so easy to see (I can't see it, and I doubt anyone else could either).

When you're done beating the family back in order, please post some gobbledygook of your own about frame rates; it will be fun to pick it apart and explain why you don't have any clue what you're talking about.

Later.
post #24 of 43
[quote]Originally posted by Junkyard Dawg:
The human eye can detect differences up to around 70 fps or so. That's why you can tell the difference between a monitor with a 60 Hz refresh and one with an 85 Hz refresh: one gives you a headache, the other doesn't, because it's faster than the eye can detect.[/quote]

Yes, but this is a completely different situation. The problem with slow-refresh CRTs is that they go black between two screens, and the resulting flicker is what causes headaches and such (note that LCDs don't flicker or cause headaches, even though they usually refresh at only 60Hz).
Low FPS in 3D games has a completely different effect: motion gets slow and unsteady, frames get skipped, etc. But FPS only measures the update rate, i.e. the screen remains completely unchanged between two frames.


[quote]<strong>However, I think most people who buy such intense gaming rigs are the sort that don't keep the same hardware for more than a year at most...there is phallic element to all of that fps stuff as well.
</strong><hr></blockquote>

Agreed.


[quote]<strong>but they should at least make sure that they keep the Mac users who use their computers for work but also like to play games sometimes (the "casual" gamers).</strong><hr></blockquote>

Well, you said yourself that 300 FPS are not necessary for those.

Bye,
RazzFazz
post #25 of 43
[quote]Originally posted by Junkyard Dawg:
The "technical eye stuff" is NOT BS. A monitor's refresh rate is measured in Hz, the number of times per second the screen is updated. At 60 Hz, most people get headaches and the display seems to "flicker" to them. That's because the eye/brain can detect the updates. But over 70 Hz or so, the eye/brain begins to see the display as a continuous image: no flicker, no headache.[/quote]

It's not your description of display flicker that's flawed, but the analogy you see between CRT refresh rates and FPS in 3D games. In 3D games, a lack of FPS means that movement gets choppy and sluggish. Slow CRT refresh rates are an entirely different story, because the noticeable effect there comes from the period of darkness between two screens. Note that you'll get headaches from a 60Hz screen even if there's no movement at all on it; a lack of FPS only ever matters when movement is involved.

Bye,
RazzFazz
post #26 of 43
I think the ceiling on Intel's current consumer CPU design is at 10GHz or somewhere in that area. When they'll reach that point, and how effective (or real) those cycles will be with regard to actual performance, is unknown.
post #27 of 43
post #28 of 43
I don't think the point of the 300fps thing is whether or not the human eye can see a difference between 60fps and 300fps. I think what's important is that the P4 is capable of pushing these framerates whereas the current G4 is NOT.

Also, if I buy a computer now I would like to be able to use it to play games coming out a year from now, not only games released over a year before the computer was made (I don't remember when Q3 came out)... Thus, if my computer runs Q3 at 300fps, I'll probably be able to play Quake 4 at an acceptable speed instead of having to buy a new computer...

That is what matters!
post #29 of 43
Well, dual-processor PowerMacs really are able to achieve very high fps.

The problem is that most Mac gamers don't know how to achieve very high results, unlike PC gamers, who heavily tweak config files and always turn off sound when benchmarking.

A popular config file for Quake 3 on the PC side is the Locki config file, which is often used to achieve very high fps. It works pretty well on the Mac side too.

Just look at the following page

<a href="http://www.xlr8yourmac.com/Graphics/geforce3/GeForce3vs2MXvsRadeon.html#update" target="_blank">http://www.xlr8yourmac.com/Graphics/geforce3/GeForce3vs2MXvsRadeon.html#update</a>

A dual-G4-533 PowerMac with a GeForce 3 (under MacOS X) seems to be able to achieve...
... 267.9 fps! (Impossible to achieve under MacOS 9, it seems.)
(67 fps at 1600x1200; the figure above is only valid at 640x480 and 800x600.)

Not far from your 300 mark.

Just imagine a dual-G4-800 with a faster GeForce3 Ti500. Or even better, an overclocked GeForce3 Ti500, since overclocking is popular on the PC side.

I bet upcoming PowerMacs are going to explode the 300 mark if they are G4+, and explode the 400 mark if they are G5.

Stop complaining.

Anyway, at low rez you can see that the processor matters a lot (fps doubles if you enable the dual-processor switch). The OS, and even more the driver, matter a lot too.
However, at high rez the second processor is unused and the bottleneck is the graphics card. Yes, even on a dual G4-533.

(well, unless I misread, maybe these figures were taken from a dual G4-800, but it doesn't matter much)

Bruno
post #30 of 43
post #31 of 43
Intel and AMD ain't slowin down buddy. That's why Macs need to get faster.

G5, or get IBM to design some PPC chip that absolutely does circles around anything the x86 can put out.
~Winner of the Official 2003 AppleInsider NCAA Men's Basketball Tournament Pool~
post #32 of 43
Some people also forget that Quake is an OpenGL benchmark.

And applications like Cinema 4D, LightWave and Maya are also dependent on OpenGL performance to a great extent.

So if a computer can pull 300 fps in Quake, it doesn't just mean that it can display 10 times as many frames as your eyes can read; it also means that it can probably give decent performance (maybe around 5-20 fps) on a Maya scene with a much higher polygon count. You'd be surprised at how many 3D artists routinely check out Quake benchmarks at Tom's Hardware to get an idea of the performance of the latest chips.
post #33 of 43
[quote]Originally posted by brunobl:
Well, dual-processor PowerMacs really are able to achieve very high fps.

The problem is that most Mac gamers don't know how to achieve very high results, unlike PC gamers, who heavily tweak config files and always turn off sound when benchmarking.

A popular config file for Quake 3 on the PC side is the Locki config file, which is often used to achieve very high fps. It works pretty well on the Mac side too.

Just look at the following page

<a href="http://www.xlr8yourmac.com/Graphics/geforce3/GeForce3vs2MXvsRadeon.html#update" target="_blank">http://www.xlr8yourmac.com/Graphics/geforce3/GeForce3vs2MXvsRadeon.html#update</a>

A dual-G4-533 PowerMac with a GeForce 3 (under MacOS X) seems to be able to achieve...
... 267.9 fps! (Impossible to achieve under MacOS 9, it seems.)
(67 fps at 1600x1200; the figure above is only valid at 640x480 and 800x600.)

Not far from your 300 mark.

Just imagine a dual-G4-800 with a faster GeForce3 Ti500. Or even better, an overclocked GeForce3 Ti500, since overclocking is popular on the PC side.

I bet upcoming PowerMacs are going to explode the 300 mark if they are G4+, and explode the 400 mark if they are G5.

Stop complaining.

Anyway, at low rez you can see that the processor matters a lot (fps doubles if you enable the dual-processor switch). The OS, and even more the driver, matter a lot too.
However, at high rez the second processor is unused and the bottleneck is the graphics card. Yes, even on a dual G4-533.

(well, unless I misread, maybe these figures were taken from a dual G4-800, but it doesn't matter much)

Bruno[/quote]
Hmmm, most sites do NOT use tweaked configs for benchmarking.
Take a look at this page (Q3 set to high quality):

<a href="http://firingsquad.gamers.com/hardware/northwood/page8.asp" target="_blank">http://firingsquad.gamers.com/hardware/northwood/page8.asp</a>

The 2GHz P4 is getting 300fps at 640x480 and 122fps at 1600x1200.
And that is with an UNTWEAKED Q3 running in HQ mode.

Macs aren't even close to PCs in Q3 performance...

[ 01-12-2002: Message edited by: koldolme ]
post #34 of 43
[quote]Originally posted by Leonis:
Question is: will Intel slow down a bit after the intro of the 2.2GHz P4, so the perceived performance gap between the P4/K7/G4 stops widening as rapidly as it is now?[/quote]

They already did -- they released the Itanium
Andrew Welch / el Presidente / Ambrosia Software, Inc.
Carpe Aqua -- Snapz Pro X 2.0.2 for OS X..... Your digital recording device -- WireTap Pro 1.1.0 for OS X
post #35 of 43
[quote]Originally posted by mmicist:
That is a 25% increase, not incredible. I run my G3 350 at 450 with no change to the heatsink, a 29% increase, and it will run faster; the external L2 cache won't, though.[/quote]

This brings up an important point about the clockspeed range the Pentium IV is headed into. The difference between the performance of a 3GHz processor and a 2GHz processor is the same, in relative terms, as the difference between a 1GHz processor and a 666MHz one. As you scale up, even an extra GHz doesn't gain you as much relative performance as you might think.
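Concretely:

\[ \frac{3\ \text{GHz}}{2\ \text{GHz}} = 1.5 = \frac{1\ \text{GHz}}{0.666\ \text{GHz}} \]

Both jumps are the same 50% relative gain, and each further GHz buys proportionally less: 2GHz to 3GHz is +50%, but 3GHz to 4GHz is only +33%.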

Additionally, the memory subsystems in computers these days are already a rather weak link. To really get that extra 1/3rd performance out of a chip -- to make it more than just a moniker to sell the machine -- you need some pretty serious memory bandwidth, which ain't cheap.

The reality is that as clockspeed rockets up, actual performance on real-world tasks doesn't match it -- and not just because of new chip architectures that follow marketing mandates to achieve higher clockspeeds in lieu of per-clock computation.

[ 01-13-2002: Message edited by: moki ]
Andrew Welch / el Presidente / Ambrosia Software, Inc.
Carpe Aqua -- Snapz Pro X 2.0.2 for OS X..... Your digital recording device -- WireTap Pro 1.1.0 for OS X
post #36 of 43
[quote]Originally posted by AirSluf:
If you have a frame rate that is an integer multiple of the actual screen update frequency, you avoid drawing the same exact frame more than it should be drawn. Example: 60Hz update rate and 90fps. The game draws frame 1 and it goes into the frame buffer and gets displayed. Meanwhile the game is drawing frame 2, but isn't finished when it is time for update number one, so the first frame is sent to the screen again. Frame 2 finishes and frame 3 is begun, and also finishes just in time to be used as update 2 to the monitor. End result: we dropped a computed frame in the middle and wasted those computations. The odder the multiples, the weirder and potentially choppier the results, despite "better numbers."[/quote]

While this is technically true, it doesn't really matter in real life because about the first thing all those hardcore gamers do is turn the "wait for vertical sync" option off.
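Still, if you do leave vsync on, AirSluf's effect is easy to see in a toy model (idealized numbers, fixed render times, purely illustrative):

[code]
# Toy model of rendering at 90fps onto a 60Hz display (idealized).
refresh_hz, render_fps = 60, 90
duration_s = 1.0

displayed = set()
for i in range(int(duration_s * refresh_hz)):
    t = i / refresh_hz
    # Each refresh shows the most recent frame fully rendered by time t.
    displayed.add(int(t * render_fps))

rendered = int(duration_s * render_fps)
print(f"rendered {rendered} frames, only {len(displayed)} distinct frames shown")
# -> rendered 90 frames, only 60 distinct frames shown
[/code]

A third of the rendered frames never reach the screen, which is exactly the wasted work he describes; turning vsync off trades that for tearing.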

Bye,
RazzFazz
post #37 of 43
[quote]Originally posted by AirSluf:
About 100 years of motion picture technology have determined that 24 fps is the best tradeoff for the conditions in a theater, and 50 years of TV show that 30fps is better when you use a CRT.[/quote]

Yes, motion picture technology gives you 24 pictures/s, but each one is shown twice!
pic 1
black mask
pic 1
black mask
pic 2
black mask
pic 2
black mask
and so on, so that's quasi 48 fps.
TV achieves similar results with its half-picture (interlacing) technique.
the eye is faster than the brain
post #38 of 43
post #39 of 43
post #40 of 43
[quote]Originally posted by Leonis:
First, I have to blame AMD for starting the MHz war: they were the first to break the GHz barrier, and that sent Intel crazy trying to catch up. And now Intel has won the MHz/GHz war back from AMD by quite a significant margin.

But on the other hand, Motorola and Apple have become the sorry victims.

Question is: will Intel slow down a bit after the intro of the 2.2GHz P4, so the perceived performance gap between the P4/K7/G4 stops widening as rapidly as it is now?

P.S. If I were going for a PC I would still go for AMD instead of that sickening Intel.

[ 01-07-2002: Message edited by: Leonis ][/quote]Why would Intel slow down? Do you think they care about Apple or the PowerPC? With the new 0.13 micron technology they're ready to go 3 gigs and higher! AMD is also ready to jack up their chips! The result: Apple gets left in the dust!!

<img src="graemlins/smokin.gif" border="0" alt="[Chilling]" /> <img src="graemlins/smokin.gif" border="0" alt="[Chilling]" />

[ 01-16-2002: Message edited by: rbald ]