Comments
The technical reasons are "mumbo Jumbo"? Maybe you can fill me in on what I said that was mumbo jumbo. AFter looking over my post, I see nothing inaccurate about the need for high framerates. More action=lower framerates. Average framerates do not tell the whole story.
I think it would be much more informative if along with average framerates, the min and max fps were given.
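To make the min/max point concrete, here is a minimal sketch (the frame times are invented purely for illustration): the run below averages a little over 60 fps while its worst frame dips to 20 fps, and the average alone hides exactly the spikes you feel during heavy action.
[code]
# Minimal sketch: summarizing a benchmark run by more than its average.
# The frame times below are invented for illustration only.

frame_times_ms = [8, 9, 10, 11, 9, 35, 40, 9, 10, 8, 9, 50, 10, 9]

fps_per_frame = [1000.0 / t for t in frame_times_ms]

avg_fps = len(frame_times_ms) / (sum(frame_times_ms) / 1000.0)  # true average over the whole run
min_fps = min(fps_per_frame)  # worst moment: what you actually notice in a firefight
max_fps = max(fps_per_frame)  # best moment: mostly a bragging number

print(f"avg {avg_fps:.1f} fps, min {min_fps:.1f} fps, max {max_fps:.1f} fps")
[/code]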
The "technical eye stuff" is NOT BS. A monitor refresh rate is measured in Hz, which is the number of times per second it is updated. At 60 Hz, most people get headaches and the displays seems to "flicker" to them. That's because the eye/brain can detect the updates. But over 70 Hz or so, the eye/brain begins to see the display as a continuous image, thus, no flicker, no headache.
There is also the argument about movies being displayed at a much lower fps, something like 35 but I can't remember for sure. However, the refresh rate analogy is more applicable. I can say for certain that anyone with even moderate gaming experience can tell the difference between 30 fps and 70 fps. However, the difference between 70 fps and 110 fps is not so easy to see (I can't see it, and I doubt anyone else could either).
When you're done beating the family back in order, please post some gobbledygook of your own about frame rates; it will be fun to pick it apart and explain why you don't have any clue what you're talking about.
Later.
[quote]The human eye can detect differences up to around 70 fps or so. That's why you can tell the difference between a monitor with a 60 Hz refresh, and one with 85 Hz refresh. One gives a headache, the other doesn't, because it's faster than the human eye can detect.[/quote]
Yes, but this is a completely different situation. The problem with slow-refresh CRTs is that the screen goes dark between two refreshes, and the resulting flicker is what causes the headaches (note that LCDs don't flicker or cause headaches, even though they usually refresh at only 60 Hz).
Lower FPS in 3D games has a completely different effect: motion gets slow and unsteady, frames get skipped, and so on. FPS only measures how often the image content is updated; between two frames the screen simply keeps showing the previous image, with no dark period in between.
[quote]However, I think most people who buy such intense gaming rigs are the sort that don't keep the same hardware for more than a year at most... there is a phallic element to all of that fps stuff as well.[/quote]
Agreed.
[quote]but they should at least make sure that they keep the Mac users who use their computers for work but also like to play games sometimes (the "casual" gamers).[/quote]
Well, you said yourself that 300 FPS are not necessary for those.
Bye,
RazzFazz
[quote]The "technical eye stuff" is NOT BS. A monitor's refresh rate is measured in Hz, which is the number of times per second it is updated. At 60 Hz, most people get headaches and the display seems to "flicker" to them. That's because the eye/brain can detect the updates. But over 70 Hz or so, the eye/brain begins to see the display as a continuous image: no flicker, no headache.[/quote]
It's not your description of display flicker that's flawed, but the analogy you draw between CRT refresh rates and FPS in 3D games. In 3D games, a lack of FPS means that movement gets choppy and sluggish. Slow CRT refresh rates are an entirely different story, because the noticeable effect there comes from the period of darkness between two screens. Note that you'll get headaches with a 60 Hz screen even if there's no movement on it at all; a lack of FPS only ever matters when movement is involved.
Bye,
RazzFazz
I think the ceiling on Intel's current consumer CPU design is somewhere around 10 GHz. When they'll reach that point, and how effective (or real) those cycles will be in terms of actual performance, is unknown.
I don't think the point of the 300 fps thing is whether or not the human eye can see a difference between 60 fps and 300 fps. What's important is that the P4 is capable of pushing those framerates, whereas the current G4 is NOT.
Also, if I buy a computer now, I would like to be able to use it to play games coming out a year from now, not only games released over a year before the computer was made (I don't remember when Q3 came out)... Thus, if my computer runs Q3 at 300 fps, I'll probably be able to play Quake 4 at an acceptable speed instead of having to buy a new computer...
That is what matters!
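The headroom reasoning above is easy to put into rough numbers. A sketch, where the assumption that the next generation of games is about four times heavier per frame is mine, purely for illustration:
[code]
# Rough headroom estimate: if today's engine runs at current_fps and tomorrow's
# games are assumed to be cost_multiplier times heavier per frame, what is left?
# The 4x factor used below is an assumption for illustration, not a measurement.

def projected_fps(current_fps: float, cost_multiplier: float) -> float:
    """Naive scaling: frame time grows linearly with scene cost."""
    return current_fps / cost_multiplier

print(projected_fps(300.0, 4.0))  # 75.0: still very playable
print(projected_fps(90.0, 4.0))   # 22.5: borderline
[/code]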
Well, dual-processor PowerMacs really are able to achieve very high fps.
The problem is that most Mac gamers don't know how to achieve very high results, unlike PC gamers, who heavily tweak their config files and always turn off sound when benchmarking.
A popular config file for Quake 3 on the PC side is the Locki config file, which is often used to achieve very high fps. It works pretty well on the Mac side too.
Just look at the following page
<a href="http://www.xlr8yourmac.com/Graphics/geforce3/GeForce3vs2MXvsRadeon.html#update" target="_blank">http://www.xlr8yourmac.com/Graphics/geforce3/GeForce3vs2MXvsRadeon.html#update</a>
A dual-G4/533 PowerMac with a GeForce 3 (under Mac OS X) seems to be able to achieve...
... 267.9 fps! (apparently impossible to achieve under Mac OS 9)
(67 fps at 1600x1200; the figure above only holds at 640x480 and 800x600)
Not far from your 300 mark.
Just imagine a dual-G4/800 with a faster GeForce3 Ti 500, or, even better, an overclocked GeForce3 Ti 500, since overclocking is so popular on the PC side.
I bet upcoming PowerMacs are going to blow past the 300 mark if they are G4+, and past the 400 mark if they are G5.
Stop complaining.
Anyway, at low resolutions you can see that the processor matters a lot (fps doubles if you enable the dual-processor switch). The OS, and even more so the driver, matter a lot too.
However, at high resolutions the second processor goes unused, and the bottleneck is the graphics card. Yes, even on a dual G4/533.
(well, unless I misread and these figures were taken from a dual G4/800, but it doesn't matter much)
Bruno
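Bruno's low-resolution versus high-resolution comparison is the usual way to tell whether a system is CPU-limited or graphics-limited. A small sketch of that reasoning, using the figures quoted above; the 1.5x threshold is an arbitrary choice of mine:
[code]
# Sketch: use low-res vs high-res scores to guess where the bottleneck sits.
# If dropping the resolution helps a lot, the card's fill rate limits the high-res
# score; if it barely helps, the CPU (or the driver) is the limit everywhere.

def bottleneck(fps_low_res: float, fps_high_res: float, threshold: float = 1.5) -> str:
    ratio = fps_low_res / fps_high_res
    return "graphics-limited at high res" if ratio > threshold else "CPU/driver-limited"

print(bottleneck(267.9, 67.0))  # dual G4/533 + GeForce 3: graphics-limited at 1600x1200
[/code]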
G5, or get IBM to design some PPC chip that absolutely does circles around anything the x86 can put out.
Some people also forget that Quake is an OpenGL benchmark.
And applications like Cinema 4D, LightWave and Maya are also dependent on OpenGL performance to a great extent.
So if a computer can pull 300 fps in Quake, it doesn't just mean that it can display 10 times as many frames as your eyes can read; it also means that it can probably give decent performance (maybe around 5-20 fps) on a Maya scene with a much higher polygon count. You'd be surprised at how many 3D artists routinely check out Quake benchmarks at Tom's Hardware to get an idea of the performance of the latest chips.
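The Quake-to-Maya extrapolation is just a throughput argument. A crude sketch; both polygon counts are invented for illustration, and real scaling is nowhere near this linear (shaders, state changes and driver overhead all get in the way):
[code]
# Crude throughput extrapolation from a game benchmark to a heavier 3D viewport.

quake_fps = 300.0
quake_polys_per_frame = 10_000   # assumed scene complexity for the game
maya_scene_polys = 400_000       # assumed complexity of the artist's scene

polys_per_second = quake_fps * quake_polys_per_frame
estimated_viewport_fps = polys_per_second / maya_scene_polys

print(f"~{estimated_viewport_fps:.0f} fps in the heavy scene")  # ~8 fps, inside the 5-20 range above
[/code]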
[quote]Well, dual-processor PowerMacs really are able to achieve very high fps.
The problem is that most Mac gamers don't know how to achieve very high results, unlike PC gamers, who heavily tweak their config files and always turn off sound when benchmarking.
A popular config file for Quake 3 on the PC side is the Locki config file, which is often used to achieve very high fps. It works pretty well on the Mac side too.
Just look at the following page
http://www.xlr8yourmac.com/Graphics/geforce3/GeForce3vs2MXvsRadeon.html#update
A dual-G4/533 PowerMac with a GeForce 3 (under Mac OS X) seems to be able to achieve...
... 267.9 fps! (apparently impossible to achieve under Mac OS 9)
(67 fps at 1600x1200; the figure above only holds at 640x480 and 800x600)
Not far from your 300 mark.
Just imagine a dual-G4/800 with a faster GeForce3 Ti 500, or, even better, an overclocked GeForce3 Ti 500, since overclocking is so popular on the PC side.
I bet upcoming PowerMacs are going to blow past the 300 mark if they are G4+, and past the 400 mark if they are G5.
Stop complaining.
Anyway, at low resolutions you can see that the processor matters a lot (fps doubles if you enable the dual-processor switch). The OS, and even more so the driver, matter a lot too.
However, at high resolutions the second processor goes unused, and the bottleneck is the graphics card. Yes, even on a dual G4/533.
(well, unless I misread and these figures were taken from a dual G4/800, but it doesn't matter much)
Bruno[/quote]
Hmmm, most sites do NOT use tweaked configs for benchmarking.
Take a look at this page (Q3 settings set to High Quality):
<a href="http://firingsquad.gamers.com/hardware/northwood/page8.asp" target="_blank">http://firingsquad.gamers.com/hardware/northwood/page8.asp</a>
The 2 GHz P4 is getting 300 fps at 640x480 and 122 fps at 1600x1200.
And that is with an UNTWEAKED Q3 running in High Quality mode.
Macs aren't even close to PCs in Q3 performance...
[ 01-12-2002: Message edited by: koldolme ]
[quote]Question is: will Intel slow down a bit after the intro of the 2.2 GHz P4, so the 'perceived' performance gap between the P4/K7/G4 won't keep widening as rapidly as it is now?[/quote]
They already did -- they released the Itanium
[quote]This is a 25% increase, not incredible. I run my G3 350 at 450 with no change to the heatsink (a 29% increase), and it will run faster; the external L2 cache won't, though.[/quote]
This brings up an important point about the clock speed range that the Pentium 4 is headed into. The difference in performance between a 3 GHz processor and a 2 GHz processor is, in relative terms, the same as the difference between a 1 GHz processor and a 666 MHz one. As you scale up, even an extra GHz doesn't gain you as much relative performance as you might think.
Additionally, the memory subsystems in computers these days are already a rather weak link. To really get that extra third of performance out of a chip -- to make it more than just a moniker to sell the machine -- you need some pretty serious memory bandwidth, which ain't cheap.
The reality is that as clock speed rockets up, actual performance in real-world tasks doesn't match it -- and not just because of new chip architectures that follow marketing mandates to achieve higher clock speeds in lieu of per-clock computation.
[ 01-13-2002: Message edited by: moki ]
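moki's scaling point is quick to check with arithmetic; the only assumption here (a generous one) is that performance tracks clock speed linearly:
[code]
# The relative gain from an extra chunk of clock speed shrinks as the baseline rises,
# even under the generous assumption that performance scales linearly with clock.

def relative_gain(base_ghz: float, new_ghz: float) -> float:
    return (new_ghz - base_ghz) / base_ghz * 100.0

print(f"0.666 -> 1.0 GHz: +{relative_gain(0.666, 1.0):.0f}%")  # about +50%
print(f"2.0 -> 3.0 GHz:   +{relative_gain(2.0, 3.0):.0f}%")    # +50%: the same relative step
print(f"2.0 -> 2.2 GHz:   +{relative_gain(2.0, 2.2):.0f}%")    # +10%: 200 MHz buys much less up here
[/code]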
[quote]If you have a frame rate that is an integer multiple of the actual screen update frequency, you avoid drawing the exact same frame more often than it should be drawn. Example: 60 Hz update rate and 90 fps. The game draws frame 1, it goes into the frame buffer, and it gets displayed. Meanwhile the game is drawing frame 2, but it isn't finished when it's time for update number one, so the first frame is sent to the screen again. Frame 2 finishes, frame 3 is begun and also finishes just in time to be used as update 2 to the monitor. The end result is that we dropped a computed frame in the middle and wasted those computations. The odder the multiples, the weirder and potentially choppier the results, despite "better numbers."[/quote]
While this is technically true, it doesn't really matter in real life because about the first thing all those hardcore gamers do is turn the "wait for vertical sync" option off.
Bye,
RazzFazz
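The pacing effect described in the quoted post is easy to simulate. A minimal sketch with idealized timing: the renderer runs freely at 90 fps and each 60 Hz refresh picks up the newest finished frame, which is the situation the post describes:
[code]
# A 60 Hz display fed by a game rendering at a steady 90 fps.

REFRESH_HZ = 60.0
RENDER_FPS = 90.0

refresh_times = [k / REFRESH_HZ for k in range(12)]          # first 12 screen updates
frame_done_times = [i / RENDER_FPS for i in range(1, 20)]    # when frame i finishes rendering

shown = []
for t in refresh_times:
    # The display shows the newest frame finished by this refresh (tiny tolerance for float rounding).
    ready = [i for i, done in enumerate(frame_done_times, start=1) if done <= t + 1e-9]
    shown.append(ready[-1] if ready else 0)

print(shown)
# [0, 1, 3, 4, 6, 7, 9, 10, 12, 13, 15, 16]: frames 2, 5, 8, ... are computed but never
# displayed, and the on-screen step alternates between 1 and 2 frames per refresh,
# which is exactly the uneven pacing the quote describes.
[/code]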
[quote]About 100 years of motion picture technology have determined that 24 fps is the best tradeoff for the conditions in a theater, and 50 years of TV show that 30 fps is better when you use a CRT.[/quote]
Yes, motion picture technology gives you 24 pictures per second, but each picture is shown twice!
pic 1
black mask
pic 1
black mask
pic 2
black mask
pic 2
black mask
and so on, so that's effectively 48 flashes per second.
TV achieves a similar result with interlacing: each picture is split into two half-pictures (fields), so the screen is refreshed twice per picture.
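That is the whole distinction the thread keeps circling: flicker depends on how often the screen lights up, motion smoothness on how often the picture actually changes. In numbers (the two-blade projector shutter and the interlaced field rate are the standard figures):
[code]
# Film and TV raise the flash rate without raising the number of unique pictures.

film_pictures_per_s = 24
film_flashes_per_s = film_pictures_per_s * 2   # each frame projected twice (two-blade shutter)

tv_pictures_per_s = 30
tv_fields_per_s = tv_pictures_per_s * 2        # interlacing: two half-pictures per frame

print(film_flashes_per_s, tv_fields_per_s)     # 48 and 60: enough to hide flicker, while the motion
                                               # itself still updates only 24 or 30 times a second
[/code]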
the eye is faster than the brain
[quote]First, I have to blame AMD for starting the MHz war by being the first to break the GHz barrier, which sent Intel into a frenzy of catching up. And now Intel has won the MHz/GHz war back from AMD by quite a significant margin.
But on the other hand, Motorola/Apple have become a sorry victim.
Question is: will Intel slow down a bit after the intro of the 2.2 GHz P4, so the 'perceived' performance gap between the P4/K7/G4 won't keep widening as rapidly as it is now?
P.S. If I were going for a PC, I would still go for AMD instead of that sickening Intel.
[ 01-07-2002: Message edited by: Leonis ][/quote]
Why would Intel slow down? Do you think they care about Apple or the PowerPC? With the new 0.13 micron technology they're ready to go to 3 GHz and higher! AMD is also ready to jack up their chips! The result: Apple gets left in the dust!!
<img src="graemlins/smokin.gif" border="0" alt="[Chilling]" /> <img src="graemlins/smokin.gif" border="0" alt="[Chilling]" />
[ 01-16-2002: Message edited by: rbald ]
[quote]Why would Intel slow down? Do you think they care about Apple or the PowerPC? With the new 0.13 micron technology they're ready to go to 3 GHz and higher! The result: Apple and AMD get left in the dust!![/quote]
Go masturbate to your 'Intel Inside' Bill Gates' ass poster, troll.