R300 to be officially unveiled July 18...


Comments

  • Reply 21 of 39
    Regarding tearing and perceptual limits, know that you are in deep, deep waters here. The human eye has at least three layers of perception. One is our conscious chroma/luminance awareness of a relatively static scene. There is a more primitive awareness of the sizes and volumes of things nearby, which happens to be accurate to the millimeter, but only on the one item a human is concentrating on. If there are peripherals of importance, the distances grow hazy. Third, there is a level of perception of certain simple shapes that are important in nature. A vertical rectangle is critical, because it's another person, an animal face-on, or a tree. A circle is an eye. This is why when you see a happy-face sticker you can't help but see it as a happy face. Trying not to will have varying amounts of temporary success, but it's a big effort, and the happy face just pops up again when you're done.



    So when you play Quake, these second and third layers are going crazy identifying enemies and patterns, while the slowest layer that we usually call sight is taxed just to be sure which room you're in and which wall of it you're facing.



    In that situation, your second layer tells you that a person is in front of you, moving fast to the left. Games are good enough now that it tells you it's a REAL person, not just a picture of one. Your third layer identifies icons around the room, like Swiss army kits with a logo that was chosen because of its strong impact on the pattern section. Your first layer gives a general nod to how colourful everything is. If, in that situation, you have time to note that seven times a second the random pixels of the stone textures are four pixels over on one line out of a thousand, then you're not playing against worthy competition.



    Games are about engaging instincts, not pleasing art critics.
  • Reply 22 of 39
    programmer Posts: 3,458 member
    [quote]Originally posted by AllenChristopher:

    <strong>Regarding tearing and perceptual limits, know that you are in deep, deep waters here...</strong><hr></blockquote>





    Yeah yeah, I know all of that. Rather than getting into all the theory (there's way too much), suffice it to say that at around 60 Hz most people can tell the difference between a frame rate which floats around and one which is sync'd to the display's refresh rate. A game running near the display's refresh rate looks better if it is actually sync'd to whatever that rate is. Beyond a certain point it's not going to matter, but that point is higher than 60 Hz.
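
    For anyone who wants to try the comparison Programmer describes, here is a minimal sketch (it assumes SDL2 with an OpenGL context, which is my choice of tools, not anything mentioned in the thread): pressing 'v' toggles vsync, so you can flip between a free-running frame rate and one locked to the display's refresh.
    [code]
    // Sketch: toggle vsync at runtime to compare a free-running frame rate
    // with one locked to the display's refresh. Assumes SDL2 + OpenGL.
    #include <SDL.h>
    #include <cstdio>

    int main(int, char**) {
        SDL_Init(SDL_INIT_VIDEO);
        SDL_Window* win = SDL_CreateWindow("vsync test",
                                           SDL_WINDOWPOS_CENTERED, SDL_WINDOWPOS_CENTERED,
                                           640, 480, SDL_WINDOW_OPENGL);
        SDL_GLContext ctx = SDL_GL_CreateContext(win);

        int vsync = 1;
        SDL_GL_SetSwapInterval(vsync);              // 1 = wait for vertical retrace, 0 = don't

        bool running = true;
        while (running) {
            SDL_Event e;
            while (SDL_PollEvent(&e)) {
                if (e.type == SDL_QUIT) running = false;
                if (e.type == SDL_KEYDOWN && e.key.keysym.sym == SDLK_v) {
                    vsync = !vsync;
                    SDL_GL_SetSwapInterval(vsync);
                    std::printf("vsync %s\n", vsync ? "on" : "off");
                }
            }
            // ... draw the scene here ...
            SDL_GL_SwapWindow(win);                 // with vsync on, this blocks until retrace
        }
        SDL_GL_DeleteContext(ctx);
        SDL_DestroyWindow(win);
        SDL_Quit();
        return 0;
    }
    [/code]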
  • Reply 23 of 39
    steves Posts: 108 member
    [quote]Originally posted by Programmer:

    <strong>





    Yeah yeah, I know all of that. Rather than getting into all the theory (there's way too much), suffice it to say that at around 60 Hz most people can tell the difference between a frame rate which floats around and one which is sync'd to the display's refresh rate. A game running near the display's refresh rate looks better if it is actually sync'd to whatever that rate is. Beyond a certain point it's not going to matter, but that point is higher than 60 Hz.</strong><hr></blockquote>



    I agree there is a point beyond which additional frame rate just doesn't matter. I'm not sure I agree on what that magical number is. People throw out magic numbers like 24, 30, 60, etc. as if they were established fact. The point is, I've seen this discussion / debate many times in many forums. I'm not sure if you're aware of where these numbers come from, so I'll give you a brief description, just in case.



    60fps - Some claim 60fps is necessary because that's what television is. The FACT of the matter is that 60 Hz is an arbitrary number that was chosen because it matched the frequency of the US electrical power standard (60 Hz).



    30fps - In reality, the TV signal is interlaced, so only 30 complete frames are actually displayed per second.



    24fps - This is the speed of motion pictures. Actually, each frame is "flashed" two or three times by the projector shutter to avoid flicker, but only 24 unique frames are displayed per second.



    However, most would agree that little benefit (if any) comes from more than 30fps. Specifically, I'm talking about a steady 30fps, not an "average" of 30fps.



    Of course, this brings us to the next point. People talk about average fps in games; however, a much more important number (though less of a bragging right) is the "minimum" frame rate during heavy action. When it actually comes down to it, people claim they require 60+ fps not because they need 60fps to play the game, but because a higher average keeps the minimum from dropping too low.



    Then again, it's always funny to listen to the weenies that say they need 200+ fps, even though no monitor has a refresh rate that high (so the extra frames are never even displayed), and they're often still using PS/2 mice with significantly lower sample rates than USB mice, etc, etc...



    Like I said, this has been discussed many times in many forums. I've heard many people throw out all sorts of bogus numbers without being able to explain why, etc. I do a fair amount of gaming myself. As long as I can maintain around 30fps or higher I can't notice a difference or perform any better. I have yet to have anyone else prove it's any different for them.



    Steve
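
    SteveS's distinction between average and minimum frame rate is easy to measure. Here is a rough sketch in plain C++ (the loop body is a placeholder of my own, not code from anyone in the thread) that records per-frame times and reports both the average fps and the fps implied by the single slowest frame:
    [code]
    // Sketch: record per-frame times during play, then report the average fps
    // and the fps implied by the single slowest frame (the "minimum" frame rate).
    #include <algorithm>
    #include <chrono>
    #include <cstdio>
    #include <vector>

    int main() {
        using clock = std::chrono::steady_clock;
        std::vector<double> frame_ms;
        frame_ms.reserve(1000);

        auto prev = clock::now();
        for (int frame = 0; frame < 1000; ++frame) {
            // ... simulate + render one frame here (placeholder) ...
            auto now = clock::now();
            frame_ms.push_back(std::chrono::duration<double, std::milli>(now - prev).count());
            prev = now;
        }

        double total = 0.0;
        for (double ms : frame_ms) total += ms;
        const double avg_ms   = total / frame_ms.size();
        const double worst_ms = *std::max_element(frame_ms.begin(), frame_ms.end());

        std::printf("average: %.1f fps   minimum: %.1f fps\n",
                    1000.0 / avg_ms, 1000.0 / worst_ms);
        return 0;
    }
    [/code]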
  • Reply 24 of 39
    programmer Posts: 3,458 member
    [quote]Originally posted by SteveS:

    <strong>However, most would agree that little benefit (if any) comes from more than 30fps. Specifically, I'm talking about a steady 30fps, not an "average" of 30fps.</strong><hr></blockquote>



    All I can say to this is: you have obviously never seen a game with a high simulation rate driving a 30 or 60 Hz video display sync'd to the display (on a switch so that you can toggle between 30 and 60). The difference is truly remarkable, and very subliminal. Believe me, I have a fair bit of experience with this, and there is a very real difference that pretty much anybody can perceive, although they may not be able to articulate what the difference is.



    I agree with your other points -- although I'd add that a stable framerate is very important because you really notice when you are doing 60 Hz and then you miss a frame.



    I don't know at what point speeding up a display's refresh rate doesn't make any difference, but it is definitely higher than 60 Hz. Anybody who used one of the original Mac classics knows that the flicker of that 60.16 Hz display was most disturbing at times.
  • Reply 25 of 39
    airsluf Posts: 1,861 member
  • Reply 26 of 39
    programmer Posts: 3,458 member
    [quote]Originally posted by AirSluf:

    <strong>

    You can sense the subtle differences in a side-by-side test (physically or temporally) like that. But without that direct comparison you can't reliably tell which is which given a choice on a lone monitor. That's the difference between a quasi-forced-choice test and a true threshold test. Which is really running into quibbling territory as far as this thread is concerned.



    As far as driving sims go, where is F1-Season 2000? Wasn't it supposed to be out this spring?</strong><hr></blockquote>



    Not true -- I can reliably identify a solid 30 vs a solid 60 fps now. And invariably people will comment that a game running at 60 plays "smoother" than one running at 30. It is a subtle effect, but it is there. I used to disbelieve this as well, but now I'm a firm believer.
  • Reply 27 of 39
    airsluf Posts: 1,861 member
  • Reply 28 of 39
    zoso Posts: 177 member
    Let's give a little twist to this thread...

    Talking about current hi-end and future GPUs (I don't like the marketing term, but just to make it short...) how do they compare in terms of core complexity/performance/versatility to CPUs? I suppose the transistor count is already pretty high, as is their calculation power (and BTW, what are we talking about here? I mean, do GPUs perform vector or FP operations most of the time? I'd like to know...)

    So, just as an intellectual exercise, how would a computer perform if it had say a NV30 or R300 core in place of a "regular" CPU?

    Or, to put it in another way, are we going to face (in x years maybe) a paradoxical situation in which the difference in processing power between our CPUs and GPUs will be very small?

    Or, to give it a different spin, since current graphics technology is so advanced, wouldn't it be possible to exploit it in a wider way, going beyond the CPU/video card paradigm?



    I know it might sound very far-fetched, but hey, I never said I'm a guru, and this is Future Hardware, right?



    ZoSo
  • Reply 29 of 39
    blizaine Posts: 239 member
    I can see the difference between 200fps and 199fps... But then again... I have bionic eyes. :hmmm:
  • Reply 30 of 39
    toast Posts: 25 member
    I believe GPUs mostly do FP and possibly integer calculations, while the AI and general game physics (which are done by the CPU) would be mostly vector work. Though this is a very amateur guess based on FLOP ratings and a brief knowledge of processing calculations.



    I don't think GPUs could ever replace CPUs, and vice versa. They do very different things.

    If you look at OSX and Quartz and see how much CPU usage it took up, by comparison the newer builds of OSX should offload most of the graphical calculations to the GPU, and I doubt the GPU would strain nearly as much doing those calcs as the CPU used to.



    hmmm, does that make sense?
  • Reply 31 of 39
    steves Posts: 108 member
    [quote]Originally posted by Programmer:

    <strong>



    All I can say to this is: you have obviously never seen a game with a high simulation rate driving a 30 or 60 Hz video display sync'd to the display (on a switch so that you can toggle between 30 and 60). The difference is truly remarkable, and very subliminal. Believe me, I have a fair bit of experience with this and there is a very really difference that pretty much anybody can perceive although they may not be able to articulate what the difference is.

    </strong><hr></blockquote>



    Any specific game you're referring to? I have both PCs and Macs, I've played competitively at LAN parties and tournaments, etc. I've played racing games, practically every FPS, strategy games (where frame rate is not important), flight simulators, etc.



    Again, let's not read more into what I said than is necessary. My point wasn't that 30 and 60fps were completely indistinguishable by all human beings in all cases on all hardware, etc. My point was that side by side it is VERY difficult to tell the difference (yes, I've done this with two systems). A difference exists, but it is trivial at best. Around 30fps seems to be a sweet spot beyond which additional fps don't add much to the gaming experience or even the outcome of the game. For example, going from 15 to 30 fps is like going from night to day. Going from 30 to 60 fps is barely noticeable, practically to the point of insignificance to most. Yes, there are an outspoken few that insist otherwise. However, they cannot quantify this difference in improved gameplay or really even explain this difference in any meaningful way.



    [quote]<strong>

    I don't know at what point speeding up a display's refresh rate doesn't make any difference, but it is definitely higher than 60 Hz. Anybody who used one of the original Mac classics knows that the flicker of that 60.16 Hz display was most disturbing at times.</strong><hr></blockquote>



    Sometimes I wonder if you confuse a display's refresh rate with a game's frame rate. They are very different. If you're referring to flicker, etc., then you are talking about a display's refresh rate (unless you're using an LCD). Also, you're making blanket statements that don't hold true across the board. For example, the tolerable refresh rate for a monitor depends on the resolution your monitor is set to. I'd agree that 60 Hz is the minimum tolerable refresh rate for 640x480. However, at 1024x768, it would be flicker city and cause serious headaches.



    Steve
  • Reply 32 of 39
    steves Posts: 108 member
    [quote]Originally posted by ZoSo:

    <strong>Let's give a little twist to this thread...

    Talking about current hi-end and future GPUs (I don't like the marketing term, but just to make it short...) how do they compare in terms of core complexity/performance/versatility to CPUs? ZoSo</strong><hr></blockquote>



    In short, these are two very different beasts. GPUs are highly specialized and don't have a fraction of the versatility of CPUs. Having programmable pixel shaders is nothing like what a CPU is capable of. In terms of complexity, based on transistor count, I'd argue GPUs and CPUs are roughly on par. In terms of performance, how would you like to measure that? For starters, I'm not aware of a GPU that can render an image all by itself without being fed data by a CPU. I'm also not aware of a GPU that can manipulate a database, calculate a spreadsheet, browse the web, etc, etc. However, if you're referring to the subset of graphics functions which a GPU can handle, then it will blow away what a CPU can do. Of course, if it were otherwise, we'd just have multiple CPUs instead of a CPU and a GPU.



    Steve
  • Reply 33 of 39
    toast Posts: 25 member
    The reason most people measure above 30fps is not to see if it can run smoother.



    Surely this frame rate debate is irrelevant. Seeing as noticing any difference above 30fps is trivial, the main use of fps numbers is to see whether any more detail can be processed in a game. Generally, the higher the fps, the more detail you can put in, because you have more processing power to spare.
  • Reply 34 of 39
    aquatic Posts: 5,602 member
    Absolutely...



    [quote]

    Anybody who used one of the original Mac classics knows that the flicker of that 60.16 Hz display was most disturbing at times.



    <hr></blockquote>

    :eek:
  • Reply 35 of 39
    programmer Posts: 3,458 member
    Sorry, I don't have any games that I can point to as examples of this. Most of my work is on stuff in development, and that is where we play with things at the edge of perceptible limits.



    [quote]Originally posted by SteveS:

    <strong>Sometimes I wonder if you confuse a display's refresh rate with a game's frame rate. They are very different. If you're referring to flicker, etc. then you are talking about a display's refresh rate (unless you're using an LCD). Also, you're making blanket statements that don't hold true across the board. For example, the tolerable refresh rate for a monitor is dependent upon the resolution your monitor is set to. I'd agree that 60hz is the minimum tolerable refresh rate for 640x480. However, at 1024x768, it would be flicker city and cause serious headaches.

    </strong><hr></blockquote>



    I'm trying not to make blanket statements. And I do know the difference between display refresh and frame rate, but the two are related in how we perceive the world visually.



    My point is merely this -- there is a real benefit to higher frame rates which are stable and rock solid, and these go beyond just measuring machine performance. Most humans can tell the difference between a mere 30 fps and 60+ fps, but they can't quantify it... instead they will tend to prefer the high-rate experience without really knowing why. Just because they can't quantify it doesn't mean that it's not worthwhile. The difference between 20 and 30 is much more noticeable, and most people will actually realize that it is a frame rate difference. Experiments you are doing at 41 fps aren't going to benefit from this effect -- you really need to have it sync'd to the refresh and at 60+ Hz to see it.



    As I said, I used to hold the same opinion that you do -- anything over 30 doesn't buy you anything. But after working with really high rates for a while I'm now firmly convinced of the extra mileage to be gained by high rates. It's hard to describe, but the display really just becomes "glassy smooth". :shrug:
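
    The setup Programmer keeps referring to -- a high, fixed simulation rate feeding a display that is sync'd to its own refresh -- is usually built as a fixed-timestep loop. A sketch follows; simulate_step, render and swap_buffers_vsynced are hypothetical placeholders standing in for real engine code:
    [code]
    // Sketch of a fixed-timestep loop: the simulation advances in exact 1/120 s
    // steps regardless of display rate, while rendering happens once per
    // (vsync'd) buffer swap. The three functions below are placeholders.
    #include <chrono>

    static void simulate_step(double /*dt*/)   { /* advance the game world by dt seconds */ }
    static void render(double /*alpha*/)       { /* draw, blending the last two sim states */ }
    static void swap_buffers_vsynced()         { /* e.g. a buffer swap with vsync enabled */ }

    int main() {
        using clock = std::chrono::steady_clock;
        const double dt = 1.0 / 120.0;              // simulation rate: 120 Hz
        double accumulator = 0.0;
        auto prev = clock::now();

        for (int frame = 0; frame < 600; ++frame) { // roughly 10 s on a 60 Hz display
            auto now = clock::now();
            accumulator += std::chrono::duration<double>(now - prev).count();
            prev = now;

            while (accumulator >= dt) {             // catch the simulation up to real time
                simulate_step(dt);
                accumulator -= dt;
            }
            render(accumulator / dt);               // interpolate between steps for smooth motion
            swap_buffers_vsynced();                 // one rendered frame per display refresh
        }
        return 0;
    }
    [/code]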
  • Reply 36 of 39
    programmer Posts: 3,458 member
    [quote]Originally posted by Toast:

    <strong>I believe GPUs mostly do FP and possibly integer calculations, while the AI and general game physics (which are done by the CPU) would be mostly vector work. Though this is a very amateur guess based on FLOP ratings and a brief knowledge of processing calculations.



    I don't think GPUs could ever replace CPUs, and vice versa. They do very different things.

    If you look at OSX and Quartz and see how much CPU usage it took up, by comparison the newer builds of OSX should offload most of the graphical calculations to the GPU, and I doubt the GPU would strain nearly as much doing those calcs as the CPU used to.



    hmmm, does that make sense?</strong><hr></blockquote>



    After seeing what the "next generation" (i.e. the nv30 generation) is capable of, and seeing that the rate of advance continues, I've come to the opinion that at some point these "GPUs" will evolve into compute engines that software can use to do really expensive computational tasks. They aren't replacements for the CPU, they are a co-processor. We have a ways to go before this stuff is figured out, but it looks very promising.



    The way these things execute code is very different, but it has some big advantages to making certain kinds of calculations go really really fast. The new GPUs spend most of their time doing floating point vector calculations, and can actually store their results as floating point values now. They cannot operate as "normal" processors because of their different execution model, but personally I think there is a place for them in some non-graphics areas of computing.
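
    To make "floating point vector calculations" a little more concrete, here is a tiny CPU-side sketch of my own (purely illustrative, not anything from the post): the same short per-element operation applied to a huge batch of data, which is exactly the shape of work a GPU parallelises across thousands of pixels at once.
    [code]
    // Sketch: the flavour of work a GPU does well -- one short, branch-free
    // floating point vector operation applied independently to many elements.
    // Run here on the CPU purely for illustration.
    #include <cstdio>
    #include <vector>

    struct Vec4 { float x, y, z, w; };

    // One small calculation per element (a scale-and-add, roughly what a pixel
    // shader might do per pixel). A GPU runs huge numbers of these in parallel.
    static Vec4 shade(const Vec4& v, float s) {
        return { v.x * s + 1.0f, v.y * s + 1.0f, v.z * s + 1.0f, v.w * s + 1.0f };
    }

    int main() {
        std::vector<Vec4> data(1024 * 1024, Vec4{1, 2, 3, 4});
        for (Vec4& v : data)          // on a GPU, this loop is what gets parallelised
            v = shade(v, 0.5f);
        std::printf("first element: %f %f %f %f\n",
                    data[0].x, data[0].y, data[0].z, data[0].w);
        return 0;
    }
    [/code]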
  • Reply 37 of 39
    *l++ Posts: 129 member
    [quote]Originally posted by Programmer:

    <strong>As I said, I used to hold the same opinion that you do -- anything over 30 doesn't buy you anything. But after working with really high rates for a while I'm now firmly convinced of the extra mileage to be gained by high rates. It's hard to describe, but the display really just becomes "glassy smooth". :shrug:</strong><hr></blockquote>



    What you are experiencing does not require more than 30fps. What you get when you go above 30fps is a natural motion blur. Your eye and brain interpolate what happens between frames in a smooth fashion.



    Actually, you can see this in film at 24fps. Look at single frames of, let's say, "A Bug's Life". Anytime there is movement, there is the appropriate motion blur, hence everything looks very smooth.



    It takes the computation of many added frames to generate adequate motion blur, which is why it is easier to just blip more frames on the screen and refresh the screen at a higher matching rate.



    Also, the reason a 60 Hz refresh rate looks flickery at higher resolutions (e.g. 1024 x 768) is the physical properties of the screen's phosphor. Screens are made for roughly 720 x 486, 60 Hz operation; if smaller dots are displayed, they do not get "lit" long enough by the electron beam for the phosphor to maintain brightness for 1/60th of a second. That is why the refresh rate has to increase.
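
    The "many added frames" approach is roughly what the legacy OpenGL accumulation buffer was used for. A sketch, assuming a legacy OpenGL context created with an accumulation buffer (draw_scene is a hypothetical placeholder): render several sub-frames spread across one display frame and average them into a single motion-blurred image.
    [code]
    // Sketch: approximate motion blur by rendering N sub-frames within one
    // display frame and averaging them in the legacy OpenGL accumulation buffer.
    #include <GL/gl.h>      // <OpenGL/gl.h> on the Mac

    static void draw_scene(double /*t*/) { /* placeholder: render the world at time t */ }

    void render_motion_blurred_frame(double frame_start, double frame_length, int subframes)
    {
        glClear(GL_ACCUM_BUFFER_BIT);
        for (int i = 0; i < subframes; ++i) {
            const double t = frame_start + frame_length * i / subframes;
            glClear(GL_COLOR_BUFFER_BIT | GL_DEPTH_BUFFER_BIT);
            draw_scene(t);
            glAccum(GL_ACCUM, 1.0f / subframes);    // add 1/N of this sub-frame
        }
        glAccum(GL_RETURN, 1.0f);                   // write the averaged image to the framebuffer
        // ... swap buffers here ...
    }
    [/code]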
  • Reply 38 of 39
    programmer Posts: 3,458 member
    [quote]Originally posted by *l++:

    <strong>What you are experiencing does not require more than 30fps. What you get when you go above 30fps is a natural motion blur. Your eye and brain interpolate what happens between frames in a smooth fashion.



    Actually, you can see this in film at 24fps. Look at single frames of, let's say, "A Bug's Life". Anytime there is movement, there is the appropriate motion blur, hence everything looks very smooth.</strong><hr></blockquote>



    Heh, I knew I shouldn't get into this. I know about motion blur and display persistence, etc. Go watch a movie (24 fps) and wait for a camera pan. You can reproduce that with computer graphics -- try it at 24 and 30 fps. Then try it at your display's refresh rate. Try it with and without motion blur, etc etc. Turn the vertical sync on and off. Just go and do lots of experiments, and try them out on different people.



    I'm going to give up this argument at this point... not because I concede but because I can't win without an effective demonstration, which I'm not about to go through the effort to provide. I'll just leave it at this: I have seen a lot of work done on this by quite a few people on different projects and hardware, and the inescapable conclusion that I have been led to is that a solid high frame rate sync'd to the display refresh leads to a better visual experience than 30 fps. Take it or leave it, that's up to you.
  • Reply 39 of 39
    steves Posts: 108 member
    [quote]Originally posted by Programmer:

    <strong>My point is merely this -- there is a real benefit to higher frame rates which are stable and rock solid, and these go beyond just measuring machine performance. Most humans can tell the difference between a mere 30 fps and 60+ fps, but they can't quantify it... instead they will tend to prefer the high rate experience without really knowing why. Just because they can't quantify it doesn't mean that its not worthwhile.</strong><hr></blockquote>



    I agree that this discussion has probably gone on long enough. That said, I'd also like to point out that I don't think we are really in disagreement here. You claim there is a difference between 30 and 60fps. Yet you also acknowledge that most people cannot quantify this difference. That was pretty much my point. There is a difference, but it's trivial at best. It does not affect one's ability to enjoy the game, nor does anyone perform noticeably better or worse based on the difference between these two frame rates. My point is that around 30fps is definitely a sweet spot beyond which additional frame rate is not easily appreciated or even noticed by most.



    This is sort of like comparing MP3s with CDs. There is a bit rate for MP3s (usually around 160 kbps) that's considered the sweet spot, beyond which most cannot distinguish between MP3 and CD. Yes, there is a subtle difference, but for all intents and purposes they are considered equivalent by most people on most audio equipment, etc. If you were to analyze the waveforms at higher bit rates, you could identify a difference and thereby prove there is one. However, this "difference" is wasted on the masses because it's not enough for them to care. This is the point I'm making with visual frame rates. It's not about whether there is a scientific method of comparing 30fps vs 60fps. Rather, my point is that there is a sweet spot beyond which additional fps are largely irrelevant.



    Steve