# Frame rates on an LCD screen?

Posted:
edited January 2014
This has always bothered me...

A frame rate is where they flash the frames very quickly to give the appearance that something is moving, right?

kinda like how they make cartoons and movies.

But if an LCD's pixels are "always on" until they go off again, then how can it have a frame rate?

Someone wanna explain?

Posts: 6,041member
There's a response time. Apple's PowerBook displays have a 16 ms response time, meaning pixels can be updated at about 62 Hz. I'm not sure about the standalone displays.
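As a rough back-of-the-envelope sketch (this assumes a single response-time figure per panel, which glosses over the separate rise and fall times real panels have), the conversion looks like:

```python
# Convert a quoted pixel response time into the fastest rate at which
# a pixel could complete full transitions back to back.
def max_transitions_per_second(response_time_ms: float) -> float:
    return 1000.0 / response_time_ms

print(max_transitions_per_second(16))  # 62.5 full changes per second
print(max_transitions_per_second(40))  # 25.0 full changes per second
```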
Posts: 22,667member
Well, if you want to get really technical about it, only film has a frame rate.

See, a CRT has a scan rate; that is, how many times it can update the picture in front of you in a given second. LCDs, as the other poster mentioned, have pixel response times, which tell you how many times in a given second a pixel is capable of changing its colour.

I don't understand what is so hard about the concepts at hand. Your video card is still sending the data for the entire display x number of times a second, where x is the refresh rate of your display. How your display goes about handling that information, obviously, varies.

Ok, let's try this. When your video card sends the data for your monitor to display, let's call that chunk of data a "frame". Let's say it is sending data to your display 60 times a second, so for all intents and purposes, this is 60 fps. The LCD may not need to change the colour of all of the pixels in each of those "frames", but that doesn't mean it didn't receive the data and think about it.

It's really more an implied kind of thing I guess...
Posts: 537member
If I'm not misinformed, the human eye has a "sample frequency" of approximately 25 Hz, so a TFT's 62 fps is more than enough to fool any naked eye. Since TFTs don't scan, they don't flicker either, so the picture is completely stable. Even a high-end CRT (like 150-200 Hz) will flicker, but the eye won't notice that as much compared to, let's say, a 65 Hz CRT.

AND.. 350 fps in Quake III is just a benchmark; there's no display I know of capable of showing every frame, and certainly no eye capable of seeing them all either. BUT.. a GPU doing 350 fps won't choke on complex scenes where the framerate might dip well into the scan range of a decent CRT.. and the sample frequency of the naked eye.
Posts: 8,123member
An LCD screen, as LoCash mentioned, is very different from a CRT screen.

There are two different things: the frame rate and the response time.

The frame rate (currently 60 or 75 Hz) is the frequency of the screen. It means that a 60 Hz screen receives info 60 times per second; most of the time the info is the same and the screen has nothing to do.

The response time is expressed in ms (the best screens reach 16 ms, and 25 ms is the minimum). In fact this time is divided into two sub-parts. The response time is the time it takes for a dot to go from the absence of a colour to the full density of that colour. In some LCD screens this response time is 40 ms, which means a pixel can change status 25 times per second.

A quick analysis might suggest a contradiction between a 75 Hz frame rate and a 40 ms response time. In fact there isn't.

Let's imagine a system with three pixels. The frame rate is 75 Hz and the response time 40 ms.

The first image is all white: it takes 40 ms to appear. Then the second image contains a red pixel; it will take another 40 ms to appear, but while that second pixel is starting to turn red, the third pixel receives info asking it to turn green about 13 ms later (one frame interval at 75 Hz). Another 13 ms after that, the red pixel has still not reached its final status, but the screen is already receiving yet another piece of info asking the first pixel to turn blue...

It means that on an LCD screen, some pixels are in transition while others are not. The frame rate is the number of updates per second the screen receives; the response time is the time required by a pixel to complete a transition.
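The three-pixel example above can be sketched in a few lines (using the hypothetical numbers from the post: a 75 Hz frame rate and a 40 ms response time):

```python
import math

FRAME_INTERVAL_MS = 1000 / 75  # a new frame arrives every ~13.3 ms
RESPONSE_MS = 40               # one full colour transition takes 40 ms

# A transition started in one frame is still in progress while later
# frames arrive, so different pixels are at different stages at once.
frames_spanned = math.ceil(RESPONSE_MS / FRAME_INTERVAL_MS)
print(frames_spanned)  # 3 frame intervals pass before a pixel settles
```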

Posts: 1,010member
Quote:

Originally posted by Henriok

If I'm not misinformed, the human eye has a "sample frequency" of approximately 25 Hz, so a TFT's 62 fps is more than enough to fool any naked eye. Since TFTs don't scan, they don't flicker either, so the picture is completely stable. Even a high-end CRT (like 150-200 Hz) will flicker, but the eye won't notice that as much compared to, let's say, a 65 Hz CRT.

AND.. 350 fps in Quake III is just a benchmark; there's no display I know of capable of showing every frame, and certainly no eye capable of seeing them all either. BUT.. a GPU doing 350 fps won't choke on complex scenes where the framerate might dip well into the scan range of a decent CRT.. and the sample frequency of the naked eye.

If you're an avid gamer, you'd know that there is a huge difference between 60fps and 150fps.

Say you're playing a kill-anything-that-moves type of game like Unreal Tournament. When you're 1-on-1 with an enemy, it's not so much how many frames you can see, but more about how the higher frame rate gives you more information per second about the exact location of your foe, allowing you to more easily defeat that enemy.

It's truly night and day if you play on a system that gives you 60fps and then you switch to a system that doubles that.
Posts: 4,695member
Quote:

Originally posted by Powerdoc

It means that on an LCD screen, some pixels are in transition while others are not. The frame rate is the number of updates per second the screen receives; the response time is the time required by a pixel to complete a transition.

Another way to say this would be that the LCD response time is the time it takes for each pixel to completely respond to the maximum change that can occur going from one frame refresh to the next. When the LCD response time is slower than the frame rate, that doesn't mean the higher frame rate will make no difference, only that the full dynamics of the higher frame rate might not be realized.
Posts: 4,695member
Quote:

Originally posted by Cake

It's truly night and day if you play on a system that gives you 60fps and then you switch to a system that doubles that.

Now you're bringing in another issue. In addition to LCD/CRT response characteristics and frame refresh rates, now you're talking about how fast software can generate a full frame of video information.

No matter how fast the software generates frames, you cannot see any more frames than the refresh rate of your display device. If the game is running at 170 fps, and your display refresh rate is 85 Hz, you're only going to see 85 fps, or every other frame that is generated.

What ridiculously high frame rates like 300 fps gain you (besides bragging rights) is performance headroom to stay above your refresh rate when complex scenes come along and drop the game's frame rate.
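A minimal sketch of that cap (assuming the display simply drops the frames it can't show):

```python
# You never see more frames than the display refreshes, however fast
# the game renders them.
def visible_fps(game_fps: float, refresh_hz: float) -> float:
    return min(game_fps, refresh_hz)

print(visible_fps(170, 85))  # 85: only every other rendered frame is shown
print(visible_fps(300, 85))  # 85: the extra fps is just headroom
```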
Posts: 711member
Quote:

Originally posted by Henriok

If I'm not misinformed, the human eye has a "sample frequency" of approximately 25 Hz, so a TFT's 62 fps is more than enough to fool any naked eye. Since TFTs don't scan, they don't flicker either, so the picture is completely stable. Even a high-end CRT (like 150-200 Hz) will flicker, but the eye won't notice that as much compared to, let's say, a 65 Hz CRT.

-Snip-

For most humans, the eye's "sample frequency" is 70 to 80Hz -- that's why the flickering on a 60Hz monitor can be so annoying. For gaming, people should not be able to see the difference in frame rates as long as they never drop below 80 fps.
Posts: 8,123member
Quote:

Originally posted by shetline

Another way to say this would be that the LCD response time is the time it takes for each pixel to completely respond to the maximum change that can occur going from one frame refresh to the next. When the LCD response time is slower than the frame rate, that doesn't mean the higher frame rate will make no difference, only that the full dynamics of the higher frame rate might not be realized.

You are right, I was referring to the full transition.

For the frame rate: you are right, most of today's games on high-end computers are limited by the screen. 300 fps means nothing when you are in front of your screen.

However, I think that Cake's real-world experience comes from the fact that he is speaking of an average fps. When you have an average of 60 fps, you have a range of, let's say, 15 to 100 fps (depending on the complexity of the scene). Therefore when you double the average fps, you also double (or perhaps more: it's non-linear) the minimum fps.

I would say that the in-game user experience will be perfect if the fps never drops under 60, and will be quite good if it never drops below 30.
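A toy illustration of that point (the per-frame render times here are made up; a real game's distribution is messier, and the scaling is not perfectly linear):

```python
# Halving every frame's render time (a hypothetical 2x-faster GPU)
# doubles the minimum fps as well as the average, so the worst-case
# complex scenes improve along with the benchmark number.
frame_times_ms = [10, 16, 30, 66]                   # per-frame render times
fps_before = [1000 / t for t in frame_times_ms]
fps_after = [1000 / (t / 2) for t in frame_times_ms]

print(round(min(fps_before), 1), round(min(fps_after), 1))  # 15.2 30.3
```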
Posts: 154member
Quote:

Originally posted by Res

For most humans, the eye's "sample frequency" is 70 to 80Hz -- that's why the flickering on a 60Hz monitor can be so annoying.

The human eye doesn't really have a "sample frequency" as such, though increased frame rates do produce a more effective simulation of motion. Each receptor in the retina continuously responds to its light input, which it transmits to the brain as a series of pulses whose rate indicates brightness. A lot of back-end processing goes on to translate that into the cognitive representation of an image, much of which is complex and poorly understood.

A monitor set at 60 Hz doesn't appear to flicker because of the eye's response time as such -- instead, the persistence of the phosphors in a modern monitor is fairly low, so that you don't get ghosting at higher scan rates. At a low scan rate like 60 Hz, since the screen doesn't stay illuminated until the next scan, your eye, which is constantly glancing around the monitor, can occasionally move in such a way that you focus on a dark part of the monitor for more than 1/60th of a second.

This effect is why a computer monitor can appear to flicker badly when you stand about 100 feet away from it, even though it looks fine up close.

If you must use a 60 Hz refresh rate, a higher-persistence monitor will help the flickering problem.

-- Mark
Posts: 5,602member
Right, and the edge of your vision also has a much higher effective refresh rate -- look at a monitor sideways and you'll see. Also, I can tell the difference. LCDs suffer from something different: "ghosting". Old non-active-matrix LCDs illustrate this very well, but new ones still do it. The stand-up (separate) displays from Apple seem to suffer from this a lot less than laptops, or not at all, so they must have a faster response.
Posts: 711member
There is a nice little article on vision and fps at http://www.daniele.ch/school/30vs60/30vs60_1.html
Posts: 22,667member
A couple of years back we had a nice, long-winded discussion on what the human eye is capable of perceiving. I think it won me "Longest post ever" for the year 2001.

The human eye is capable of perceiving a lot more than 30fps. As everybody is unique, it's going to vary from person to person, but on average, tests put it up near 120fps if I recall correctly.

Television and film have their respective framerates because, at least in the case of film, that was the least number of frames needed to create an accurate suspension of reality. Film is expensive, so using as few feet of film as possible to simulate motion was important. NTSC has a 60 Hz refresh rate largely because of electrical standards in the States.

Anyhow, you may want to go search for that thread if you're curious about this. I presented a lot of information there that I'm not going to re-hash here. Do a search for "suspension of reality".
Posts: 154member
Quote:

Originally posted by LoCash

to create an accurate suspension of reality

That's a little bit of a distortion, if only in that film's approximation of reality is closer to that of low-fi recorded audio (in the sense that people tend to think that it "sounds the same," in some meaningful way, as the real thing) than it is to, say, reading a book, in which someone's doing something cognitively challenging to visualize a story taking place.

"suspending reality" calls to mind the reading-a-book case.

-- Mark
Posts: 913member
The human eye can see more than 60 fps. It all has to do with motion blur and yada yada, but that's off topic, and it's going to take a hell of a lot more than 60 fps for a game to "flow" like real life to the human eye.
Posts: 315member
Does anybody know the response times for the Apple displays? The Apple website only mentions brightness and contrast. Also, what about the response time of screens on the iMac, PowerBook and iBook?

EDIT 1: Oops, just found above that giant already posted that PowerBook displays have a response time of 16ms.

EDIT 2: According to http://www.formac.com/files/pdf_shee...comparison.pdf Apple's 17 inch studio displays have a response time of 40ms.
Posts: 1,861member
Kickaha and Amorph couldn't moderate themselves out of a paper bag. Abdicate responsibility and succumb to idiocy. Two years of letting a member make personal attacks against others, then stepping aside when someone won't put up with it. Not only that but go ahead and shut down my posting privileges but not the one making the attacks. Not even the common decency to abide by their warning (after three days of absorbing personal attacks with no mods in sight), just shut my posting down and then say it might happen later if a certain line is crossed. Bullshit flag is flying, I won't abide by lying and coddling of liars who go off-site, create accounts differing in a single letter from my handle with the express purpose to deceive and then claim here that I did it. Everyone be warned, kim kap sol is a lying, deceitful poster.

Now I guess they should have banned me rather than just shut off posting priviledges, because kickaha and Amorph definitely aren't going to like being called to task when they thought they had it all ignored *cough* *cough* I mean under control. Just a couple o' tools.

Don't worry, as soon as my work resetting my posts is done I'll disappear forever.
Posts: 315member
Quote:

Originally posted by AirSluf

... It is just the time delay between an individual pixel element getting an on/off command and the element attaining its new opposite state. Changing between intermediate states takes less time....

Does changing between intermediate states take less time or more time? I get conflicting answers!

I'm still interested in response time as this affects ghosting effects. Does anybody have any specs about the response time in the screens used in iBooks and iMacs?