Originally Posted by Mac-sochist
On the subject of framerates: I remember reading an article—sorry I can't link to it, but this was in the Dead Tree Era™—about an experiment Steven Spielberg conducted. Of course everybody knows 24 fps is inadequate for rapid movement, so he wanted to see what the optimum framerate for action movies really was.
He filmed a bunch of supposedly exciting footage—battle scenes, flying, roller-coaster rides, etc.—at a variety of framerates. Then he showed them to a number of volunteers after wiring them up like a polygraph—heart rate, respiration, galvanic skin response. He found that people's measurable "excitement" did indeed increase with framerate, presumably because it seemed more realistic or involving, until he hit 60 fps. After that there was no further improvement.
As a result of that, I consider it an observed scientific fact that 60 fps is the most the human visual system can handle, and anything over that is overkill—or just one of those meaningless specs that geeks like to quote.
Of course, I could be wrong. When stereo was invented, the brain wasn't supposed to be able to detect different arrival times at each ear, just differing loudness. That turned out to be baloney. Also, the TV I'm looking at right now actually has 1920 × 1080 pixels on the screen. (That was a prerequisite for me, so I'm a very late adopter on HDTV.) In this area we have five 1080i channels and five 720p channels. Doing the arithmetic, from this distance I shouldn't be able to resolve 720 lines, let alone 1080, but I can tell the 720p channels appear dull and lifeless compared with the 1080i ones. I can't explain it, but it's an observed fact for me.
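For anyone who wants to redo that arithmetic, here's a rough sketch of the usual acuity calculation, assuming the common one-arcminute rule of thumb for 20/20 vision and a made-up 40-inch screen at 10 feet (placeholder numbers, not my actual setup):

```python
import math

# Rough check of whether individual scan lines are resolvable from the couch,
# assuming ~1 arcminute of acuity for 20/20 vision. The 40-inch 16:9 screen
# and 10-foot (120-inch) viewing distance are placeholder numbers.
ACUITY_ARCMIN = 1.0

def line_pitch_arcmin(diagonal_in, lines, distance_in):
    screen_height_in = diagonal_in * 9 / math.hypot(16, 9)  # height of a 16:9 panel
    pitch_in = screen_height_in / lines                     # height of one scan line
    return math.degrees(math.atan2(pitch_in, distance_in)) * 60

for lines in (720, 1080):
    pitch = line_pitch_arcmin(40, lines, 120)
    verdict = "resolvable" if pitch >= ACUITY_ARCMIN else "below the acuity limit"
    print(f"{lines} lines: {pitch:.2f} arcmin per line ({verdict})")
```

With those placeholder numbers both 720 and 1080 lines come out finer than the one-arcminute limit, which is exactly the "shouldn't be able to resolve them" situation described above.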
I think someone should revisit Spielberg's experiment in the electronic era. A question that occurs to me is: these new 3D TVs think they need 120 Hz for 3D, but I wonder if alternating 30 fps per eye would add up to a perceived 60 fps?
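For what it's worth, the per-eye arithmetic for alternate-frame 3D is just a halving of the refresh rate; whether the brain then "adds up" the two streams is exactly the open question. A toy illustration, with the active-shutter assumption being mine:

```python
# Per-eye frame rate for alternate-frame (active-shutter) 3D: each eye sees
# every other refresh, so the display rate simply gets split in two. Whether
# the two streams "add up" perceptually is the open question above.
def per_eye_fps(display_hz):
    return display_hz / 2

for hz in (60, 120):
    print(f"{hz} Hz display -> {per_eye_fps(hz):.0f} fps per eye")
# 120 Hz gives 60 fps per eye; delivering only 30 fps per eye would need just 60 Hz.
```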
Doug Trumbull, the great film effects artist who created the Star Gate sequence for 2001, the spacecraft for Close Encounters, and many of the effects in Blade Runner (among a great deal of other notable work), at one point was trying to get his "Showscan" process off the ground.
The story goes that he was editing some 70mm footage shot at high speed on a flatbed, fast-forwarding through a sequence, and was struck by how footage shot at a high frame rate and viewed at that same high rate (so that it appears to run at "normal" speed instead of slow motion) took on an almost 3D quality. His subsequent experimentation led him to conclude that the optimal effect was achieved with 70mm film shot and projected at 60 fps.
He directed Natalie Wood's last movie, "Brainstorm", intending it as a showcase for the new process (most of the film was shot and projected conventionally, with the "brainstorm" sequences, depicting direct experience of other people's memories, in Showscan), but he couldn't get enough theater owners to invest in the considerable expense of retrofitting their projectors. He went on to use Showscan for ride films in amusement parks.
I've always wondered why, given the changeover to digital acquisition and projection, there hasn't been a reinvestigation of higher frame rates for both shooting and projection. There aren't any electromechanical issues as there are with film projection, roughly doubling your data storage costs isn't particularly burdensome, and it gives you denser information per second (with more visible impact) than just boosting resolution. It certainly would be easier to implement (and probably easier to enjoy) than the benighted 3D craze.
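To put rough numbers on the storage point, here's a back-of-envelope comparison of uncompressed data rates, assuming 8-bit RGB (3 bytes per pixel); the figures are purely illustrative and not tied to any particular camera or codec:

```python
# Back-of-envelope uncompressed data rates: 1080p at 24 vs 60 fps, with 4K at
# 24 fps for comparison. Assumes 3 bytes per pixel (8-bit RGB); the numbers
# are purely illustrative, not any particular camera or codec.
def gb_per_minute(width, height, fps, bytes_per_pixel=3):
    return width * height * bytes_per_pixel * fps * 60 / 1e9

base = gb_per_minute(1920, 1080, 24)
hfr  = gb_per_minute(1920, 1080, 60)
uhd  = gb_per_minute(3840, 2160, 24)

print(f"1080p24: {base:.1f} GB/min")
print(f"1080p60: {hfr:.1f} GB/min ({hfr / base:.1f}x)")
print(f"2160p24: {uhd:.1f} GB/min ({uhd / base:.1f}x)")
```

By that crude measure, 1080p at 60 fps costs about 2.5x the data of 1080p24, while jumping to 4K at 24 fps costs 4x, which is roughly the trade-off between frame rate and resolution being described.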