Why do video game framerates need to be so much higher than TV and cinema framerates?

Two reasons:

1. Responsiveness of input

There is a big difference in the feel of gameplay when input and response happen only 24 times per second vs. 60 times per second, especially for fast-paced games such as first-person shooters.

Network buffers and input buffers are filled on separate threads, which means new state from the game server, or button presses from your gamepad, must wait until the next iteration of the game engine's "update loop". This wait can be as long as about 42 ms (1/24 s) at 24 updates per second, but only about 17 ms (1/60 s) at 60 updates per second. That's a difference of roughly 25 ms, about a quarter of the extra lag you would feel going from a 50 ms server connection to a 150 ms one.
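To make the arithmetic concrete, here is a minimal sketch (assuming a hypothetical fixed-timestep loop that polls input once per tick) of the worst-case delay an input can sit in a buffer before the game logic sees it:

    # Worst-case latency added by a fixed-timestep update loop: an input that
    # arrives just after a tick must wait a full tick before it is processed.
    def added_input_latency_ms(tick_rate_hz: float) -> float:
        return 1000.0 / tick_rate_hz

    for rate in (24, 30, 60, 120):
        print(f"{rate:>3} updates/s: input may wait up to "
              f"{added_input_latency_ms(rate):.1f} ms for the next tick")

This ignores everything else in the pipeline (rendering, display scan-out, network round trips); it only shows the portion of latency that scales directly with the update rate.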

2. Lack of physically accurate motion blur

Cameras in the real world have what's called a shutter, which is open for a continuous range of time determined by the "shutter angle" or "shutter speed". For instance, a moving picture captured at 24 frames per second might have the shutter open for 0.02083 seconds per frame (1/48 of a second, or a 180° shutter angle). This continuous interval of time captures and blends all motion happening within it, which is what we see as motion blur.
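As a quick illustration, the exposure time per frame follows directly from the frame rate and shutter angle; a small sketch (the 180° figures match the numbers quoted above):

    def exposure_seconds(fps: float, shutter_angle_deg: float = 180.0) -> float:
        # Shutter-open time per frame: (shutter_angle / 360) * (1 / fps)
        return (shutter_angle_deg / 360.0) / fps

    print(exposure_seconds(24))   # 0.020833... s, i.e. 1/48 s  (24 fps, 180 degrees)
    print(exposure_seconds(60))   # 0.008333... s, i.e. 1/120 s (60 fps, 180 degrees)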

Games, on the other hand, render only an instantaneous moment in time. There is no equivalent interval over which motion is recorded and blended; instead you create what is essentially a crystal-clear sample of the world at a particular instant, something that is not possible in the real world. Because no motion is recorded within the rendered frame, movement on-screen can look jerky unless the frame rate is increased to compensate (by capturing more in-between motion). By increasing the frame rate you essentially converge on real-life "frame rates", leaving only the biological motion blur we get from our eyes (which are like shutters that are always open).
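One rough way to see why those crisp instantaneous samples read as jerky: without blur, an object simply jumps between positions from frame to frame, and the size of the jump shrinks as the frame rate rises. A sketch with a hypothetical object crossing a 1920-pixel screen in one second:

    # How far an object "teleports" between consecutive crisp frames, assuming
    # a hypothetical object that crosses a 1920 px wide screen in one second.
    SPEED_PX_PER_S = 1920.0

    for fps in (24, 30, 60, 120):
        print(f"{fps:>3} fps: object jumps {SPEED_PX_PER_S / fps:5.1f} px per frame")

At 24 fps that is an 80-pixel jump with nothing connecting the two positions, exactly the kind of discontinuity a real shutter would have smeared over.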

Though modern games do feature "motion blur" effects now, these only approximate motion blur under certain assumptions and do not (yet) fully recreate the motion blur we see in film or in high-quality CGI renderings.
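For the curious, one common family of in-game motion blur (not necessarily what any particular engine does) is a post-process pass that smears each pixel along a per-pixel screen-space velocity. Here is a heavily simplified 1D sketch with hypothetical data, just to show the key limitation: it can only blur what the single, already rendered frame contains.

    import numpy as np

    def post_process_blur_1d(color, velocity_px, samples=8):
        # Average `samples` taps along each pixel's velocity vector, reading
        # only from the already rendered (instantaneous) frame.
        out = np.zeros_like(color, dtype=float)
        n = len(color)
        for i in range(n):
            for s in range(samples):
                t = s / (samples - 1) - 0.5                  # taps centred on the pixel
                j = int(np.clip(np.round(i + t * velocity_px[i]), 0, n - 1))
                out[i] += color[j]
        return out / samples

    # A bright object 4 px wide moving 10 px per frame over a dark background.
    color = np.zeros(32)
    color[10:14] = 1.0
    velocity = np.full(32, 10.0)
    print(np.round(post_process_blur_1d(color, velocity), 2))

Because the pass only sees one instant, anything not visible at that instant (objects revealed mid-frame, rotation, transparency) cannot be blurred correctly, which is one of the "certain assumptions" mentioned above.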

Demos

For 15fps vs. 30fps vs. 60fps see this demo or this video.

See also

  • Latency in simulators and simulations
  • Motion Blur
  • Shutter speed

I think there is a piece of history you're missing here, so allow me to try and fill it in.

If you google 60fps vs 24fps you'll find endless threads of people asking what the difference is. Most people will tell you that 24fps has been the standard since the 20s, but there is little explanation as to why.

If we actually look back to the creation of film, we will notice that 24fps has not always been the standard. Edison himself originally recommended 48 fps, stating that "anything less will strain the eye." Edison's films, however, did not follow this recommendation, nor did they seem to be standardized at all (a single film could vary by more than 10fps over its runtime). American Mutoscope, one of Edison's rivals, actually used 40fps, but the resulting camera weighed almost a ton.

However, these fast-paced films used up too much film stock (a luxury at the time), and by Victor Milner's time the standard was 16 fps. Beyond the practical considerations, many film buffs actually critiqued films faster than 16fps as being "too fast." The Birth of a Nation, for example, dropped as low as 12fps in some sections.

The major problem with the period between 1910 and 1920 was that film speed varied so much. Even a single cinematographer's frame rates tended to vary from film to film. By the mid '20s, camera operators had started to pride themselves on keeping an even speed and being able to approximate 16fps (which they more usually measured in feet of film). Meanwhile, theaters had started demanding faster and faster speeds. While 16 fps may have looked more professional, in a crowded theater (or a small one) the audience seemed better able to discern the film at 24 frames per second.

When Annapolis was shot in 1928, the studio mandated 24 frames per second, even though many film crews did not appreciate the more frequent camera reloads (at 16 frames per second a 1,000 ft reel lasts about 16 minutes; at 24 it runs out considerably sooner). By the time motorized cameras became common, 24 frames per second had become the de facto standard.

It's important to note that this was not a technical limitation, nor was it (usually) a financial one. It was the end result of two opposing forces (camera crews and actors vs. studios and theaters) each desiring a different speed.

So Why not 60?

It's worth noting that NTSC television actually runs at 59.94 fields per second (60 Hz / 1.001); because the picture is interlaced, that works out to 29.97 full frames per second. The rate of 60 was originally chosen to match the 60 Hz power grid in the US, so that stray hum and lighting flicker would not beat against the picture. The odd 1.001 divisor came later, with color broadcasting, to avoid intermodulation (a visible beat pattern) between the new color subcarrier and the sound carrier.
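The arithmetic, for reference (the exact ratio is 1000/1001):

    field_rate = 60 / 1.001        # 59.940... interlaced fields per second
    frame_rate = field_rate / 2    # 29.970... full frames per second
    print(round(field_rate, 3), round(frame_rate, 3))   # 59.94 29.97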

There is some evidence to suggest that human visual acuity drops off sharply above about 30 frames per second, though most people can still detect discontinuities in the illusion of motion up to 60-75 frames per second. What's more, there is a large body of evidence that the human eye can detect jitter at well over 300 frames per second (Steinmetz 1996). So it's a fair question to ask: why not 60? 60fps itself is an artifact of a different technology (television using 30 frames per second plus interlacing).

Ok, so we were forced into 60fps, why keep our 24 fps standard?

When making home movies first became practical (read: VCR camcorders; my father had one for years, and the thing actually took a VCR tape and wrote to it), the cameras were built for TV playback (i.e. 60 fields per second). As a result, home movies had a vastly higher frame rate than standard film. Unfortunately, that look quickly became associated with amateur production (which most home movies were). Consider feature films that include footage shot on a handheld video camera: most people can instantly discern the much faster rate, but more surprisingly, most people will tell you it looks lower quality.

The truth is, we think of 24fps as looking better because we've been trained to.

A number of directors have tried to break away from 24fps (Peter Jackson shooting at 48, James Cameron at 60), but almost always they are forced to show these movies at the old standard. People just think it looks better. Film speed (like many things) is a social phenomenon.


I hate to cite the Wikipedia entry for frame rate, but it makes sense:

In modern action-oriented games where players must visually track animated objects and react quickly, frame rates of between 30 to 60 FPS are considered acceptable by most, though this can vary significantly from game to game.

Watching film and television is an extremely passive activity.

Playing a video game, on the other hand, requires active participation.

However, then you have How Many Frames can the Humans See? which notes that the real issue is motion blur.

If you could see your moving hand very clearly and crisply, your eye would need to take more snapshots of it for the motion to look fluid. If you had a movie with 50 very sharp and crisp images per second, your eye would pick out lots of detail from moment to moment and you would get the feeling that the movie is stuttering.

Just think of modern games: have you ever played Quake at 18fps? There is no motion blur in those games, so you need many more frames per second.


I'm no expert on the subject, but here is why it makes sense to me that real-world recordings can run at a lower frame rate than animations while still looking better: an animated frame shows a single instant in time, while a recorded frame shows a small interval of time. This is not the same as just blurring the parts of the picture that have motion in them.

This is why: suppose the interval of time captured in a recorded frame is 1 millisecond, and suppose the universe runs at, say, 1 billion fps (the actual number is related to the Planck time, but let's not digress). Then the recorded frame is the average of 1 million points in time, so that one frame is actually based on a tremendous amount of information. In contrast, the animated frame has only the information from a single instant in time. So don't think of the recorded frame as just 1 frame; think of it as a summary of a million frames.

From that perspective it makes a lot of sense that animation must run at a higher fps than recordings need to. You could simulate the effect by running the computer at 1 billion fps and averaging that down to just 24 fps. I'm sure 1 billion fps would be overkill for that purpose, and it would be interesting to know at what point diminishing returns kick in. That might be a very low number like 60 or 100.
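As a sketch of that idea (not any particular renderer's method): instead of literally running at a billion fps, take N instantaneous samples, "subframes", inside each output frame's interval and average them. Below, a hypothetical 1-pixel-wide bright object crosses a 24-pixel strip during a single 1/24 s frame:

    def render_instant(t, width=24):
        # Instantaneous image: object at pixel int(576 * t); at 576 px/s it
        # travels the full 24 px strip in exactly one 1/24 s frame.
        img = [0.0] * width
        x = int(576 * t)
        if 0 <= x < width:
            img[x] = 1.0
        return img

    def render_frame(frame_start, frame_dt, subframes):
        # Average `subframes` instantaneous renders spread across the interval.
        acc = [0.0] * 24
        for s in range(subframes):
            t = frame_start + (s + 0.5) * frame_dt / subframes
            acc = [a + p for a, p in zip(acc, render_instant(t))]
        return [round(a / subframes, 2) for a in acc]

    print(render_frame(0.0, 1 / 24, subframes=1))    # crisp: one lit pixel
    print(render_frame(0.0, 1 / 24, subframes=96))   # dim smear along the whole path

With one subframe the output is a single bright pixel; with 96 the object's light is spread thinly along its entire path, which is what a real shutter records. Renderers call this accumulation or temporal supersampling, and as suggested above, the returns diminish quickly once there are enough subframes to cover the motion.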

So recorded frames are more blurred than animated ones. The blur in recorded frames carries a lot of extra information about what happens between frames, while simply adding blur to an animated frame removes information. This is similar to the difference between blurring and anti-aliasing, except we are working with time instead of space.