Can someone explain to me the correlation between bit rate and frame rate?

I understand that the higher the FPS, the smoother the image will look at your current resolution. This also corresponds to the refresh rate of the monitor. Your FPS can only go as high as your refresh rate (60 Hz, 120 Hz, etc.) until you get weird effects. (If this is wrong, please explain that as well.)

How does bit rate fall into play? How does it relate to FPS?


Solution 1:

Let me take a completely different approach, to add to Johannes' answer.

FPS, or "frames per second", originates in the film world. 35mm film used to run at 24 frames per second. In other words, the motion you saw on screen as continuous was actually due to "persistence of vision" creating fluid motion out of 24 discrete images, or frames. With video came other frame rates. Countries with 110-volt electricity at 60 Hz chose 30 fps, as it was easy to keep perfect time this way. Other countries with 50 Hz mains chose 25 frames per second (see the relation?).

When digital video and encoding came into the picture, we started talking in terms of bitrates, or the amount of information contained (in a frame or per second). With the same kind of encoding algorithm, higher bitrates usually offer better quality. But the same amount of data in a smaller frame (= fewer pixels) also means better quality. And obviously, since bitrate is measured per second, that data has to be spread over more frames when there are more frames per second.

Conversely, if you have a fixed budget of, say, 1024 KB/s (kilobytes per second), then 60 fps gives lower quality than 24 fps, because at 24 fps there are fewer frames over which those 1024 KB are distributed: 1024 / 24 ≈ 42.7 KB per frame, versus only about 17.1 KB per frame at 60 fps.
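
Just to make that arithmetic explicit, here is a minimal sketch in Python, assuming a constant bitrate split evenly across frames (real codecs shift bits between frames, so this is a simplification):

```python
def per_frame_budget_kb(bitrate_kb_per_s: float, fps: float) -> float:
    """Average kilobytes available per frame at a given bitrate and frame rate."""
    return bitrate_kb_per_s / fps

# The same 1024 KB/s budget spread over different frame rates:
print(per_frame_budget_kb(1024, 24))  # ~42.7 KB per frame
print(per_frame_budget_kb(1024, 60))  # ~17.1 KB per frame
```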

Although this is simplistic in many ways, it does give a general idea.

Solution 2:

FPS is usually the number of frames per second that your graphics card generates, so it can go higher than your monitor's refresh rate. If a new frame is generated while the monitor is refreshing the image, the monitor switches to the new frame mid-refresh: the parts of the screen that were already redrawn in the current cycle show the previous frame, while the parts that have not been redrawn yet show the new one. Therefore you can tell a (minor) difference between 60 FPS and 90 FPS even on a 60 Hz monitor. Usually, though, the effect is negative: you may see artifacts known as "tearing" as a result of switching frames mid-image. A frame rate higher than the refresh rate is mostly useless, unless you're into pro gaming (and even there, the response time of your monitor's pixels most likely has a bigger impact).
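
As a rough illustration of why 90 FPS can still look slightly different on a 60 Hz monitor, here is a toy model in Python (an assumption-heavy sketch: it only tracks which finished frame is newest at each refresh and ignores mid-refresh tearing):

```python
import math

def frame_shown_at_each_refresh(fps: float, refresh_hz: float, refreshes: int = 10) -> list[int]:
    """Index of the newest completed GPU frame at the start of each
    monitor refresh (toy model: no vsync, tearing ignored)."""
    return [math.floor(fps * r / refresh_hz) for r in range(refreshes)]

print(frame_shown_at_each_refresh(60, 60))  # [0, 1, 2, 3, 4, 5, 6, 7, 8, 9]
print(frame_shown_at_each_refresh(90, 60))  # [0, 1, 3, 4, 6, 7, 9, 10, 12, 13]
```

At 90 FPS the monitor still shows only 60 images per second, but on some refreshes it picks up a more recent frame, which is where the small perceptible difference comes from.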

Bitrate measures how many bits per second your graphics card can output. The link between FPS and BPS is, therefore, the size of one frame, which comes down to how many bits you need to encode one pixel.
Modern systems use 32-bit color. So if we assume you're working with a resolution of, let's say, 1000x100 (not that it's a common one, but it's easy to do the math with ;) ), a bitrate of 320 000 000 BPS leads to 100 FPS (320 000 000 / (1000 × 100 × 32) = 100).
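
To show that calculation in code, a small sketch (assuming raw, uncompressed frames where every pixel takes the full 32 bits):

```python
def max_fps(bits_per_second: int, width: int, height: int, bits_per_pixel: int = 32) -> float:
    """Frame rate a raw, uncompressed stream of the given bitrate can
    carry at the given resolution and color depth."""
    bits_per_frame = width * height * bits_per_pixel
    return bits_per_second / bits_per_frame

# The example from above: 320,000,000 bps at 1000x100 with 32-bit color
print(max_fps(320_000_000, 1000, 100))  # 100.0
```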