Is there any reason to limit my FPS?

I recently built a new computer. It's not mind-blowingly amazing, but it's pretty solid, and I can play any game in my Steam library on it. I've been messing around with graphics settings in Borderlands 2. Right now I have everything on high, with the FPS set to unlimited. The FPS counter varies wildly from 60 to 200, although during normal gameplay it stays pretty close to 70-90. I have not experienced any screen tearing. I've noticed that there are settings for limiting the frames per second. The settings are:

  • smoothed 22-62
  • capped 30/50/60/72/120
  • unlimited

Is there any reason to NOT go with unlimited?


Capping your framerate can have a few benefits:

  • Decreased energy consumption
  • Decreased heat production
  • Decreased noise (cooling fans run slower)

Capping your framerate is especially beneficial for laptops and other mobile computers, as it's an excellent way to keep a laptop from eating its battery alive and from burning a hole in your crotch.

Keep in mind that capping your framerate isn't the same as using v-sync.

Using a framerate cap will not provide a reduction in screen tearing. Framerate caps simply throttle the number of frames your video card produces; they do not force the video card to wait until the monitor has begun a new refresh cycle.
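To make the distinction concrete, here is a rough sketch of what a simple sleep-based cap does. This is a minimal illustration only; the 60 FPS target and the function names are made up, and real engines use higher-resolution timers:

import time

TARGET_FPS = 60                      # illustrative cap, not tied to any monitor
FRAME_BUDGET = 1.0 / TARGET_FPS      # ~16.7 ms per frame

def run_capped_loop(render_frame, still_running):
    # Throttle how often we *produce* frames. Note that we never ask when
    # the monitor will actually scan a frame out, so tearing is still possible.
    while still_running():
        start = time.perf_counter()
        render_frame()
        elapsed = time.perf_counter() - start
        if elapsed < FRAME_BUDGET:
            time.sleep(FRAME_BUDGET - elapsed)  # sleep away the rest of the budget

V-sync, by contrast, makes frame presentation wait for the monitor's next refresh, which is why only it can remove tearing.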

If the framerate you are producing hovers around 55-75 FPS, you might want to cap it at 50 or 60: the spikes to 75 will make the game feel like it is slowing down whenever it drops back to 55. This is just an example; your actual framerates may vary.
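To put rough numbers on that (purely illustrative arithmetic using the figures above):

fast, slow, cap = 75, 55, 60         # FPS values from the example above
print(1000 / fast)                   # ~13.3 ms per frame at the peaks
print(1000 / slow)                   # ~18.2 ms per frame at the dips, a ~37% jump
print(1000 / cap)                    # ~16.7 ms per frame, every frame, when capped

It's that sudden jump in frame time, not the average FPS, that feels like the game slowing down.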

Using v-sync provides all of the benefits of a framerate cap listed above, with the added benefit of eliminating screen tearing; however, it does have the drawback of adding some latency.


Summary

Generating more frames than your monitor can display is a waste of energy; screen tearing can only be eliminated with v-sync, but it can be minimized by capping FPS at the monitor's refresh rate. Input latency, however, is a related consideration.

Detail

Drawing on my experience writing GUI systems, it seems to me that the following must be true.

Given two quantized systems, (a) image generation (the GPU) and (b) image display (the monitor), any disparity between their rates must on occasion result in tearing unless the two are prevented from occurring concurrently (by syncing the GPU and monitor), and any FPS in excess of the monitor's physical limits is wasted. Ideally, an exact match in rate, offset by half the interval, would minimize or eliminate tearing and waste no compute.

This is easily visualized as follows, assuming a 62.5 Hz monitor and whole milliseconds for simplicity:

Time (16'ms)  : 0---------------1---------------2---------------3
Monitor Frames: x---------------x---------------x---------------x

GPU @ 62.5 FPS: x---------------x---------------x---------------x
GPU @ 62.5 FPS: --------x---------------x---------------x--------

GPU @ 125 FPS : x-------x-------x-------x-------x-------x-------x
GPU @ 125 FPS : ----x-------x-------x-------x-------x-------x----

GPU @ 90 FPS  : x----------x----------x----------x-----------x---
GPU @ 90 FPS  : ------x----------x----------x-----------x--------

Monitor Frames: x---------------x---------------x---------------x
Time (16'ms)  : 0---------------1---------------2---------------3

As can be seen, if the monitor is 62.5 Hz, then the optimal frame rate is 62.5 FPS; at 90 FPS (which is actually one frame every ~11.1 ms) there is a mismatch that wastes roughly one frame every two to three intervals; at 125 FPS one frame is lost every interval. If you never saw it, was the frame needed?
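If you'd rather check the arithmetic than squint at the dashes, a quick simulation gives the same picture. This is my own rough model, assuming the monitor simply shows the most recently completed frame at each refresh (no v-sync):

REFRESH_HZ = 62.5
DURATION = 1.0                       # simulate one second

def wasted_frames(gpu_fps):
    # Times (in seconds) of each monitor refresh and each completed GPU frame.
    refreshes = [i / REFRESH_HZ for i in range(int(REFRESH_HZ * DURATION))]
    frames = [i / gpu_fps for i in range(int(gpu_fps * DURATION))]
    shown = set()
    for t in refreshes:
        finished = [f for f in frames if f <= t]
        if finished:
            shown.add(finished[-1])  # the latest finished frame wins
    return len(frames) - len(shown)

for fps in (62.5, 90, 125):
    print(fps, wasted_frames(fps))

Running it: 62.5 FPS wastes nothing, 90 FPS wastes roughly 28 of its 90 frames, and 125 FPS wastes roughly 63 of its 125.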

Thus, given the frame-rate constraint of the physical device, no rate greater than the monitor's capability can be perceived, since the additional frames are simply never seen. It follows that capping FPS at the monitor's refresh rate yields the maximum perceivable motion quality, and that generating any frames over and above that rate simply wastes electricity.

Given that you can't prevent tearing without monitor syncing no matter what you do, it's still much better to cap the frame rate and put that compute power into generating more detail per frame; once that maximum is reached, just save the energy and reduce the waste heat.

My two cents.

Visual Acuity

Playing into this as well, even if you have a 144 Hz monitor, is what you can actually perceive. Most data indicates that FPS matters only up to about 60-120 Hz, with the conclusion that roughly 90 Hz (FPS) is the cut-off for visually perceptible improvement for most people. But every individual is unique, and regular gamers are among the most sensitive to motion artifacts, because the visual system can be trained.

A nice article on the subject is http://www.pcgamer.com/how-many-frames-per-second-can-the-human-eye-really-see/

Input Latency

(Thanks to Atli for this comment.)

With all that said, it's worth pointing out here that capping FPS to the screen refresh rate can create noticeable input latency when gaming. For example, with a 16 ms refresh interval, if a frame is rendered in 1 ms immediately after a refresh, it has to sit there for 15 ms waiting, effectively creating a 15 ms lag between a movement and the display of that movement.

Games are also prone to tying their AI/logic cycles to the rendering cycle, creating lag there as well. Rendering at 3x the screen refresh rate makes this a lot less noticeable, as the frame that is actually shown was rendered closer to its display time.
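As a back-of-the-envelope comparison (illustrative numbers only, assuming a 60 Hz screen and the 1 ms render time from the example above):

REFRESH_MS = 1000 / 60               # ~16.7 ms between scan-outs
RENDER_MS = 1                        # hypothetical render time

for cap_multiple in (1, 2, 3):
    frame_interval = REFRESH_MS / cap_multiple
    # Under the simple model above: a frame finishes RENDER_MS after a tick
    # and then sits idle until the next opportunity to be shown.
    worst_wait = frame_interval - RENDER_MS
    print(f"cap at {cap_multiple}x refresh: up to ~{worst_wait:.1f} ms of added lag")

Capping at 1x the refresh leaves up to ~15-16 ms of sit-around time; at 3x it drops to roughly 5 ms.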

For such a use case, capping at an appropriate multiple of the screen refresh rate can still be worthwhile with a modern GPU, while keeping latency under a desired threshold.