What is V-Sync and when should I enable it?

Most modern 3D games have a graphics option called "V-Sync". It's usually a simple on/off setting. I'd like to understand the settings I tweak, and this one kind of eludes me.

What does "V-Sync" mean, and when should I turn it on?


I've tried a few things to answer my own question:

  • Search Arqade. There's a more in-depth question about how to go about enabling it, another specific question about why limiting FPS to the screen's refresh rate won't prevent tearing, as well as several other game- or hardware-specific ones, but no question or answer that explains what it is or when to use it at all.
  • Use Google search, which leads to a decent but very brief forum post on Tom's Hardware. It explains to some degree what V-Sync is, but not in much detail, and it says nothing about when you'd turn it on.
  • A sub-entry on Wikipedia, which is (a) rather short and (b) obviously encyclopedic rather than aimed at a gamer deciding whether to use the option.

Finally, I've tried messing with the setting in various games, mainly after I noticed (and googled) the "tearing" of images in some games, but a gamer-oriented explanation of the what and the when still eludes me...

As one gamer to another: please tell me what the option is, and give me a rule of thumb for when I should (and should not) turn it on?


Solution 1:

V-Sync is short for "Vertical Synchronization"; its only purpose is to avoid screen tearing in games.

What is screen tearing?

[Image: a simulated screen tear. By Vanessaezekowitz, via Wikimedia Commons, used under the CC BY-SA 3.0 license.]

Screen tearing happens because your GPU sends a new frame to your screen while the latter hasn't yet finished displaying the previous one. You are essentially seeing part of one image and part of another at the same time. Since the two images usually look very similar, the picture appears to have been torn apart, hence the name "screen tearing".
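To make the mechanism concrete, here is a toy Python sketch of a tear. Everything in it (the 8-line "screen", the scan_out function) is made up for illustration; it is not how a real display works, only the shape of the problem:

    # A toy model of a tear: the display reads the framebuffer top to bottom,
    # and without V-Sync the GPU may overwrite it mid-scan. All names here are
    # made up for illustration; a real screen has thousands of scanlines.

    SCREEN_HEIGHT = 8                      # scanlines; tiny on purpose

    frame_a = ["old"] * SCREEN_HEIGHT      # the frame being displayed
    frame_b = ["new"] * SCREEN_HEIGHT      # the frame the GPU just finished

    def scan_out(framebuffer, gpu_swaps_at=None):
        """Read the framebuffer line by line, like one display refresh."""
        shown = []
        for line in range(SCREEN_HEIGHT):
            if line == gpu_swaps_at:
                framebuffer[:] = frame_b   # GPU overwrites mid-refresh (no V-Sync)
            shown.append(framebuffer[line])
        return shown

    print(scan_out(list(frame_a)))                  # 8 x 'old': a clean frame
    print(scan_out(list(frame_a), gpu_swaps_at=5))  # 5 x 'old' + 3 x 'new': a torn frame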

V-Sync ensures the GPU doesn't send a frame while the screen is busy displaying the previous one. There are various ways to achieve this; the best-known are double buffering and triple buffering.

When using double buffering, the GPU uses two frame buffers: the "front buffer", which holds the frame currently being sent to the screen, and the "back buffer", in which it renders the next image. With V-Sync, the two buffers are only swapped once the screen has finished its refresh, so if the back buffer is already full, the GPU has to wait.

Unlike double buffering, triple buffering uses two back buffers. Once the GPU is done with the next frame, it can start working on the second back buffer. If the screen still isn't ready by the time the GPU has filled both back buffers, the GPU can safely overwrite the first back buffer. The advantage is that this reduces the lag mentioned in point 2 (see below). The disadvantage is the higher memory requirement of one additional buffer.
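Here is a minimal Python sketch of both schemes under a deliberately simplified model: the SwapChain class and its methods are invented for illustration, buffers hold frame numbers instead of pixels, and vsync() stands in for the display's refresh signal:

    # A minimal sketch of double vs. triple buffering under V-Sync. This is a
    # toy model, not a real graphics API.

    class SwapChain:
        def __init__(self, back_buffer_count):
            self.front = None                  # frame the display is showing
            self.back = []                     # finished frames waiting to be shown
            self.capacity = back_buffer_count  # 1 = double, 2 = triple buffering

        def try_store_frame(self, frame):
            """GPU finished rendering `frame`. Returns False if the GPU must stall."""
            if len(self.back) < self.capacity:
                self.back.append(frame)        # a back buffer was free
                return True
            if self.capacity >= 2:
                self.back.pop(0)               # triple buffering: overwrite the
                self.back.append(frame)        # oldest pending frame
                return True
            return False                       # double buffering: wait for vsync

        def vsync(self):
            """Display refresh finished: promote the oldest pending frame."""
            if self.back:
                self.front = self.back.pop(0)

    chain = SwapChain(back_buffer_count=1)     # double buffering
    print(chain.try_store_frame(1))            # True  - back buffer was free
    print(chain.try_store_frame(2))            # False - GPU stalls until the refresh
    chain.vsync()                              # frame 1 goes on screen
    print(chain.try_store_frame(2))            # True  - the back buffer is free again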

Recently, new technologies intended to replace V-Sync have been introduced: G-Sync (Nvidia) and FreeSync (AMD). Just like V-Sync, their purpose is to eliminate screen tearing, but here the screen waits for the GPU: if the latter can't provide frames fast enough, the refresh is simply delayed, essentially eliminating point 3 below.


From a gamer's point of view, V-Sync does the following:

  1. It eliminates screen tearing. At least it should; I've seen games with a buggy V-Sync implementation where turning the option on did not remove tearing entirely.

  2. It introduces stutter and input lag. Since your GPU now has to wait for the screen to be ready, the frame you see on screen will almost never be the most up-to-date one. Most people don't notice it, and/or find screen tearing a worse distraction than the lag.

  3. It will affect your framerate if your GPU can't match your screen's refresh rate. For instance, it will try to feed a 60 Hz screen one frame every 16.7 ms (1/60 second). If the GPU can't produce a frame that fast, the frame has to wait for the screen's next refresh cycle before it can be displayed. If this happens consistently, your effective frame rate will bounce between whole-number fractions of the refresh rate (60, 30, 20, 15, ...); at times it will be 30 instead of 60, for example (see the sketch after this list).
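Here is the arithmetic from point 3 as a small Python sketch, assuming classic double-buffered V-Sync where every frame waits for the next refresh (effective_fps is a made-up helper, not a real API):

    # A worked version of the arithmetic in point 3: under double-buffered
    # V-Sync, a frame's duration rounds up to a whole number of refresh cycles.
    import math

    def effective_fps(render_time_ms, refresh_hz=60):
        refresh_interval_ms = 1000.0 / refresh_hz           # ~16.7 ms at 60 Hz
        cycles = math.ceil(render_time_ms / refresh_interval_ms)
        return refresh_hz / cycles                          # 60, 30, 20, 15, ...

    print(effective_fps(10))   # 60.0 - the GPU keeps up, full 60 FPS
    print(effective_fps(17))   # 30.0 - just missed a refresh, halved to 30 FPS
    print(effective_fps(35))   # 20.0 - missed two refreshes, drops to 20 FPS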

As a rule of thumb:

If you're in the majority and own a typical 60 Hz display:

  • If you play first-person shooter games competitively, and/or have issues with perceived input lag, and/or your system cannot sustain at least 60 FPS in a given title, and/or you're benchmarking your graphics card, then you should turn V-Sync off.
  • If none of the above applies to you and you experience significant screen tearing, then you should turn V-Sync on.
  • As a general rule, or if you don't feel strongly either way, just keep V-Sync off.

On the other hand, if you own a gaming-oriented 120/144 Hz display (if you have one, there's a good chance you bought it specifically for its higher refresh rate):

  • You should consider turning V-Sync on only in older games where you get a sustained framerate above 120 FPS and you notice screen tearing.

If the lag disturbs you more than the tearing, you probably want to keep it off.
If you still see tearing even with V-Sync on (as in games with a buggy V-Sync implementation), try turning it off and see whether the tearing gets worse; if it doesn't, V-Sync probably isn't working at all in that particular game and there's no reason to keep it on.

Solution 2:

VSync is a feature that affects your FPS (frames per second). Instead of wasting power trying to achieve a higher FPS than your monitor can handle, it detects your monitor's refresh rate (in my case, 60 Hz) and automatically restricts your FPS to that number. So if you have multiple monitors and move Minecraft between them, this feature easily gets you the best frame quality, that is, if your computer can handle it. You can read more here: http://minecraft.gamepedia.com/Options
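For contrast, here is a minimal Python sketch of a plain software frame limiter (a hypothetical sleep-based game loop, not Minecraft's actual code). It caps the FPS as described above, but unlike real VSync it does not synchronize with the monitor's refresh, which is why an FPS cap by itself doesn't prevent tearing:

    # A toy sleep-based frame limiter. It stops the GPU from wasting power on
    # extra frames, but does not align frames with the display's refresh.
    import time

    TARGET_FPS = 60
    FRAME_BUDGET = 1.0 / TARGET_FPS              # seconds per frame at 60 Hz

    def game_loop(num_frames):
        for _ in range(num_frames):
            start = time.perf_counter()
            # ... update and render the frame here ...
            elapsed = time.perf_counter() - start
            if elapsed < FRAME_BUDGET:
                time.sleep(FRAME_BUDGET - elapsed)   # cap the frame rate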

Edit: Yes, after reading your post again: tearing is especially likely when your frame rate goes above your monitor's refresh rate (though it can occur whenever a frame is delivered mid-refresh). Enabling VSync is the best way to avoid it.