Difference between cl_interp_ratio 1 and 2
There are plenty of forum posts on the "perfect" `cl_interp`, but no one has really mentioned `cl_interp_ratio`. Most players set it to 1 or 2 (the default), but not many explain why.
What is the reason behind this? What are the differences between the values? What does `cl_interp_ratio` even do?
To fully understand `cl_interp` and `cl_interp_ratio`, you need to understand what these numbers are actually doing behind the scenes. Setting them randomly can leave your connection worse off than it was with the defaults.
I'm going to give a short overview of interpolation, what it means for games on the Source engine (such as TF2), what the different values of `cl_interp` and `cl_interp_ratio` mean, and how to test your connection to find the values that work for it.
Interpolation
Interpolation is a mathematical term: it is a means of estimating unknown data points from the known data points around them. As a very simplistic example, imagine you're waiting to receive eight numbers, but only five arrive:

4, 8, ?, ?, 20, 24, ?, 32

Now, based on the data you did receive, what do you assume the three missing numbers are? If you said 12, 16, and 28, that is because you recognised the pattern was going up by 4, and assumed the pattern would stay the same.
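Filling those gaps is just linear interpolation. A tiny sketch in Python (purely illustrative, not anything TF2 actually runs):

```python
def lerp(a, b, t):
    """Linearly interpolate between known samples a and b, with t in [0, 1]."""
    return a + (b - a) * t

# Known samples 8 and 20 sit three steps apart; the two missing values
# fall one third and two thirds of the way between them.
print(round(lerp(8, 20, 1 / 3), 6))   # -> 12.0
print(round(lerp(8, 20, 2 / 3), 6))   # -> 16.0
print(round(lerp(24, 32, 1 / 2), 6))  # -> 28.0
```

This is the same idea the engine applies to player positions: given two known snapshots, it fills in the motion between them.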
When we talk about interpolation in games, it's usually a form of lag compensation for dropped data. The game receives data from the server continuously, but it may lose some packets along the way. According to Valve:
> A multiplayer client will typically need to render three or more frames with each server update it receives (assuming 60fps+ and `cl_updaterate 20`).
>
> Source's interpolation system prevents the jittery motion this would ordinarily lead to by buffering server updates then playing them back with the gaps smoothly interpolated between. It can also protect against glitches caused by packet loss.
>
> — Valve: Interpolation
What this means in Source games (TF2)
Without interpolation, other players would seem to 'stutter' or 'jitter' around on screen as they run, especially on bad connections that regularly drop packets. However, interpolation adds artificial latency to your view of the game world, as the client needs to buffer a few updates in order to fill in missing ones and display everything smoothly. So fiddling with these values is a balancing act between regular latency (bad connections) and artificial latency (interpolation).
So what does this mean for `cl_interp` and `cl_interp_ratio`?
- `cl_interp` should (almost always) be 0, as this ensures your client is tuned to the precise update rate of the server. This value sets the minimum interpolation delay ("lerp"); raising it increases lerp, and therefore increases the artificial latency. The default is `cl_interp 0.1`, which is 100ms of lag and a carryover from the days of dial-up internet.
- `cl_interp_ratio` can vary, and is the setting you want to fiddle with in order to tune your interpolation. By setting `cl_interp_ratio` to 1, you buffer only one server update to smooth over the next (missing) one. If you happen to drop two updates in a row, the client is forced to guess (which leads to 'jittering' positions). So the right value really depends on the quality of your connection.
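The two cvars combine into a single effective delay: per the Valve Developer Wiki, the interpolation period is the larger of `cl_interp` and `cl_interp_ratio / cl_updaterate`. A quick sketch of that arithmetic, assuming a server sending 66 updates per second (typical for TF2, an assumption on my part):

```python
def effective_lerp_ms(cl_interp, cl_interp_ratio, cl_updaterate):
    """Effective interpolation delay in milliseconds: the larger of the
    fixed cl_interp floor and cl_interp_ratio server-update intervals."""
    return max(cl_interp, cl_interp_ratio / cl_updaterate) * 1000

print(effective_lerp_ms(0, 1, 66))    # ~15.2 ms: one update buffered
print(effective_lerp_ms(0, 2, 66))    # ~30.3 ms: two updates buffered
print(effective_lerp_ms(0.1, 1, 66))  # ~100 ms: the default cl_interp dominates
```

This is why `cl_interp 0` is recommended: with the floor out of the way, `cl_interp_ratio` alone controls your lerp.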
What you should use
- `cl_interp_ratio 1` if you have little to no packet loss. This buffers one server update for interpolation.
- `cl_interp_ratio 2` for connections with light packet loss. This buffers two updates, and helps clients that regularly drop more than one packet.
- `cl_interp_ratio 3` (or even 4) for heavy packet loss. This buffers three (or four) updates, for clients with very high packet loss.
You can also use non-integer values, for example `cl_interp_ratio 1.6`; however, this just adds interpolation time without actually buffering an extra update. Most people stick to whole numbers.
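A bit of arithmetic shows why fractional ratios are mostly pointless. Assuming 66 server updates per second (typical for TF2, an assumption on my part), the buffered delay for each ratio works out to:

```python
updaterate = 66  # snapshots per second; typical for TF2 servers (assumption)
for ratio in (1, 1.6, 2):
    lerp_ms = ratio / updaterate * 1000
    print(f"cl_interp_ratio {ratio}: {lerp_ms:.1f} ms")
# cl_interp_ratio 1: 15.2 ms
# cl_interp_ratio 1.6: 24.2 ms
# cl_interp_ratio 2: 30.3 ms
```

A ratio of 1.6 sits between one and two buffered updates: you pay roughly 24ms of delay, but still only ever have one whole extra snapshot to fall back on.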
How to test
You can turn on the net graph in TF2 using `net_graph 1` from the developer console. This shows your current lerp value:
- If you haven't adjusted any settings, the net graph will report your lerp at 100ms.
- If you set `cl_interp_ratio 1`, it should read around 15ms.
- If the lerp value is orange, you've set your ratio lower than 2.
- If it turns yellow, the game is warning you that your lerp may be lower than what the server can deliver.
- If the text remains white, TF2 considers it a safe setting.
Of course, you can also just test by playing on different settings and tweaking them if you run into issues, but the net graph gives you a more precise overview. Leave it open while you play a few games and keep an eye on the value.
Conclusions
To directly answer your question: the difference between `cl_interp_ratio 1` and `cl_interp_ratio 2` is the difference between buffering one or two updates from the server, in order to compensate for bad network connections and dropped/missing data.
You should only fiddle with `cl_interp_ratio` (not `cl_interp`), and generally `cl_interp_ratio` should be a whole number between 1 and 4.
References
- Wikipedia: Interpolation
- Valve: Interpolation
- Valve: Entity Interpolation
- Steam Community: cl_interp and cl_interp_ratio explained
- Reddit: Has anyone tried cl_interp_ratio 2
- Reddit: A small guide on cl_interp and how it influences your aim/hit detection