What's the difference between lag and jitter?

I've noticed that nowadays games (and people) only use the term 'lag'.
However, while playing BZFlag, I've noticed there are two separate notifications related to latency (the proper term for 'lag'): "Your jitter is too high (### ms), warning #/#" and "Your latency is too high (### ms), warning #/#".

I've noticed that nowadays the "lag" notification sometimes appears alongside the "jitter" notification, but I remember that back then (around 2008) only the "jitter" notification came up.

I've googled both terms and read their Wikipedia pages: the jitter article talks about variation in the intervals at which a packet's timing is measured, along with frequencies, while the latency article is about the time it takes to receive, process, and then send back a single packet.
Then again, I could be reading the wrong definition for this particular use of the terms.

Is there a difference between "Jitter" and "Latency" (if so, what?), or are they the same thing?


Lag is a noticeable delay between the time something is initiated and the time when it happens. For example, pressing an "attack" button and finding that the attack doesn't happen until a second later.

Latency is sometimes used to mean the same thing as lag, but in networking it's generally used interchangeably with ping time: the amount of time it takes for a packet to travel from point A to point B, or to travel there and back again. High packet latency generally leads to lag in a game.
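To make the "there and back again" idea concrete, here is a minimal ping-style round-trip measurement in Python. The host, port, and echo protocol are made-up placeholders for illustration, not anything BZFlag actually uses:

```python
import socket
import time

# Minimal round-trip time (RTT) measurement over UDP.
# "example.server" and port 5000 are hypothetical; a real game would
# measure this against its own server using its own protocol.
def measure_rtt(host="example.server", port=5000, timeout=1.0):
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.settimeout(timeout)
    start = time.monotonic()
    sock.sendto(b"ping", (host, port))
    sock.recvfrom(1024)                       # wait for the echo to come back
    return (time.monotonic() - start) * 1000  # latency in milliseconds
```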

Jitter is variation in latency over time. If every packet takes exactly the same amount of time to travel from A to B, there is no jitter. If the packet delivery times are inconsistent, we call it jitter.
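One simple way to put a number on that inconsistency is to average the change between consecutive latency samples. This is just a sketch of the idea; BZFlag's exact formula may differ:

```python
def jitter_ms(latencies):
    """Average absolute change between consecutive latency samples (ms)."""
    if len(latencies) < 2:
        return 0.0
    diffs = [abs(b - a) for a, b in zip(latencies, latencies[1:])]
    return sum(diffs) / len(diffs)

# Constant latency -> no jitter; varying latency -> jitter.
print(jitter_ms([50, 50, 50, 50]))    # 0.0
print(jitter_ms([40, 80, 45, 120]))   # 50.0
```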

Jitter can be overcome with buffering, but that adds to overall latency/lag. Overcoming a lot of jitter might require buffers so large that the resulting lag would make a game terribly unresponsive, possibly not worth playing.
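As a rough illustration of that trade-off, here is a toy de-jitter buffer: every packet is held for a fixed delay so it can be released at a steady rate, and that hold time adds directly to perceived lag. The class and its interface are made up for this example:

```python
import heapq
import itertools

class JitterBuffer:
    """Toy de-jitter buffer: hold each packet for `hold_ms` after arrival,
    then release it. A larger hold absorbs more jitter but adds more lag."""

    def __init__(self, hold_ms=100):
        self.hold_ms = hold_ms
        self._heap = []                 # (release_time_ms, seq, packet)
        self._seq = itertools.count()   # tie-breaker for equal release times

    def push(self, packet, arrival_ms):
        release = arrival_ms + self.hold_ms
        heapq.heappush(self._heap, (release, next(self._seq), packet))

    def pop_ready(self, now_ms):
        out = []
        while self._heap and self._heap[0][0] <= now_ms:
            out.append(heapq.heappop(self._heap)[2])
        return out

buf = JitterBuffer(hold_ms=100)
buf.push("packet A", arrival_ms=0)
buf.push("packet B", arrival_ms=35)
print(buf.pop_ready(now_ms=50))    # []  -- nothing released yet
print(buf.pop_ready(now_ms=140))   # ['packet A', 'packet B']
```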


Lag is caused by a delay in the network, usually in multiplayer or MMO games:

  • You move.
  • Your PC sends this information (player moved to xyz) to the server (takes time).
  • The server sends this information to everyone else (takes time as well).
  • Everyone else finally sees that you moved.

The delay between you moving (locally on your machine) and other players finally seeing your new location depends on the network speed. This is usually fast enough, and that normal delay is the "latency" of the network. But if something is blocking your (or another player's) communication with the server, these updates can be delayed considerably (by seconds), so other players will see you "jump around". The sketch below walks through the same path with made-up numbers.
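This toy simulation just sums the hops listed above; every delay value is invented purely for illustration:

```python
import random

# Toy model of the update path: the time other players wait to see your move
# is the sum of every hop. All numbers here are made up for illustration.
def time_until_others_see_move(congested=False):
    upload_ms   = random.uniform(20, 40)          # you -> server
    server_ms   = random.uniform(1, 5)            # server processing
    download_ms = random.uniform(20, 40)          # server -> other players
    if congested:
        download_ms += random.uniform(500, 2000)  # something blocking the link
    return upload_ms + server_ms + download_ms

print(f"normal:    {time_until_others_see_move():.0f} ms")
print(f"congested: {time_until_others_see_move(congested=True):.0f} ms")
```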

'Jitter', on the other hand, is caused not by the network being blocked but (most commonly) by the CPU being blocked. A game runs a loop that continuously updates the screen. If some other background program blocks the CPU for a longer period (e.g. seconds), that loop is delayed and you will see the game pause for short stretches of time.
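A sketch of how a game loop could notice such a stall; the frame budget and threshold are arbitrary example values, not anything a particular engine uses:

```python
import time

TARGET_FRAME_MS = 16.7   # ~60 updates per second (example budget)

def run_loop(frames=600):
    """Toy game loop that logs when a frame takes far longer than expected,
    e.g. because another process hogged the CPU (the 'pause' described above)."""
    last = time.monotonic()
    for _ in range(frames):
        now = time.monotonic()
        frame_ms = (now - last) * 1000
        last = now
        if frame_ms > 3 * TARGET_FRAME_MS:
            print(f"stall: frame took {frame_ms:.0f} ms")
        # ... update game state and render here ...
        time.sleep(TARGET_FRAME_MS / 1000)
```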

Note: Jitter is also used for another thing (not gaming related). It can mean small errors in transmitted or captured data, or even data modified on purpose (e.g. as dithering). This is usually not a problem in gaming, since network packets are checked (with a checksum) and bad packets are re-sent or discarded.