Why does gzip compression efficiency vary in IIS?

Compression of static files is handled by dynamic compression (if dynamic compression is enabled) for as long as IIS considers the file infrequently hit. Once the file is considered frequently hit, it is compressed and cached (if static compression is enabled), and the cached version keeps being served until the file becomes infrequent again. There are two config settings in IIS that control which files count as frequently hit:

system.webServer/serverRuntime:

  • frequentHitThreshold: the number of times the same file must be requested before it is considered frequently hit and cached. Defaults to 2.
  • frequentHitTimePeriod: the time window within which the file must be requested frequentHitThreshold times in order to be cached. Defaults to 10 seconds.

Beware that regardless of the frequentHitTimePeriod you set, a frequently hit file always becomes infrequent again if it is not requested for 1 minute. I do not know whether there is a config setting for this.

Setting frequentHitThreshold to 1, for example, means the file is considered frequently hit by IIS from the very first request. This in turn bypasses dynamic compression, so the file is handled only by static compression.
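
A minimal sketch of what that could look like in a per-site web.config is below. Note that on a default install the serverRuntime section is usually locked at the server level, so you may need to put it in applicationHost.config or unlock the section first (treat the lock state as an assumption about your setup):

    <?xml version="1.0" encoding="UTF-8"?>
    <configuration>
      <system.webServer>
        <!-- Treat every static file as frequently hit from the first request,
             so it is compressed once and served from the static compression
             cache instead of being dynamically compressed per request. -->
        <serverRuntime frequentHitThreshold="1"
                       frequentHitTimePeriod="00:00:10" />
      </system.webServer>
    </configuration>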

Note that dynamic compression (level 0 by default) and static compression (level 7 by default) use different compression levels, so the same file will come back in two different sizes depending on which path served it.
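
For reference, the levels are set on the gzip scheme under system.webServer/httpCompression, which normally lives in applicationHost.config. Here is a sketch that pins both levels to the same value (the directory and dll values shown are the usual defaults, included only for completeness):

    <httpCompression directory="%SystemDrive%\inetpub\temp\IIS Temporary Compressed Files">
      <!-- staticCompressionLevel defaults to 7 and dynamicCompressionLevel to 0;
           giving them the same value removes the size difference, though not
           the ETag difference described below. -->
      <scheme name="gzip" dll="%Windir%\system32\inetsrv\gzip.dll"
              staticCompressionLevel="7" dynamicCompressionLevel="7" />
    </httpCompression>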

Also, and this is why I ran into this issue in the first place: the ETag for the same file differs between dynamic and static compression, even if you use the same compression level for both.

Hope this helps.


Apparently, on the first request for a static file, IIS does not yet have a compressed copy of the file in its compressed file cache, so it uses dynamic compression for that request. This can be resolved by setting the serverRuntime element's frequentHitThreshold attribute to 1.

This is discussed in detail here. This setting is probably only worth changing if IIS is serving content to a CDN.
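
As a minimal sketch, assuming you are editing applicationHost.config directly at the server level (where serverRuntime is not locked), the relevant fragment is just:

    <!-- applicationHost.config, inside the server-level <system.webServer> section -->
    <serverRuntime frequentHitThreshold="1" />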