Can I normalize KL-divergence to be $\leq 1$?

The Kullback-Leibler divergence has a strong relationship with mutual information, and mutual information has a number of normalized variants. Is there some similar, entropy-like value that I can use to normalize KL-divergence such that the normalized KL-divergence is bounded above by 1 (and below by 0)?


For the most general class of distributions, this multiplicative normalization approach is not possible: one can trivially choose the comparison density to be zero on some interval where the first density is positive, which makes the divergence infinite. Your approach therefore only makes sense for strictly positive densities, or more generally when both densities share the same support (perhaps densities from the same parametric family?).
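
As a quick numerical sketch of that failure mode, here are two discrete distributions (the array values are arbitrary illustrations) where the comparison distribution is zero at a point the reference distribution covers:

```python
import numpy as np
from scipy.stats import entropy

p = np.array([0.5, 0.5, 0.0])   # reference distribution
q = np.array([0.5, 0.0, 0.5])   # comparison distribution, zero where p > 0

# entropy(p, q) computes sum(p * log(p / q)), i.e. D_KL(p || q);
# the term with p > 0 and q = 0 makes the sum infinite
print(entropy(p, q))  # inf
```

No finite multiplicative constant can rescale that `inf` into $[0, 1]$.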

Instead, you might consider normalization through a nonlinear transformation such as $1 - \exp(-D_{KL})$, which maps $[0, \infty]$ monotonically onto $[0, 1]$.
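
As a minimal sketch of this transform, here it is applied to the closed-form KL divergence between two univariate Gaussians (the parameter values below are arbitrary):

```python
import numpy as np

def kl_gaussians(mu1, s1, mu2, s2):
    """D_KL(N(mu1, s1^2) || N(mu2, s2^2)), closed form."""
    return np.log(s2 / s1) + (s1**2 + (mu1 - mu2)**2) / (2 * s2**2) - 0.5

def normalized_kl(d_kl):
    """Map D_KL from [0, inf] monotonically onto [0, 1]."""
    return 1.0 - np.exp(-d_kl)

for mu in [0.0, 1.0, 5.0, 50.0]:
    d = kl_gaussians(mu, 1.0, 0.0, 1.0)   # equals mu**2 / 2 here
    print(f"mu={mu:5.1f}  D_KL={d:10.2f}  normalized={normalized_kl(d):.6f}")
```

The normalized value approaches 1 as the divergence grows without bound, and is 0 exactly when the distributions coincide.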


The KL divergence is only defined for distributions with compatible supports (the support of $P$ must be contained in the support of $Q$). Nevertheless, even when the supports agree, one can find distributions with infinite KL divergence.
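
One concrete example: take $P$ standard Cauchy and $Q$ standard normal, both supported on all of $\mathbb{R}$. Then

$$D_{KL}(P \| Q) = \int_{-\infty}^{\infty} \frac{1}{\pi(1+x^2)} \left( \frac{x^2}{2} + \log \frac{\sqrt{2\pi}}{\pi (1+x^2)} \right) dx = \infty,$$

since the $x^2/2$ term contributes an integrand that tends to $\frac{1}{2\pi} > 0$ as $|x| \to \infty$, so the integral diverges even though the supports are identical.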