What is the connectivity between Boltzmann's entropy expression and Shannon's entropy expression?

Mathematically, they differ only by a multiplicative constant. The Shannon entropy of a discrete random variable $X$, in the current information-theory literature, is defined as: $$ H(X)=\mathbb E(-\log P_X(X))=-\sum_{i=1}^{n}p_i\log p_i. $$ The base of the logarithm may vary; switching to the natural logarithm only introduces a constant $C=1/\ln b$, where $b$ is the original base: $$ H(X)=C\,\mathbb E(-\ln P_X(X))=-C\sum_{i=1}^{n}p_i\ln p_i. $$ The Boltzmann entropy (in its Gibbs form) is: $$ S=-K\sum_{i=1}^{n}p_i\ln p_i, $$ where $K$ is the Boltzmann constant and the logarithm is natural.
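The constant-multiple relationship is easy to check numerically; here is a minimal sketch in Python (the distribution is an arbitrary example of my own choosing):

```python
import math

# An arbitrary example distribution (must sum to 1).
p = [0.5, 0.25, 0.125, 0.125]

# Shannon entropy in bits (base-2 logarithm).
H_bits = -sum(pi * math.log2(pi) for pi in p)

# Shannon entropy in nats (natural logarithm).
H_nats = -sum(pi * math.log(pi) for pi in p)

# Changing the base only rescales the result: H_bits = H_nats / ln 2,
# i.e. C = 1/ln 2 when converting from base 2 to the natural log.
print(H_bits)                # 1.75
print(H_nats / math.log(2))  # 1.75
```

Multiplying `H_nats` by the Boltzmann constant would give the corresponding Gibbs entropy in physical units.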

There is an interesting story behind the term "entropy" in information theory: it was von Neumann who suggested that Shannon use "entropy" for his basic mathematical concept.

Beyond the similar appearance, the two expressions serve different purposes in information theory and statistical physics. In information theory, Shannon entropy is, intuitively, a measure of uncertainty (though other notions exist in the literature, such as Rényi entropy). It is then used as a fundamental concept for treating the problems of source coding, channel coding, rate distortion, etc. Entropy is a primitive concept here: it is either derived from a set of axioms or defined directly.
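As an aside, the Rényi entropy of order $\alpha$ is $H_\alpha(X)=\frac{1}{1-\alpha}\log\sum_i p_i^\alpha$, and it recovers Shannon entropy in the limit $\alpha\to 1$. A small sketch illustrating this (the function names are mine, not from any library):

```python
import math

def renyi_entropy(p, alpha):
    """Rényi entropy of order alpha (alpha > 0, alpha != 1), in nats."""
    return math.log(sum(pi ** alpha for pi in p)) / (1.0 - alpha)

def shannon_entropy(p):
    """Shannon entropy in nats."""
    return -sum(pi * math.log(pi) for pi in p if pi > 0)

p = [0.5, 0.25, 0.25]

# As alpha -> 1, the Rényi entropy approaches the Shannon entropy.
print(renyi_entropy(p, 1.000001))
print(shannon_entropy(p))
```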

In statistical mechanics, by contrast, the entropy is derived from other existing concepts, such as the partition function or the Helmholtz free energy.
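As a standard illustration of such a derivation (this is textbook material, not specific to any of the references below): for a system in the canonical ensemble with $p_i=e^{-\beta E_i}/Z$, the Helmholtz free energy is $F=-KT\ln Z$, and $S=(U-F)/T$. Since $\ln p_i=-\beta E_i-\ln Z$, we have $E_i/T=-K\ln p_i-K\ln Z$, so $$ S=\frac{U-F}{T}=\frac{1}{T}\sum_i p_iE_i+K\ln Z=-K\sum_i p_i\ln p_i, $$ which is exactly Shannon's expression up to the constant $K$.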

A deeper connection between the two concepts can be explored by finding connections between statistical physics and information theory. The work of Jaynes is a valuable piece connecting these two domains.


For more recent connections between the two domains, see:

*Information, Physics, and Computation*, Marc Mézard and Andrea Montanari

*Statistical Physics and Information Theory*, Neri Merhav


Arieh Ben-Naim, a researcher at the Hebrew University of Jerusalem, has one of the most unifying theories connecting Shannon entropy with Boltzmann's entropy. In particular, I recommend his textbook *A Farewell to Entropy: Statistical Thermodynamics Based on Information*, which goes into great detail on this topic and discusses the attempts of others. Additionally, E. T. Jaynes' seminal paper on the topic and some of Léon Brillouin's work attempt to link the two, but Ben-Naim has a more refined version and discusses the subtleties of the others.