Why would you take the logarithmic derivative of a generating function?

Solution 1:

There is more than one way to interpret the logarithmic derivative. One way is as a sequence transform related to sequence recursions. For example, suppose that we have two sequences with corresponding exponential generating functions $ A(x) = \sum_{n=0}^\infty a_n x^n/n!, \, B(x) = \sum_{n=0}^\infty b_n x^n/n! $ related by $\, A\,'(x) = A(x) B(x). \,$ This means that $\, a_{n+1} = \sum_{k=0}^n {n \choose k} a_k b_{n-k}, \,$ a recursion for the sequence $\,a\,$ given by binomial convolution with the other sequence $\,b.\,$ Another way to write the relation between the generating functions is $\, B(x) = \big(\log A(x)\big)' = A'(x)/A(x); \,$ thus $\, B(x) \,$ is the logarithmic derivative of $\, A(x). \,$ Turning this around, we have $\, A(x) = \exp\big(\int B(x)\, dx\big), \,$ with the constant of integration fixed by $\, a_0. \,$
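Here is a small Python (SymPy) sketch of this transform. The choice $b_k = 1$ for all $k$ is purely illustrative and not part of the discussion above; with that choice $B(x) = e^x$, so $A(x) = \exp(e^x - 1)$ and the $a_n$ are the Bell numbers.

```python
import sympy as sp

x = sp.symbols('x')
N = 8

# Illustrative choice (an assumption, not from the answer above): b_k = 1 for all k, so B(x) = e^x.
b = [1] * N

# Build a_n from the binomial-convolution recursion a_{n+1} = sum_k C(n,k) a_k b_{n-k}, with a_0 = 1.
a = [1]
for n in range(N - 1):
    a.append(sum(sp.binomial(n, k) * a[k] * b[n - k] for k in range(n + 1)))

# Compare with the coefficients of A(x) = exp(integral of B) = exp(e^x - 1), the Bell numbers.
s = sp.exp(sp.exp(x) - 1).series(x, 0, N).removeO()
print(a)                                                    # [1, 1, 2, 5, 15, 52, 203, 877]
print([sp.factorial(n) * s.coeff(x, n) for n in range(N)])  # the same list
```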

A simple example of this is OEIS sequence A000085, the number of permutations that are involutions. One recursion is $\, a_{n+1} = a_n + n\, a_{n-1}, \,$ which corresponds to $\, b_0 = b_1 = 1 \,$ and $\, b_n = 0 \,$ for $\, n \ge 2. \,$ Thus, the exponential generating function of the sequence is $\, A(x) = \exp(x + x^2/2!). \,$
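A quick SymPy check of this example (a sketch; the printed values are the first terms of A000085):

```python
import sympy as sp

x = sp.symbols('x')
N = 8

# Recursion a_{n+1} = a_n + n*a_{n-1}, with a_0 = a_1 = 1.
a = [1, 1]
for n in range(1, N - 1):
    a.append(a[n] + n * a[n - 1])

# Coefficients of exp(x + x^2/2!), scaled by n!.
s = sp.exp(x + x**2 / 2).series(x, 0, N).removeO()
print(a)                                                    # [1, 1, 2, 4, 10, 26, 76, 232]
print([sp.factorial(n) * s.coeff(x, n) for n in range(N)])  # the same list
```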

Another simple example is OEIS sequence A182386, which is related to derangements. One simple recursion is $\, a_{n+1} = -(n+1)a_n + 1, \,$ but more useful for our purpose is the recursion $\, a_{n+1} = \sum_{k=1}^n {n \choose k} (-1)^{k-1} k! \, a_{n-k}, \,$ which implies that the exponential generating function of the sequence is $\, \exp(x)/(1+x). \,$
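A similar sketch for this example; the terms produced by the recursion are the derangement numbers with alternating signs, matching the coefficients of $\exp(x)/(1+x)$:

```python
import sympy as sp

x = sp.symbols('x')
N = 8

# Recursion a_{n+1} = -(n+1)*a_n + 1, with a_0 = 1.
a = [1]
for n in range(N - 1):
    a.append(-(n + 1) * a[n] + 1)

# Coefficients of exp(x)/(1 + x), scaled by n!.
s = (sp.exp(x) / (1 + x)).series(x, 0, N).removeO()
print(a)                                                    # [1, 0, 1, -2, 9, -44, 265, -1854]
print([sp.factorial(n) * s.coeff(x, n) for n in range(N)])  # the same list
```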

Solution 2:

For starters, let's talk about why you might want to take the logarithm of an exponential generating function. The starting point here is the exponential formula, one version of which, roughly speaking, says that if $A(x) = \sum a_n \frac{x^n}{n!}$ is the exponential generating function of structures of some kind (e.g. graphs) which have a decomposition into connected components, then $\log A(x)$ is the exponential generating function of connected structures (e.g. connected graphs). This is a powerful and general result and has many applications, in both directions (taking logs and taking exponentials). As a simple example, the EGF for the number of ways to partition a set into subsets with cardinalities lying in some $S \subseteq \mathbb{N}$ is

$$\exp \left( \sum_{n \in S} \frac{x^n}{n!} \right).$$
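As one illustration (the choice $S = \{2\}$ is mine, not from the answer): with blocks of size $2$ only, the formula gives $\exp(x^2/2)$, whose coefficients count perfect matchings, $1, 0, 1, 0, 3, 0, 15, \ldots$ A short SymPy check:

```python
import sympy as sp

x = sp.symbols('x')
N = 10

# S = {2}: partitions of an n-set into blocks of size 2, i.e. perfect matchings (illustrative choice).
s = sp.exp(x**2 / sp.factorial(2)).series(x, 0, N).removeO()
print([sp.factorial(n) * s.coeff(x, n) for n in range(N)])  # [1, 0, 1, 0, 3, 0, 15, 0, 105, 0]
```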

The exponential formula comes in a "cyclic form" where instead of thinking of $\log A(x)$ as an exponential generating function we write it in the form $\sum b_n \frac{x^n}{n}$; see this blog post for full details. This version of the exponential formula implies, for example, that the EGF for the number of permutations in $S_n$ whose cycles have cardinalities lying in some $S \subseteq \mathbb{N}$ is

$$\exp \left( \sum_{n \in S} \frac{x^n}{n} \right).$$
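Again as an illustration (the choice $S = \{2, 3, 4, \ldots\}$ is mine): permutations with no cycles of length $1$ are the derangements, and the formula gives $\exp\big(\sum_{n\ge 2} x^n/n\big) = e^{-x}/(1-x)$, their usual EGF. A short SymPy check:

```python
import sympy as sp

x = sp.symbols('x')
N = 8

# S = {2, 3, 4, ...}: permutations with no fixed points, i.e. derangements (illustrative choice).
# The exponent is truncated, which does not affect coefficients below x^N.
A = sp.exp(sum(x**n / n for n in range(2, N)))
sA = A.series(x, 0, N).removeO()
print([sp.factorial(n) * sA.coeff(x, n) for n in range(N)])  # [1, 0, 1, 2, 9, 44, 265, 1854]

# The same coefficients from the closed form e^{-x}/(1 - x).
sB = (sp.exp(-x) / (1 - x)).series(x, 0, N).removeO()
print([sp.factorial(n) * sB.coeff(x, n) for n in range(N)])
```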

This version of the exponential formula is more relevant to taking logarithmic derivatives, since differentiating $\log A(x) = \sum b_n \frac{x^n}{n}$ turns each term into $b_n x^{n-1}$ and so removes the factor of $n$: the logarithmic derivative $\frac{A'(x)}{A(x)} = \sum b_n x^{n-1}$ has the $b_n$ as ordinary coefficients.

There's a lot more to say here, including a general interpretation of what it means to compose generating functions; for more see the first half of Analytic Combinatorics.

Solution 3:

Let $X$ be a real-valued random variable. Then we have $$ \operatorname E(e^{tX}) = 1 + m_1 t + m_2 \frac{t^2} 2 + m_3 \frac {t^3} 6 + m_4 \frac{t^4}{24} + \cdots $$ and $m_k = \operatorname E(X^k)$ is the $k$th moment of the probability distribution of $X.$
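For a concrete instance (the Exponential(1) distribution is an illustrative assumption, not part of the answer): its moment generating function is $1/(1-t)$, and reading the series off term by term recovers the moments $m_k = k!$.

```python
import sympy as sp

t = sp.symbols('t')

mgf = 1 / (1 - t)   # E(e^{tX}) for an Exponential(1) variable (an illustrative assumption)
s = sp.series(mgf, t, 0, 6).removeO()
print([sp.factorial(k) * s.coeff(t, k) for k in range(1, 6)])   # [1, 2, 6, 24, 120], i.e. m_k = k!
```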

The $k$th central moment of the distribution is $\mu_k(X)=\operatorname E((X-m_1)^k).$ The central moment enjoys the properties of shift invariance, which means $\mu_k(X+c) = \mu_k(X)$ for constants $c,$ and homogeneity, which means $\mu_k(cX) = c^k \mu_k (X).$ But only when $k=2\text{ or }3$ does it enjoy the property of additivity, which means that if $X_1,\ldots, X_n$ are independent random variables, then $\mu_k(X_1+\cdots+X_n) = \mu_k(X_1)+\cdots+\mu_k(X_n).$

However, for each $k\ge2$ there is a $k$th-degree polynomial in the first $k$ moments that simultaneously has all three properties. It is called the $k$th cumulant. The fourth cumulant is the fourth central moment minus $3$ times the square of the second central moment. (The second and third cumulants are merely the second and third central moments.) For $k\ge2,$ the cumulants are fully characterized by this description (a polynomial in the first $k$ moments with all three properties), together with the condition that the coefficient of the $k$th moment is $1.$
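Here is a sketch of the additivity claim, using moment generating functions and the (purely illustrative) Exponential(1) distribution: for the sum of two independent copies, the fourth central moment is not additive, but the fourth cumulant $\mu_4 - 3\mu_2^2$ is.

```python
import sympy as sp

t = sp.symbols('t')

def central_moments(mgf, kmax):
    """Central moments mu_2, ..., mu_kmax, read off the shifted MGF E[e^{t(X - m_1)}]."""
    m1 = sp.diff(mgf, t).subs(t, 0)
    shifted = sp.exp(-m1 * t) * mgf
    s = sp.series(shifted, t, 0, kmax + 1).removeO()
    return [sp.factorial(k) * s.coeff(t, k) for k in range(2, kmax + 1)]

mgf_X = 1 / (1 - t)                            # Exponential(1), an illustrative assumption
mu2, mu3, mu4 = central_moments(mgf_X, 4)
s2, s3, s4 = central_moments(mgf_X**2, 4)      # MGF of the sum of two independent copies

print(s4, 2 * mu4)                             # 24 vs 18: mu_4 is not additive
print(s4 - 3 * s2**2, 2 * (mu4 - 3 * mu2**2))  # 12 vs 12: the fourth cumulant is additive
```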

Theorem: The exponential generating function of the sequence of cumulants (where the $1$st cumulant is $m_1$ as defined above, so it is shift-equivariant rather than shift-invariant like the higher cumulants) is the logarithm of the exponential generating function of the moments.
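A small SymPy sketch of the theorem, using the Poisson($\lambda$) distribution as an illustrative assumption (its moment generating function is $\exp(\lambda(e^t - 1))$, so every cumulant should come out equal to $\lambda$):

```python
import sympy as sp

t, lam = sp.symbols('t lambda', positive=True)

mgf = sp.exp(lam * (sp.exp(t) - 1))   # moment EGF of a Poisson(lambda) variable (illustrative)
cgf = sp.log(mgf)                     # its logarithm, the cumulant generating function

s = sp.series(cgf, t, 0, 6).removeO()
cumulants = [sp.simplify(sp.factorial(k) * s.coeff(t, k)) for k in range(1, 6)]
print(cumulants)  # [lambda, lambda, lambda, lambda, lambda]
```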

Solution 4:

One thing to add on to Michael's excellent answer.

A random variable $X$ (with all cumulants) is normal iff $\kappa_n(X)=0$ for all $n\ge 3$. Furthermore, if a sequence of random variables $X_k$ (with all cumulants) is such that $\kappa_1(X_k)$ and $\kappa_2(X_k)$ converge to the mean and variance of a normal distribution and $\kappa_n(X_k)\to 0$ for all $n\ge 3$, then $X_k$ converges in distribution to that normal distribution.

In practice this is often much easier to check than showing that all moments converge to the corresponding moments of a normal distribution.
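As an illustration of why this is convenient (the standardized sum of $n$ i.i.d. Exponential(1) variables is my choice of example, not part of the answer): the cumulant generating function is available in closed form, and the cumulants of order $\ge 3$ visibly tend to $0$ as $n \to \infty$, which is the central limit theorem in cumulant form.

```python
import sympy as sp

t, n = sp.symbols('t n', positive=True)

# Cumulant generating function of S_n = (X_1 + ... + X_n - n)/sqrt(n)
# for i.i.d. Exponential(1) variables (an illustrative assumption):
# log E[e^{t S_n}] = -sqrt(n)*t - n*log(1 - t/sqrt(n)).
cgf = -sp.sqrt(n) * t - n * sp.log(1 - t / sp.sqrt(n))

s = sp.series(cgf, t, 0, 6).removeO()
cumulants = [sp.simplify(sp.factorial(k) * s.coeff(t, k)) for k in range(1, 6)]
print(cumulants)  # [0, 1, 2/sqrt(n), 6/n, 24/n**(3/2)] -- orders >= 3 vanish as n grows
```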