Some cases when the central limit theorem fails

There are various ways in which the CLT can "fail", depending on which hypotheses are violated. Here's one. Suppose the $X_k$ are independent random variables with means $E[X_k] = \mu_k$ and variances $\sigma_k^2$, and let $s_n^2 = \sum_{k=1}^n \sigma_k^2$ and $S_n = \sum_{k=1}^n (X_k - \mu_k)$. Suppose also that $\max_{k \le n} \sigma_k/s_n \to 0$ as $n \to \infty$ (so in that sense no single $X_k$ is "dominant" in $S_n$). Then Lindeberg's condition is both necessary and sufficient for $S_n/s_n$ to converge in distribution to ${\mathscr N}(0,1)$.
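For reference, Lindeberg's condition in this notation reads: for every $\varepsilon > 0$,
$$\frac{1}{s_n^2}\sum_{k=1}^n E\!\left[(X_k-\mu_k)^2\,\mathbf{1}_{\{|X_k-\mu_k|>\varepsilon s_n\}}\right]\longrightarrow 0 \quad\text{as } n\to\infty.$$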

EDIT: Here's a nice example where the central limit theorem fails. Let $X_n$ be independent with $P(X_n = 2^n) = P(X_n = -2^n) = 2^{-2n-1}$ and $P(X_n = 0) = 1 - 2^{-2n}$. Then $E[X_n] = 0$ and $\sigma_n^2 = 2^{2n} \cdot 2^{-2n} = 1$. But $$P(S_n = 0) \ge P(X_j = 0 \text{ for all } j \le n) = \prod_{j=1}^n \left(1 - 2^{-2j}\right) > 1 - \sum_{j=1}^\infty 2^{-2j} = 2/3,$$ so $S_n/s_n = S_n/\sqrt{n}$ has an atom of mass greater than $2/3$ at $0$ for every $n$ and therefore cannot converge in distribution to ${\mathscr N}(0,1)$.
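A quick way to see this numerically is to simulate the sums. Here is a minimal sketch in Python/NumPy; the cutoff $n = 20$ and the number of samples are arbitrary illustration choices, not part of the example above.

```python
import numpy as np

rng = np.random.default_rng(0)

def sample_S_n(n, num_samples):
    """Draw num_samples realizations of S_n = X_1 + ... + X_n,
    where P(X_k = +/-2^k) = 2^(-2k-1) and P(X_k = 0) = 1 - 2^(-2k)."""
    S = np.zeros(num_samples)
    for k in range(1, n + 1):
        hit = rng.random(num_samples) < 2.0 ** (-2 * k)   # event {X_k != 0}
        signs = rng.choice([-1.0, 1.0], size=num_samples)
        S += np.where(hit, signs * 2.0 ** k, 0.0)
    return S

S = sample_S_n(n=20, num_samples=100_000)
# The empirical mass at 0 stays above 2/3, so S_n / s_n = S_n / sqrt(n)
# cannot be approximately standard normal, no matter how large n is.
print("estimated P(S_n = 0):", np.mean(S == 0.0))
```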


This doesn't directly answer your question about what the quote is saying, but it's a good example to keep in mind of when the CLT can fail.

If $\{X_k\}_{k=1}^{n}$ are independent and identically distributed standard Cauchy random variables, then the sample mean $\left(\displaystyle \sum_{k=1}^n X_k \right)/n$ has the same standard Cauchy distribution for every $n$. This highlights the importance of the finite-variance assumption in the CLT.
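A quick numerical check of this (a minimal sketch in Python/NumPy; the sample sizes are arbitrary illustration choices): the spread of the sample mean, measured here by its interquartile range, stays at the standard Cauchy value of $2$ instead of shrinking like $1/\sqrt{n}$.

```python
import numpy as np

rng = np.random.default_rng(0)

# The sample mean of n standard Cauchy variables is again standard Cauchy,
# so its interquartile range stays at 2 (quartiles at +/- 1) for every n.
for n in (1, 10, 100, 1000):
    means = rng.standard_cauchy(size=(10_000, n)).mean(axis=1)
    q25, q75 = np.percentile(means, [25, 75])
    print(f"n={n:5d}   IQR of the sample mean ~ {q75 - q25:.2f}")
```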


A related heuristic is as follows. Consider an i.i.d. sequence $(X_n)_{n\geqslant1}$ with $\mathrm P(X_n\gt0)\ne0$. For every $n\geqslant2$, call $T_n$ the maximum of the set $\{X_k\mid 1\leqslant k\leqslant n\}$ and $U_n$ its second largest value. (In case of a tie at the maximum, let $U_n=T_n$.)

If $\mathrm P(T_n\leqslant xU_n)\to0$ for every $x\geqslant1$ as $n\to\infty$, that is, if $T_n/U_n\to\infty$ in probability, then the central limit theorem does not hold for the sequence $(X_n)_{n\geqslant1}$.
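To get a feel for this condition, here is a minimal sketch in Python/NumPy; the two distributions and the sample sizes are my own illustration choices, not part of the answer above. For i.i.d. exponential variables, $\log(T_n/U_n)$ stays near $0$, so the maximum never dominates and the CLT applies; for the slowly varying tail $X=\exp(1/U)$ with $U$ uniform on $(0,1)$ (a distribution with infinite mean), $\log(T_n/U_n)$ blows up with $n$, so a single term dominates the whole sum and a Gaussian limit is impossible.

```python
import numpy as np

rng = np.random.default_rng(0)

def log_ratio_top_two(log_sample):
    """log(T_n / U_n), computed from the logs of the sample values
    to avoid floating-point overflow for very heavy tails."""
    top_two = np.sort(log_sample)[-2:]
    return top_two[1] - top_two[0]

for n in (10**2, 10**4, 10**6):
    # Light tail: X exponential, pass log X to the helper.
    light = log_ratio_top_two(np.log(rng.exponential(size=n)))
    # Slowly varying tail: X = exp(1/U), so log X = 1/U.
    heavy = log_ratio_top_two(1.0 / rng.random(n))
    print(f"n={n:>7}  log(T_n/U_n): exponential {light:6.3f}   exp(1/U) {heavy:12.1f}")
```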