Characterization of normal distribution
I am sorry if this question is vague since I am completely unfamiliar with probability theory.
Suppose that we have a family of real-valued random variables $X_n$ (say, all of them have mean 0) on some probability space and we would like to show that $X_n$ converges to Gaussian weakly.
What are the standard/general techniques of showing such convergence to Gaussian distribution?
I am aware of the following:
- Moments. Check that $E[X_n^k]$ converges to the $k$-th moment of the Gaussian for all $k \in \mathbb{N}$ (which, as I understand it, suffices because the Gaussian is determined by its moments).
- Work with characteristic functions instead. This seems to be the method used to prove, for example, the classical central limit theorem (sketched below).
- Stein's method also seems to be common in practice these days, although I am not sure if this is a real workhorse in probability theory.
and I just want to know whether this is how people usually approach such questions in probability theory, or whether there are other general approaches as well.
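For concreteness, the characteristic function route for the classical CLT (with $X_i$ iid, mean $0$ and variance $1$; I realize the setting above is more general) goes, if I understand correctly, roughly like this: writing $S_n = n^{-1/2}\sum_{i=1}^{n} X_i$, independence gives
\begin{equation*}
\mathbb{E}\big[e^{itS_n}\big] = \Big(\mathbb{E}\big[e^{itX_1/\sqrt{n}}\big]\Big)^{n} = \Big(1 - \frac{t^{2}}{2n} + o\big(\tfrac{1}{n}\big)\Big)^{n} \longrightarrow e^{-t^{2}/2},
\end{equation*}
and since $e^{-t^{2}/2}$ is the characteristic function of the standard Gaussian, Lévy's continuity theorem yields the weak convergence.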
Along the same lines, I also want to know about techniques for showing convergence of complex random variables to the complex Gaussian. This can probably be reduced to the real case by separating the real and imaginary parts and checking their covariance, but I would be curious to know if there is some uniform way of viewing this as well.
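Spelling out the reduction I have in mind (and I may well be missing subtleties): writing $Z_n = A_n + i B_n$, weak convergence of $Z_n$ in $\mathbb{C}$ is the same as weak convergence of the vectors $(A_n, B_n)$ in $\mathbb{R}^2$, and by the Cramér–Wold device
\begin{equation*}
(A_n, B_n) \Rightarrow (A, B) \quad\Longleftrightarrow\quad \alpha A_n + \beta B_n \Rightarrow \alpha A + \beta B \ \text{ for all } \alpha, \beta \in \mathbb{R},
\end{equation*}
so one is back to real-valued limits; whether the limit deserves to be called a complex Gaussian then seems to be a question about the covariance structure of $(A, B)$.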
Thank you!
Solution 1:
I think the three methods you listed pretty much cover it.
However, I'd like to stress that Stein's method is, to use your terms, a real workhorse in probability theory, especially in stochastic geometry. There are two advantages to Stein's method (the identity it rests on is sketched after this list):
- It gives rates of convergence. For example, the Berry–Esseen theorem states that if $(X_{i})$ are iid variables with mean $0$, variance $1$ and finite third moment, $S_{n}=n^{-1/2}\sum^{n}_{i=1}X_{i}$, and $N$ has the standard normal distribution, then \begin{equation*} \sup_{x\in\mathbb{R}} \left|\mathbb{P}(S_{n}\leqslant x) - \mathbb{P}(N\leqslant x)\right| \leqslant C\,\mathbb{E}(|X_{1}|^{3})\, n^{-1/2}, \end{equation*} where $C$ is a universal constant. This is rather difficult to prove with characteristic functions (the original papers of Berry and Esseen do it that way), but it is very easy with Stein's method.
- It is very robust, in the sense that it still works pretty well when $X_{n}$ does not arise as a sum of independent variables. This is particularly useful in stochastic geometry. Many examples are given in the following introduction to Stein's method.
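To sketch the identity the method rests on (informally, and glossing over regularity assumptions): a real random variable $Z$ is standard Gaussian if and only if $\mathbb{E}[f'(Z) - Z f(Z)] = 0$ for all sufficiently smooth $f$. Given $t \in \mathbb{R}$, one solves the Stein equation
\begin{equation*}
f_t'(x) - x f_t(x) = \mathbf{1}_{\{x \leqslant t\}} - \mathbb{P}(N \leqslant t),
\end{equation*}
so that for any random variable $W$
\begin{equation*}
\mathbb{P}(W \leqslant t) - \mathbb{P}(N \leqslant t) = \mathbb{E}\big[f_t'(W) - W f_t(W)\big],
\end{equation*}
and one is left with bounding the right-hand side, which involves only $W$; this is how bounds like the Berry–Esseen inequality above come out.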
I still remain rather unhappy with all these methods of proving convergence to the normal law: they all seem to tackle the problem indirectly, and in my opinion they do not help one understand what it is that makes the normal law appear wherever a sum of (more or less) independent variables occurs.
I don't have any satisfying answer for your question regarding the complex Gaussian.