Showing that the sample mean and variance are independent by showing two distributions are the same

Solution 1:

Using the Gram-Schmidt process, construct an orthonormal basis of ${\mathbb R}^n$ containing the unit vector ${\bf a}:=\frac1{\sqrt n}(1,\ldots,1)^T$. Convert this basis into an orthogonal matrix $A$ with $\bf a$ as its last row. That is, we have found an $n\times n$ matrix $A$ with $AA^T=A^TA=I$ and $A_{ni}=\frac1{\sqrt n}$ for each $i$.
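
For concreteness, here is one way such a matrix $A$ could be produced numerically. This is only a sketch, assuming numpy: it pads $\bf a$ with standard basis vectors (an arbitrary choice) and lets `np.linalg.qr` do the Gram-Schmidt-style orthonormalization.

```python
import numpy as np

n = 5  # any sample size works; 5 is just for the demo

# Put a = (1,...,1)/sqrt(n) first, pad with standard basis vectors,
# and let the QR factorization orthonormalize the columns.
a = np.ones(n) / np.sqrt(n)
M = np.column_stack([a, np.eye(n)[:, : n - 1]])   # columns: a, e_1, ..., e_{n-1}
Q, _ = np.linalg.qr(M)                            # orthonormal columns, first is +-a
Q[:, 0] *= np.sign(Q[0, 0])                       # fix the sign so the first column is +a
A = np.roll(Q.T, -1, axis=0)                      # rows of Q^T, cycled so the a-row is last

print(np.allclose(A @ A.T, np.eye(n)))            # True: A is orthogonal
print(np.allclose(A[-1], 1 / np.sqrt(n)))         # True: A_{ni} = 1/sqrt(n) for each i
```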

Pack the given variables $X_1,\ldots,X_n$ into a column vector $X$ and define a new vector of random variables $Z:=(Z_1,\ldots,Z_n)^T$ by $Z=AX$. We observe:

  1. The variables $Z_1,\ldots,Z_n$ are also independent standard normal variables: $Z=AX$ is a linear transformation of the jointly Gaussian vector $X$, so $Z$ is jointly Gaussian with mean vector zero and covariance matrix $$\operatorname{Cov}(Z)=E(ZZ^T)=E(AXX^TA^T)=AE(XX^T)A^T=AA^T=I.$$ Jointly Gaussian variables with identity covariance matrix are independent standard normals.
  2. The final variable $Z_n$ equals $\sqrt n\bar X$, since $$Z_n=(AX)_n=\sum_iA_{ni}X_i=\frac1{\sqrt n}\sum_iX_i=\frac1{\sqrt n}\,n\bar X=\sqrt n\,\bar X.$$
  3. The sum $\sum_{i=1}^nZ_i^2$ equals $\sum_{i=1}^nX_i^2$, since $$\sum_{i=1}^nZ_i^2=Z^TZ=(AX)^T(AX)=X^TA^TAX=X^TX=\sum_{i=1}^nX_i^2.$$
  4. The variable $\sum_{i=1}^{n-1}Z_i^2$ equals $\sum_{i=1}^nZ_i^2-Z_n^2=\sum_{i=1}^nX_i^2-n(\bar X)^2$ by (2) and (3). This last can be rewritten as $\sum_{i=1}^n(X_i-\bar X)^2$, since $\sum_{i=1}^n(X_i-\bar X)^2=\sum_{i=1}^nX_i^2-2\bar X\sum_{i=1}^nX_i+n(\bar X)^2=\sum_{i=1}^nX_i^2-n(\bar X)^2$.

We conclude that the pair $(\sqrt n\bar X,\sum_{i=1}^n(X_i-\bar X)^2)$ is identical to the pair $(Z_n,\sum_{i=1}^{n-1}Z_i^2)$, which by (1) has the same distribution as $(X_n,\sum_{i=1}^{n-1}X_i^2)$. In particular, $\sqrt n\bar X$ and $\sum_{i=1}^n(X_i-\bar X)^2$ are independent, the former standard normal and the latter chi-squared with $n-1$ degrees of freedom; hence the sample mean $\bar X$ and the sample variance $\frac1{n-1}\sum_{i=1}^n(X_i-\bar X)^2$ are independent as well.
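
As a sanity check (not part of the proof), here is a small numerical sketch, assuming numpy and a simulated sample $X$, that rebuilds such an $A$ as in the sketch above and verifies the identities in observations (2)-(4) exactly on that sample:

```python
import numpy as np

# Rebuild an orthogonal A with last row (1,...,1)/sqrt(n); any such A works.
n = 5
a = np.ones(n) / np.sqrt(n)
Q, _ = np.linalg.qr(np.column_stack([a, np.eye(n)[:, : n - 1]]))
Q[:, 0] *= np.sign(Q[0, 0])
A = np.roll(Q.T, -1, axis=0)

X = np.random.default_rng(0).standard_normal(n)   # simulated standard normal sample
Z = A @ X

print(np.isclose(Z[-1], np.sqrt(n) * X.mean()))                  # observation (2)
print(np.isclose(np.sum(Z**2), np.sum(X**2)))                    # observation (3)
print(np.isclose(np.sum(Z[:-1]**2), np.sum((X - X.mean())**2)))  # observation (4)
```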


This result can be generalized: Let $X_1,\ldots,X_n$ be iid, each with standard normal distribution. If $\alpha_1,\ldots,\alpha_n$ are constants with $\sum\alpha_i^2=1$, then $\sum_{i=1}^nX_i^2-\left(\sum_{i=1}^n\alpha_iX_i\right)^2$ has chi-squared distribution with $n-1$ degrees of freedom and is independent of $\sum\alpha_iX_i$, which has standard normal distribution.
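
To see the generalization empirically, here is a hedged Monte Carlo sketch, assuming numpy and an arbitrary illustrative choice of the $\alpha_i$ (and of $n$ and the number of repetitions); it only checks moments and correlation, which is of course weaker than the exact statement above.

```python
import numpy as np

# T = sum X_i^2 - (sum alpha_i X_i)^2 should behave like chi-squared with n-1
# degrees of freedom, and S = sum alpha_i X_i like a standard normal
# uncorrelated with T.
rng = np.random.default_rng(1)
n, reps = 5, 200_000
alpha = np.arange(1.0, n + 1)
alpha /= np.linalg.norm(alpha)          # enforce sum(alpha_i^2) = 1

X = rng.standard_normal((reps, n))
S = X @ alpha
T = (X**2).sum(axis=1) - S**2

print(S.mean(), S.var())                # roughly 0 and 1
print(T.mean(), T.var())                # roughly n-1 = 4 and 2(n-1) = 8
print(np.corrcoef(S, T)[0, 1])          # roughly 0, consistent with independence
```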