Proving independence of $N(\mu,\sigma^2)$ sample mean $\bar X$ and variance $S^2$ by change of variables

Let $X_1,\ldots,X_n$ be an iid sample from $N(\mu,\sigma^2)$. Then $\bar X$ and $S^2$ are independent.

I am stuck on proving the above statement.

The joint PDF of $(X_1,\ldots,X_n)$ is given by

$$f(x_1,\ldots,x_n)=\left(\frac{1}{\sqrt {2\pi\sigma^2}}\right)^n\exp \biggl[-\frac{\sum_{i=1}^{n}(x_i-\mu)^2}{2\sigma^2}\biggr]$$

$$=\left(\frac{1}{\sqrt {2\pi\sigma^2}}\right)^n\exp\biggl[-\frac{1}{2\sigma^2}\biggl\{\sum_{i=1}^{n}(x_i-\bar x_n)^2+n(\bar x_n-\mu)^2\biggr\}\biggr] $$
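(As a quick numerical sanity check, not part of the proof, this decomposition can be verified with a minimal NumPy sketch; the values of $n$, $\mu$, $\sigma$ below are arbitrary.)

```python
import numpy as np

rng = np.random.default_rng(0)
n, mu, sigma = 8, 1.5, 2.0
x = rng.normal(mu, sigma, size=n)
xbar = x.mean()

lhs = np.sum((x - mu) ** 2)
rhs = np.sum((x - xbar) ** 2) + n * (xbar - mu) ** 2
assert np.isclose(lhs, rhs)   # holds for any sample, not just normal data
```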

Now, consider the following transformation

$y_1=\bar x_n$ and $y_i=x_i-\bar x_n$, $i=2,3,\ldots,n$.

Then, since $\sum_{i=1}^{n}(x_i-\bar x_n)=0$, we have $x_1-\bar x_n = -\sum_{i=2}^{n}(x_i-\bar x_n)=-\sum_{i=2}^{n}y_i$.

Thus $\sum_{i=1}^{n}(x_i-\bar x_n)^2=\biggl(\sum_{i=2}^{n}y_i\biggr)^2+\sum_{i=2}^{n}y_i^2$
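(The indexing here is easy to get wrong, so here is another short NumPy check of the last two identities, again with an arbitrary sample.)

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
x = rng.normal(size=n)
xbar = x.mean()
y = x[1:] - xbar                                  # y_2, ..., y_n

assert np.isclose(x[0] - xbar, -y.sum())          # x_1 - xbar = -sum_{i>=2} y_i
assert np.isclose(np.sum((x - xbar) ** 2),        # sum of squared deviations
                  y.sum() ** 2 + np.sum(y ** 2))  # expressed via y_2, ..., y_n
```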

The joint PDF of $(Y_1,\ldots,Y_n)$ is given by $$f(y_1,\ldots,y_n)=|J|\Biggl(\frac{1}{\sqrt {2\pi\sigma^2}}\Biggr)^n \exp\Biggl[-\frac{1}{2\sigma^2}\Biggl\{\Biggl(\sum_{i=2}^{n}y_i\Biggr)^2+\sum_{i=2}^{n}y_i^2+n(y_1-\mu)^2\Biggr\}\Biggr]$$

$$=g(y_2,\ldots,y_n)\,h(y_1),$$

where $|J|$ denotes the absolute value of the Jacobian determinant, $g(y_2,\ldots,y_n)$ is the joint PDF of $Y_2,\ldots,Y_n$, and $h(y_1)$ is the marginal PDF of $Y_1$.

I don't understand how the joint PDF of $y_1,\ldots,y_n$ can be broken into two such parts. I guess $E(Y_1)=\mu$ and $\operatorname{Var}(Y_1)=\sigma^2$, so that $Y_1$ follows $N(\mu,\sigma^2)$, and therefore that the rear part, $\frac{|J|}{\sqrt {2\pi\sigma^2}} \exp\Bigl[-\frac{n(y_1-\mu)^2}{2\sigma^2}\Bigr]$, corresponds to $h$. But I'm not sure, because of the factor of $n$. Further, I don't know how $g$ can be obtained from the front part of the exponential. Please give me a hint!


Actually, $\operatorname{Var}(Y_1)=\operatorname{Var}(\overline X_n)=\sigma^2/n$, which explains the factor of $n$ in the exponent.


If you are trying to prove the independence of $\overline X$ and $S^2$ via a change of variables, I suggest using an orthogonal transformation for ease of calculation. The result can then be proved without explicitly finding the joint pdf of $(\overline X,S^2)$ and showing that it factors as the product of the two marginal pdfs.

Consider the transformation $$(X_1,X_2,\ldots,X_n)\to(Y_1,Y_2,\ldots,Y_n)$$ such that $$\begin{pmatrix}Y_1\\Y_2\\\vdots\\Y_n\end{pmatrix}=Q\begin{pmatrix}X_1\\X_2\\\vdots\\X_n\end{pmatrix}$$

, where $Q$ is an $n\times n$ orthogonal matrix with the first row $$\left(\frac{1}{\sqrt{n}},\frac{1}{\sqrt{n}},\ldots,\frac{1}{\sqrt{n}}\right)$$

Then, $$Y_1=\frac{1}{\sqrt{n}}\sum_{i=1}^n X_i=\sqrt{n}\overline X \quad\text{ and }\quad\sum_{i=1}^n Y_i^2=\sum_{i=1}^n X_i^2$$

Clearly, $$(X_1,X_2,\ldots,X_n)\in\mathbb R^n\implies (Y_1,Y_2,\ldots,Y_n)\in\mathbb R^n $$

The absolute value of the Jacobian determinant is $$|J|=\frac{1}{|\det Q|}=1$$
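For concreteness, one possible way to build such a $Q$ numerically and check the claims above is sketched below. This is a minimal NumPy sketch: the helper name is mine, and any orthonormal completion of the first row would do equally well.

```python
import numpy as np

def orthogonal_with_uniform_first_row(n, rng):
    """Return an orthogonal n x n matrix whose first row is
    (1/sqrt(n), ..., 1/sqrt(n)); any orthonormal completion works."""
    a = rng.normal(size=(n, n))
    a[:, 0] = 1.0 / np.sqrt(n)     # first column = normalized ones vector
    q, _ = np.linalg.qr(a)         # orthonormalize the columns
    if q[0, 0] < 0:                # QR may flip the sign of the first column
        q[:, 0] *= -1
    return q.T                     # rows of q.T are orthonormal

rng = np.random.default_rng(3)
n, mu, sigma = 7, 0.5, 1.3
Q = orthogonal_with_uniform_first_row(n, rng)
x = rng.normal(mu, sigma, size=n)
y = Q @ x

assert np.allclose(Q @ Q.T, np.eye(n))           # Q is orthogonal
assert np.isclose(abs(np.linalg.det(Q)), 1.0)    # so |J| = 1
assert np.isclose(y[0], np.sqrt(n) * x.mean())   # Y_1 = sqrt(n) * Xbar
assert np.isclose(np.sum(y**2), np.sum(x**2))    # sum Y_i^2 = sum X_i^2
```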

Further,

\begin{align} \sum_{i=1}^n (x_i-\mu)^2&=\sum_{i=1}^n x_i^2-2n\bar x\mu+n\mu^2 \\&=\sum_{i=1}^n y_i^2-2\sqrt{n}y_1\mu+n\mu^2 \\&=(y_1-\sqrt{n}\mu)^2+\sum_{i=2}^n y_i^2 \end{align}

So the joint pdf of $(Y_1,Y_2,\ldots,Y_n)$ is of the form

\begin{align} f_{Y_1,\ldots,Y_n}(y_1,\ldots,y_n)&=\frac{1}{(\sigma\sqrt{2\pi})^n}\exp\left[-\frac{1}{2\sigma^2}\left\{(y_1-\sqrt{n}\mu)^2+\sum_{i=2}^n y_i^2\right\}\right] \\&=\frac{1}{\sigma\sqrt{2\pi}}\exp\left[-\frac{1}{2\sigma^2}(y_1-\sqrt{n}\mu)^2\right]\,\prod_{j=2}^n \left\{ \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{y_j^2}{2\sigma^2}\right)\right\} \end{align}

It is now clear that $Y_1,Y_2,\ldots,Y_n$ are independently distributed with

$$Y_1\sim\mathcal N(\sqrt{n}\mu,\sigma^2)\quad \text{ and }\quad Y_j\sim\mathcal N(0,\sigma^2)\,,\quad j=2,3,\ldots,n$$
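This can also be seen numerically. The sketch below (NumPy, arbitrary parameters, same construction of $Q$ as in the previous sketch) checks the quadratic-form identity sample by sample and compares the empirical moments of $Y_1$ and $Y_2$ with the stated distributions.

```python
import numpy as np

rng = np.random.default_rng(6)
n, mu, sigma, reps = 4, 1.0, 2.0, 100_000

# An orthogonal Q with first row (1/sqrt(n), ..., 1/sqrt(n)), built via QR.
a = rng.normal(size=(n, n))
a[:, 0] = 1.0 / np.sqrt(n)
q, _ = np.linalg.qr(a)
if q[0, 0] < 0:
    q[:, 0] *= -1
Q = q.T

x = rng.normal(mu, sigma, size=(reps, n))
y = x @ Q.T                                  # row i holds (Y_1, ..., Y_n) for sample i

# quadratic-form identity from the derivation, checked sample by sample
lhs = np.sum((x - mu) ** 2, axis=1)
rhs = (y[:, 0] - np.sqrt(n) * mu) ** 2 + np.sum(y[:, 1:] ** 2, axis=1)
assert np.allclose(lhs, rhs)

# empirical moments match Y_1 ~ N(sqrt(n) mu, sigma^2), Y_j ~ N(0, sigma^2)
print(y[:, 0].mean(), np.sqrt(n) * mu)       # both close to 2.0
print(y[:, 0].var(), sigma**2)               # both close to 4.0
print(y[:, 1].mean(), y[:, 1].var())         # close to 0.0 and 4.0
print(np.corrcoef(y[:, 0], y[:, 1])[0, 1])   # close to 0 (uncorrelated)
```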

While we get the distribution of $\overline X$ from $Y_1$, we get the distribution of $S^2$ from $Y_2,Y_3,\ldots,Y_n$.

$\overline X$ and $S^2$ are independently distributed precisely because $Y_1$ is independent of $Y_2,\ldots,Y_n$.

Noting that

\begin{align} \sum_{i=2}^n Y_i^2&=\sum_{i=1}^n Y_i^2-Y_1^2 \\&=\sum_{i=1}^n X_i^2-n\overline X^2 \\&=\sum_{i=1}^n (X_i-\overline X)^2 \\&=(n-1)S^2 \end{align}

, we have $$\frac{(n-1)S^2}{\sigma^2}\sim \chi^2_{n-1}$$
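As a Monte Carlo sanity check of this distributional claim (a sketch assuming NumPy and SciPy, with arbitrarily chosen parameter values):

```python
import numpy as np
from scipy import stats

rng = np.random.default_rng(4)
n, mu, sigma, reps = 6, 2.0, 1.5, 100_000

x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)                    # sample variance S^2

# the algebraic identity (n-1) S^2 = sum X_i^2 - n Xbar^2
assert np.allclose((n - 1) * s2, np.sum(x**2, axis=1) - n * xbar**2)

t = (n - 1) * s2 / sigma**2
print(t.mean(), t.var())                      # close to n-1 = 5 and 2(n-1) = 10
print(stats.kstest(t, "chi2", args=(n - 1,))) # KS statistic should be very small
```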

And from $Y_1$ we already had $$\overline X\sim \mathcal N\left(\mu,\frac{\sigma^2}{n}\right)$$
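Finally, a small simulation (NumPy, arbitrary parameters) illustrating both marginal results together with the independence; near-zero sample correlation is of course only a necessary consequence of independence, so this is an illustration rather than a proof.

```python
import numpy as np

rng = np.random.default_rng(5)
n, mu, sigma, reps = 5, 1.0, 2.0, 200_000

x = rng.normal(mu, sigma, size=(reps, n))
xbar = x.mean(axis=1)
s2 = x.var(axis=1, ddof=1)

print(xbar.mean(), mu)                    # Xbar has mean mu
print(xbar.var(), sigma**2 / n)           # ... and variance sigma^2 / n
print(np.corrcoef(xbar, s2)[0, 1])        # close to 0, as independence requires
```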