How can I show that $X$ and $Y$ are independent and find the distribution of $Y$?

Solution 1:

The variable $X$ is the sample mean and the variable $Y$ is $(n-1)$ times the sample variance. For fixed $\sigma^2$, the sample mean is a complete sufficient statistic for $\mu$, while the distribution of $Y$ does not depend on $\mu$ (it is ancillary). So Basu's theorem implies that they are independent.

The distribution of $Y/\sigma^2$ is $\chi^2_{n-1}$ (so $Y$ itself is $\chi^2_{n-1}$ when $\sigma^2=1$). Although $Y$ is the sum of the squares of the $n$ normal random variables $X_i-X$, these are not independent: they satisfy the linear constraint $\sum_{i=1}^n (X_i-X)=0$, which is why there are $n-1$ degrees of freedom instead of $n$. Solution 2 makes this precise by rewriting $Y$ as a sum of $n-1$ squares of independent normals.
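As a sanity check (not part of the proof), a quick Monte Carlo with NumPy confirms the $\chi^2_{n-1}$ claim for $\sigma=1$: the simulated $Y$ should have mean $n-1$ and variance $2(n-1)$. The sample size, seed, and $n$ below are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000

# trials independent samples of size n from N(0, 1)
x = rng.standard_normal((trials, n))

# Y = sum_i (X_i - Xbar)^2 = (n-1) * sample variance
y = (n - 1) * x.var(axis=1, ddof=1)

# chi^2_{n-1} has mean n-1 and variance 2(n-1)
print(y.mean())  # should be close to 4
print(y.var())   # should be close to 8
```

The moments match the $\chi^2_{n-1}$ values up to Monte Carlo error.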

Solution 2:

${\bf X}=(X_1,\dots, X_n)^\prime$ has a multivariate normal distribution with $\mu_{\bf X}=\mu {\bf 1}$ and $\Sigma_{\bf X}=\sigma^2 I$. Here ${\bf 1}$ is the column vector of all $1$s, while $I$ is the $n\times n$ identity matrix.

Let ${\bf e}_1=(1,0,0,\dots,0)^\prime $, and let $A$ be the matrix of an orthogonal transformation that takes the vector $\bf 1$ into the vector $\sqrt{n}\, {\bf e}_1$. Such a matrix exists: extend ${1\over\sqrt n}\,{\bf 1}$ to an orthonormal basis of $\mathbb R^n$ and take its vectors as the rows of $A$.
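One concrete way to build such an $A$ (an assumption of this sketch, not the only construction) is to QR-factor a basis whose first vector is $\bf 1$; the first column of $Q$ is then $\pm{1\over\sqrt n}\,{\bf 1}$, and $A=Q^\prime$ after a sign fix:

```python
import numpy as np

n = 6
ones = np.ones(n)
e1 = np.eye(n)[0]

# QR-factor a full-rank matrix whose first column is 1;
# the first column of Q is then +/- (1/sqrt(n)) * 1.
M = np.column_stack([ones, np.eye(n)[:, 1:]])
Q, _ = np.linalg.qr(M)
if Q[0, 0] < 0:      # fix the sign so that A @ 1 = +sqrt(n) * e1
    Q[:, 0] *= -1
A = Q.T              # rows of A are orthonormal; first row is (1/sqrt(n)) * 1'

print(np.allclose(A @ A.T, np.eye(n)))         # A is orthogonal
print(np.allclose(A @ ones, np.sqrt(n) * e1))  # A takes 1 to sqrt(n) e1
```

Any orthogonal $A$ with this first row works equally well; the Helmert matrix is another standard choice.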

The vector ${\bf U}=A{\bf X}$ is multivariate normal with $\mu_{\bf U}=A\mu{\bf 1}=\mu \sqrt{n}\, {\bf e}_1 $ and $\Sigma_{\bf U}=A(\sigma^2 I)A^\prime=\sigma^2 I$. In particular, the random variables $U_1,U_2,\dots, U_n$ are independent.

The first coordinate of the random vector $\bf U$ is $$U_1=(A{\bf X})^\prime{\bf e}_1={\bf X}^\prime A^\prime {\bf e}_1= {1\over \sqrt{n}}\, {\bf X}^\prime A^\prime A{\bf 1} = {1\over \sqrt{n}}\, {\bf X}^\prime {\bf 1}=\sqrt{n}\,X.$$

Also, $$\sum_{i=1}^n X^2_i={\bf X}^\prime {\bf X}= {\bf X}^\prime A^\prime A {\bf X}={\bf U}^\prime{\bf U} =n X^2+\sum_{i=2}^n U_i^2,$$ so that $$\sum_{i=1}^n (X_i-X)^2 =\sum_{i=1}^n X^2_i-nX^2=\sum_{i=2}^n U_i^2.$$
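The two identities above hold pointwise for any sample, so they can be verified numerically on a single random draw. The sketch below rebuilds an orthogonal $A$ by QR (an assumed construction; the proof only needs $A{\bf 1}=\sqrt{n}\,{\bf e}_1$) and checks both $U_1=\sqrt{n}\,X$ and the sum-of-squares identity:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
ones = np.ones(n)

# Orthogonal A with A @ 1 = sqrt(n) * e1, built by QR of a basis
# whose first vector is 1 (sign-fixed so the image is +sqrt(n) e1).
Q, _ = np.linalg.qr(np.column_stack([ones, np.eye(n)[:, 1:]]))
if Q[0, 0] < 0:
    Q[:, 0] *= -1
A = Q.T

x = rng.normal(loc=2.0, scale=3.0, size=n)   # arbitrary mu and sigma
u = A @ x

# U_1 = sqrt(n) * (sample mean of X)
print(np.isclose(u[0], np.sqrt(n) * x.mean()))
# sum_i (X_i - Xbar)^2 = sum_{i>=2} U_i^2
print(np.isclose(((x - x.mean()) ** 2).sum(), (u[1:] ** 2).sum()))
```

Both identities are algebraic, so they hold to floating-point precision, not just in distribution.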

The independence of $U_1$ and $\sum_{i=2}^n U_i^2$ implies the independence of $X$ and $\sum_{i=1}^n (X_i-X)^2$. Moreover, $U_2,\dots,U_n$ are iid $N(0,\sigma^2)$, so $\sum_{i=2}^n U_i^2/\sigma^2\sim\chi^2_{n-1}$, which recovers the distribution of $Y$.
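As an empirical illustration (a necessary condition only, since zero correlation does not by itself prove independence), the sample correlation between the mean and the sum of squared deviations should vanish up to Monte Carlo error; the parameters below are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(2)
n, trials = 5, 100_000

# trials independent samples of size n from N(1, 2^2)
x = rng.normal(1.0, 2.0, size=(trials, n))
xbar = x.mean(axis=1)
ss = ((x - xbar[:, None]) ** 2).sum(axis=1)   # sum_i (X_i - Xbar)^2

# Independence implies zero correlation between Xbar and the sum of squares.
print(abs(np.corrcoef(xbar, ss)[0, 1]))        # should be close to 0
```

With $10^5$ trials the sample correlation has standard error about $1/\sqrt{10^5}\approx 0.003$, so values of a few thousandths are expected.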