Question on independence of multivariate Gaussian under orthogonal projections onto subspaces

Consider a multivariate Gaussian random vector $X:=(X_{1},\dots,X_{n})$, i.e. $X \sim \mathcal{N}_{n}(\mu, \Sigma).$

Now let $E_{1} \subseteq \mathbb{R}^{n}$ be a linear subspace, and let $P_{E_{1}}$ and $P_{E_{1}^{\perp}}$ denote the orthogonal projections onto $E_{1}$ and its orthogonal complement $E_{1}^{\perp}$, respectively.

Is it true that $P_{E_{1}}X$ and $P_{E_{1}^{\perp}}X$ are independent?

My idea:

Consider the stacked random vector $$\overline{P}:=\begin{pmatrix} P_{E_{1}}X\\ P_{E_{1}^{\perp}}X\end{pmatrix}=\begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}\cdot X.$$

As a linear transformation of a multivariate Gaussian vector, it is again multivariate Gaussian, and its distribution is

$$ \overline{P} \sim \mathcal{N}\left(\begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}\cdot \mu, \begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}\cdot \Sigma\cdot \begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}^{T}\right),$$

and to decide independence, we only need to look at the covariance matrix

$$\begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}\cdot \Sigma\cdot \begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}^{T}=\begin{pmatrix} P_{E_{1}}\\ P_{E_{1}^{\perp}}\end{pmatrix}\cdot \Sigma\cdot \begin{pmatrix} P_{E_{1}}^{T}& P_{E_{1}^{\perp}}^{T}\end{pmatrix}$$

But I am not sure whether this expression helps me establish independence. Any ideas?
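To make this concrete, here is a small numerical sanity check (a sketch using numpy; the specific $\mu$, $\Sigma$ and the two-dimensional subspace $E_{1}$ below are arbitrary choices for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Arbitrary mean and positive definite covariance matrix.
A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)
mu = rng.standard_normal(n)

# Arbitrary 2-dimensional subspace E_1 via an orthonormal basis Q.
Q, _ = np.linalg.qr(rng.standard_normal((n, 2)))
P1 = Q @ Q.T          # orthogonal projection onto E_1
P1c = np.eye(n) - P1  # orthogonal projection onto E_1^perp

# Monte Carlo estimate of cov(P1 X, P1c X).
X = rng.multivariate_normal(mu, Sigma, size=200_000)
U = X @ P1.T - P1 @ mu
V = X @ P1c.T - P1c @ mu
emp = U.T @ V / len(X)

# The estimate matches P1 @ Sigma @ P1c up to sampling error,
# and is generally nonzero.
print(np.round(emp, 3))
print(np.round(P1 @ Sigma @ P1c, 3))
```

For this generic $\Sigma$ the cross-covariance block is nonzero, so independence apparently cannot hold in general.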


If $X$ has a standard normal distribution, then $(X,X)$ has a (degenerate) bivariate normal distribution. Its projections onto the coordinate axes are both equal to $X$, so they are not independent.
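(A quick numpy check of this counterexample, for the record:)

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.standard_normal(100_000)
XX = np.stack([x, x], axis=1)  # samples of (X, X)

U = XX * np.array([1.0, 0.0])  # projection onto first axis:  (X, 0)
V = XX * np.array([0.0, 1.0])  # projection onto second axis: (0, X)

# Cross-covariance (the means are zero): the (0, 1) entry estimates
# E[X^2] = 1, so the two projections are correlated, not independent.
print(np.round(U.T @ V / len(x), 3))
```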


I wonder why you call it $E_1$ rather than just $E$ when it's not distinguished from anything called $E_2.$

Jointly Gaussian random vectors are independent if they are uncorrelated. (And of course not otherwise: correlated random vectors are never independent, and that direction holds regardless of whether they are Gaussian and regardless of whether they are jointly Gaussian.)

If $U\in\mathbb R^k$ and $V\in\mathbb R^\ell$ are random vectors with expected values $\mu\in\mathbb R^k$ and $\nu\in\mathbb R^\ell,$ then we define $$ \operatorname{cov}(U,V) = \operatorname E\left( (U-\mu)(V-\nu)^\top \right) \in \mathbb R^{k\times\ell}. $$ That this covariance is $0$ is what it means to say $U,V$ are uncorrelated.
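In code, this definition is just the sample version (a numpy sketch with sample-mean centering):

```python
import numpy as np

def cross_cov(U, V):
    """Empirical cov(U, V): U has shape (m, k), V has shape (m, l)."""
    return (U - U.mean(axis=0)).T @ (V - V.mean(axis=0)) / len(U)
```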

Next we have $$ \operatorname{cov}\left( P_{E_1}X, P_{E_1^\bot}X \right) = P_{E_1} \Big( \operatorname{cov}(X,X)\Big) P_{E_1^\bot}^\top = P_{E_1}\Sigma P_{E_1^\bot}^\top = P_{E_1}\Sigma P_{E_1^\bot} $$ (where the last equality is because orthogonal projection matrices are symmetric).
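Multiplying out the block product from the question shows where this cross term sits: $$ \begin{pmatrix} P_{E_1}\\ P_{E_1^\bot}\end{pmatrix} \Sigma \begin{pmatrix} P_{E_1}^\top & P_{E_1^\bot}^\top\end{pmatrix} = \begin{pmatrix} P_{E_1}\Sigma P_{E_1}^\top & P_{E_1}\Sigma P_{E_1^\bot}^\top\\ P_{E_1^\bot}\Sigma P_{E_1}^\top & P_{E_1^\bot}\Sigma P_{E_1^\bot}^\top\end{pmatrix}, $$ so independence of the two projections is exactly the vanishing of the off-diagonal blocks (each of which is the transpose of the other).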

That last matrix is zero if and only if $\Sigma$ maps $E_1^\bot$ into $E_1^\bot,$ and hence (since $\Sigma$ is symmetric) also maps $E_1$ into $E_1.$ In other words, the two projections are independent exactly when $E_1$ is an invariant subspace of $\Sigma,$ which for a symmetric matrix means $E_1$ is spanned by eigenvectors of $\Sigma.$
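A numerical illustration of this criterion (numpy; $\Sigma$ and the subspaces below are arbitrary examples): taking $E_1$ spanned by eigenvectors of $\Sigma$ makes the cross term vanish, while a generic subspace does not.

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4

A = rng.standard_normal((n, n))
Sigma = A @ A.T + n * np.eye(n)  # arbitrary positive definite Sigma

def cross_term(Q):
    """P_{E_1} Sigma P_{E_1^perp} for E_1 = column span of Q (orthonormal)."""
    P1 = Q @ Q.T
    return P1 @ Sigma @ (np.eye(n) - P1)

# E_1 spanned by two eigenvectors of Sigma: cross term is numerically zero.
_, vecs = np.linalg.eigh(Sigma)
print(np.max(np.abs(cross_term(vecs[:, :2]))))  # ~1e-15

# Generic 2-dimensional subspace: cross term is nonzero in general.
Q, _ = np.linalg.qr(rng.standard_normal((n, 2)))
print(np.max(np.abs(cross_term(Q))))            # typically far from 0
```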