When are linear combinations of independent random variables still independent?
Solution 1:
If $A = a_1 \oplus a_2 \oplus \cdots \oplus a_m$, for $m \leq n$, where $a_i$ are row vectors of dimension $n_i$ such that $\sum_{i=1}^m n_i = n$ and $\oplus$ denotes the direct sum, then the random vector $Y$ has independent coordinates.
This is not hard to see since $Y_1$ is measurable with respect to $\sigma(X_1, \ldots, X_{n_1})$, $Y_2$ is measurable with respect to $\sigma(X_{n_1+1}, \ldots, X_{n_1+n_2})$, etc., and these $\sigma$-algebras are independent since the $X_i$ are independent (essentially, by definition).
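For instance, with $n = 4$, $m = 2$ and $n_1 = n_2 = 2$, a matrix of this form is
$$A = \begin{pmatrix} a_{11} & a_{12} & 0 & 0 \\ 0 & 0 & a_{23} & a_{24} \end{pmatrix},$$
so that $Y_1 = a_{11} X_1 + a_{12} X_2$ involves only $(X_1, X_2)$ while $Y_2 = a_{23} X_3 + a_{24} X_4$ involves only $(X_3, X_4)$.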
Obviously, this result still holds if we consider matrices that are column permutations of the matrix $A$ described above. Indeed, as we see below, when each $X_i$ is non-normal (its distribution may depend on the index $i$), this is essentially the only form that $A$ can take for the desired result to hold.
In the normal case, the coordinates of $Y$ are independent as long as $A \Sigma A^T$ is diagonal, where $\Sigma = \operatorname{diag}(\sigma_1^2, \ldots, \sigma_n^2)$ is the covariance matrix of $X$; in particular, $A A^T = D$ for a diagonal matrix $D$ suffices when the $X_i$ share a common variance. This is easily checked with the moment-generating function.
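A sketch of that check, assuming the $X_i$ are centered (nonzero means only shift $Y$ by a constant):
$$M_Y(t) = \mathbb{E}\, e^{t^T Y} = \mathbb{E}\, e^{(A^T t)^T X} = \exp\!\Big(\tfrac{1}{2}\, t^T A \Sigma A^T t\Big),$$
which factors as $\prod_k \exp\big(\tfrac{1}{2} (A \Sigma A^T)_{kk}\, t_k^2\big)$, a product of the marginal moment-generating functions, exactly when $A \Sigma A^T$ is diagonal.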
Suppose $X_1$ and $X_2$ are iid with finite variance. If $X_1 + X_2$ is independent of $X_1 - X_2$, then $X_1$ and $X_2$ are normally distributed random variables. See here. This result is known as Bernstein's theorem and can be generalized (see below). A proof can be found in Feller or here (Chapter 5).
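As a quick numerical illustration (a sketch, not part of the proof): for a skewed non-normal distribution such as the exponential, the sum and the difference are visibly dependent, which can be detected, e.g., through the correlation between $X_1 + X_2$ and $(X_1 - X_2)^2$; for normal samples this correlation is zero up to Monte Carlo error.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 200_000

def sum_diff_dependence(x1, x2):
    """Correlation between X1 + X2 and (X1 - X2)^2.

    A nonzero value shows that the sum and the difference are
    dependent (the converse does not hold in general)."""
    s, d = x1 + x2, x1 - x2
    return np.corrcoef(s, d**2)[0, 1]

# Normal samples: sum and difference are independent (Bernstein's theorem),
# so the correlation is ~0 up to sampling error.
print(sum_diff_dependence(rng.normal(size=n), rng.normal(size=n)))

# Exponential samples (skewed, non-normal): the correlation is clearly
# nonzero, so X1 + X2 and X1 - X2 cannot be independent.
print(sum_diff_dependence(rng.exponential(size=n), rng.exponential(size=n)))
```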
In the case where $A$ cannot be written as a direct sum of row vectors, you can always cook up a distribution for $X$ such that $Y$ does not have independent coordinates. Indeed, we have
Theorem (Lukacs and King, 1954): Let $X_1, X_2, \ldots, X_n$ be $n$ independently (but not necessarily identically) distributed random variables with variances $\sigma_i^2$, and assume that the $n$th moment of each $X_i$ $(i = 1, 2, \ldots, n)$ exists. The necessary and sufficient conditions for the two linear forms $Y_1 = \sum^n_{i=1} a_i X_i$ and $Y_2 = \sum^n_{i=1} b_i X_i$ to be statistically independent are
- Each random variable which has a nonzero coefficient in both forms is normally distributed, and
- $\sum^n_{i=1} a_i b_i \sigma^2_i = 0$.
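For example, take $n = 2$, $Y_1 = X_1 + X_2$ and $Y_2 = X_1 - X_2$. Both variables have nonzero coefficients in both forms, so the first condition forces $X_1$ and $X_2$ to be normal, and the second condition reads $\sigma_1^2 - \sigma_2^2 = 0$, i.e. equal variances; in the iid setting this recovers Bernstein's theorem above.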
Solution 2:
The Skitovich-Darmois theorem (Skitovich (1953), Darmois (1953); see also A. Kagan, Yu. Linnik, and C. R. Rao (1973, Ch. 3)). Let $\xi_j$, where $j=1, 2,\dots, n$ and $n\geq 2$, be independent random variables. Let $\alpha_j, \beta_j$ be nonzero constants. If the linear statistics $L_1=\alpha_1\xi_1+\cdots+\alpha_n\xi_n$ and $L_2=\beta_1\xi_1+\cdots+\beta_n\xi_n$ are independent, then all random variables $\xi_j$ are Gaussian.
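A numerical sketch of the contrapositive, under assumed choices of distribution and coefficients (not part of the theorem): take $n = 3$, non-Gaussian $\xi_j$ (say, uniform) and coefficients with $\sum_j \alpha_j \beta_j = 0$, so that $L_1$ and $L_2$ are uncorrelated; they are nevertheless dependent, which shows up, e.g., in the correlation between $L_1^2$ and $L_2^2$. With Gaussian $\xi_j$ the same coefficients give independent $L_1, L_2$ and that correlation vanishes.

```python
import numpy as np

rng = np.random.default_rng(1)
n_samples = 200_000
alpha = np.array([1.0, 1.0, 1.0])
beta = np.array([1.0, 1.0, -2.0])   # chosen so that sum(alpha * beta) = 0

def squared_correlation(xi):
    """Correlation between L1^2 and L2^2 for L1 = xi @ alpha, L2 = xi @ beta.

    Since the xi_j here have equal variances and sum(alpha * beta) = 0,
    L1 and L2 are always uncorrelated; a nonzero value exposes dependence."""
    L1, L2 = xi @ alpha, xi @ beta
    return np.corrcoef(L1**2, L2**2)[0, 1]

# Gaussian xi: L1 and L2 are independent, so the correlation is ~0.
print(squared_correlation(rng.normal(size=(n_samples, 3))))

# Uniform (non-Gaussian) xi: L1 and L2 are uncorrelated but dependent,
# and the correlation of the squares is clearly negative.
print(squared_correlation(rng.uniform(-1, 1, size=(n_samples, 3))))
```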