Are functions of independent variables also independent?

Yes, they are independent.

If you are taking a rigorous probability course with sigma-algebras, then you may prove it by noticing that the sigma-algebra generated by $f_{1}(X_{1})$ is contained in the sigma-algebra generated by $X_{1}$, where $f_{1}$ is a Borel-measurable function: for any Borel set $B$, $\{f_{1}(X_{1}) \in B\} = \{X_{1} \in f_{1}^{-1}(B)\} \in \sigma(X_{1})$. Independence of $\sigma(X_{1})$ and $\sigma(X_{2})$ then passes down to these sub-sigma-algebras.

If you are taking an introductory course, then just remark that this theorem is consistent with our intuition: if $X_{1}$ contains no information about $X_{2}$, then $f_{1}(X_{1})$ contains no information about $f_{2}(X_{2})$.
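For instance, here is a quick Monte Carlo sanity check of that intuition (a minimal sketch; the uniform/normal inputs and the choices $f_1 = \sin$, $f_2 = \exp$ are mine, not from the question):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Independent inputs (any independent pair would do).
x1 = rng.uniform(0, 2 * np.pi, n)
x2 = rng.normal(0, 1, n)

# Arbitrary Borel-measurable functions of each variable.
y1 = np.sin(x1)
y2 = np.exp(x2)

# Events about Y1 and Y2; independence predicts the joint
# frequency matches the product of the marginal frequencies.
a = y1 > 0.5
b = y2 < 1.0

print((a & b).mean())        # empirical P(Y1 > 0.5, Y2 < 1)
print(a.mean() * b.mean())   # empirical P(Y1 > 0.5) * P(Y2 < 1)
```

With $10^6$ samples the two printed numbers agree to a few decimal places, as independence predicts.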


Write $Y_i = f_i(X_i)$, $i=1,2$. For any two (measurable) sets $A_i$, $Y_i \in A_i$ if and only if $X_i \in B_i$, where $B_i$ are the sets { $s : f_i (s) \in A_i$ }. Hence, since the $X_i$ are independent, ${\rm P}(Y_1 \in A_1 , Y_2 \in A_2) = {\rm P}(X_1 \in B_1 , X_2 \in B_2) = {\rm P}(X_1 \in B_1)\,{\rm P}(X_2 \in B_2) = {\rm P}(Y_1 \in A_1)\, {\rm P}(Y_2 \in A_2)$. Thus, the $Y_i$ are independent (which is intuitively clear anyway). [We have used here that random variables $Z_i$, $i=1,2$, are independent if and only if ${\rm P}(Z_1 \in C_1 , Z_2 \in C_2) = {\rm P}(Z_1 \in C_1)\, {\rm P}(Z_2 \in C_2)$ for any two measurable sets $C_i$.]
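To make the preimage step concrete, here is a tiny sketch over a finite sample space (the die roll, the function $f$, and the set $A$ are arbitrary choices for illustration):

```python
# Finite illustration of the key identity {Y in A} <=> {X in B},
# where B = {s : f(s) in A}. The setup (die roll, f, A) is arbitrary.
outcomes = range(1, 7)                    # sample space of one die
f = lambda s: s % 3                       # some function of the outcome
A = {0, 1}                                # a target set for Y = f(X)
B = {s for s in outcomes if f(s) in A}    # the preimage f^{-1}(A)

# For every outcome, the two events agree.
assert all((f(s) in A) == (s in B) for s in outcomes)
print(sorted(B))   # [1, 3, 4, 6]
```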


Yes, they are independent.

The previous answers are sufficient and rigorous. On the other hand, the argument can be restated as follows. Assume the random variables are discrete.

For any values $y_1$ and $y_2$ of $Y_1 = f_1(X_1)$ and $Y_2 = f_2(X_2)$, let $A_i = f_i^{-1}(y_i)$. Then

$\Pr[Y_1 = y_1 \wedge Y_2 = y_2] = \Pr[X_1 \in f_1^{-1}(y_1)\wedge X_2\in f_2^{-1}(y_2)] = \Pr[X_1 \in A_1 \wedge X_2 \in A_2]$

and we expand it via the joint probability mass function:

$ = \sum_{x_1 \in A_1,\, x_2 \in A_2}\Pr(x_1, x_2) = \sum_{x_1 \in A_1,\, x_2 \in A_2}\Pr(x_1)\Pr(x_2) $

Here we use the independence of $X_1$ and $X_2$; the double sum then factors:

$= \sum_{x_1 \in A_1}\Pr(x_1)\cdot \sum_{x_2 \in A_2} \Pr(x_2) = \Pr[X_1\in f_1^{-1}(y_1)]\cdot \Pr[X_2 \in f_2^{-1}(y_2)] = \Pr[Y_1 = y_1]\Pr[Y_2 = y_2] $

This shows that functions of independent random variables are again independent.
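The calculation above can also be checked by direct enumeration; below is a small sketch (the pmfs and the functions $f_1(x) = x^2$, $f_2(x) = 1 - x$ are arbitrary choices of mine, not from the answer):

```python
from itertools import product

# Arbitrary independent discrete variables (value -> probability).
pmf_x1 = {-1: 0.3, 0: 0.2, 1: 0.5}
pmf_x2 = {0: 0.6, 1: 0.4}
f1 = lambda x: x * x     # Y1 = X1^2
f2 = lambda x: 1 - x     # Y2 = 1 - X2

# Joint pmf of (Y1, Y2), using independence of X1 and X2.
joint = {}
for (x1, p1), (x2, p2) in product(pmf_x1.items(), pmf_x2.items()):
    key = (f1(x1), f2(x2))
    joint[key] = joint.get(key, 0.0) + p1 * p2

# Marginal pmfs of Y1 and Y2.
pmf_y1, pmf_y2 = {}, {}
for (y1, y2), p in joint.items():
    pmf_y1[y1] = pmf_y1.get(y1, 0.0) + p
    pmf_y2[y2] = pmf_y2.get(y2, 0.0) + p

# Factorization holds: P(Y1=y1, Y2=y2) = P(Y1=y1) P(Y2=y2).
for (y1, y2), p in joint.items():
    assert abs(p - pmf_y1[y1] * pmf_y2[y2]) < 1e-12
print("joint pmf factorizes")
```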


I'll add another proof here, the continuous analog of Fang-Yi Yu's proof:

Assume $Y_1 = g(X_1)$ and $Y_2 = h(X_2)$ are continuous random variables. For real numbers $y_1$ and $y_2$, we can define:

$S_{y_1} = \{x_1 : g(x_1)\le y_1\}$ and

$S_{y_2} = \{x_2 : h(x_2)\le y_2\}$.

We can then write the joint cumulative distribution function of $Y_1$ and $Y_2$ as:

\begin{eqnarray*} F_{Y_{1},Y_{2}}(y_{1},y_{2}) & = & P(Y_{1}\le y_{1},Y_{2}\le y_{2})\\ & = & P(X_{1}\in S_{y_{1}},X_{2}\in S_{y_{2}})\\ & = & P(X_{1}\in S_{y_{1}})P(X_{2}\in S_{y_{2}}) \end{eqnarray*}

where the last equality uses the independence of $X_1$ and $X_2$, applied to the measurable sets $S_{y_1}$ and $S_{y_2}$.

Then the joint probability density function of $Y_{1}$ and $Y_{2}$ is given by:

\begin{eqnarray*} f_{Y_{1},Y_{2}}(y_{1},y_{2}) & = & \frac{\partial^{2}}{\partial y_{1}\partial y_{2}}F_{Y_{1},Y_{2}}(y_{1},y_{2})\\ & = & \frac{d}{dy_{1}}P(X_{1}\in S_{y_{1}})\frac{d}{dy_{2}}P(X_{2}\in S_{y_{2}}) \end{eqnarray*}

Since the first factor is a function only of $y_{1}$ and the second is a function only of $y_{2}$, we know $Y_{1}$ and $Y_{2}$ are independent (recall that random variables $U$ and $V$ are independent if and only if there exist functions $g_{U}(u)$ and $h_{V}(v)$ such that $f_{U,V}(u,v)=g_{U}(u)h_{V}(v)$ for every real $u$ and $v$).
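As a numerical companion to this answer, here is a simulation sketch of the CDF factorization above (assuming standard normal inputs and my own choices $g(x) = x^2$, $h(x) = \tanh x$):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Independent continuous inputs and a function of each.
x1 = rng.normal(size=n)
x2 = rng.normal(size=n)
y1 = x1 ** 2          # Y1 = g(X1)
y2 = np.tanh(x2)      # Y2 = h(X2)

# Check F_{Y1,Y2}(a, b) ~= F_{Y1}(a) * F_{Y2}(b) at a few points.
for a, b in [(0.5, 0.0), (1.0, 0.5), (2.0, -0.3)]:
    joint = np.mean((y1 <= a) & (y2 <= b))
    prod = np.mean(y1 <= a) * np.mean(y2 <= b)
    print(f"F({a},{b}): joint={joint:.4f}  product={prod:.4f}")
```

Up to Monte Carlo error, the empirical joint CDF matches the product of the empirical marginals at every test point.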