Can two random variables $X,Y$ be dependent and such that $E(XY)=E(X)E(Y)$?

Can independence of two random variables be defined by this "product rule", or are there counterexamples?


Solution 1:

Independence implies uncorrelatedness

We say that two random variables are uncorrelated if $E[XY] = E[X]E[Y]$.

If $X$ and $Y$ are independent and discrete (the argument extends to continuous random variables, with sums replaced by integrals), then

$$p_{X,Y}(x,y) = p_X(x)p_Y(y)\qquad \forall (x,y)$$

Substituting this into the definition of $E[XY]$ and rearranging terms, we have

\begin{align} E[XY] &= \sum_x\sum_y xy\,p_{X,Y}(x,y)\\ &= \sum_x\sum_y xy\,p_X(x)p_Y(y)\\ &= \left(\sum_x x\,p_X(x)\right)\left(\sum_y y\,p_Y(y)\right)\\ &= E[X]E[Y] \end{align}

Therefore, if $X$ and $Y$ are independent, then they are uncorrelated.
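As a quick sanity check (not part of the proof; the two fair dice below are just an illustrative choice), this computation can be reproduced exactly with rational arithmetic in Python:

```python
from fractions import Fraction
from itertools import product

# Two independent fair dice: p_{X,Y}(x, y) = p_X(x) * p_Y(y)
support = range(1, 7)
p = Fraction(1, 6)  # probability of each face

# E[XY] computed from the product joint pmf
E_XY = sum(x * y * p * p for x, y in product(support, support))

# E[X] and E[Y] computed from the marginals
E_X = sum(x * p for x in support)
E_Y = sum(y * p for y in support)

assert E_XY == E_X * E_Y  # 49/4 == 7/2 * 7/2
```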


Uncorrelatedness does not imply independence (in general)

In general$^1$, it is false that if $X$ and $Y$ are uncorrelated, then they are independent. Let's see a counterexample. Let the pair $(X,Y)$ take each of the four values $(0,1)$, $(1,0)$, $(-1,0)$, $(0,-1)$ with probability $0.25$. Then $XY = 0$ with probability $1$, and therefore $E[XY] = 0$.

On the other hand, the marginal pmfs are

$\displaystyle p_X(x) = \left\{ \begin{array}{ll} \frac{1}{4} & x=-1\\ \frac{1}{2} & x=0\\ \frac{1}{4} & x=1\\ \end{array} \right. $ $\hspace{1cm}$ and $\hspace{1cm}$ $p_Y(y) = \left\{ \begin{array}{ll} \frac{1}{4} & y=-1\\ \frac{1}{2} & y=0\\ \frac{1}{4} & y=1\\ \end{array} \right. $

which implies that $E[X]=E[Y]=0$. Hence $E[XY] = E[X]E[Y]$, and the random variables are uncorrelated.

However,

$$p_X(1)p_Y(1) = \frac{1}{4}\frac{1}{4} \neq p_{X,Y}(1,1) = 0$$

which shows that $X$ and $Y$ are not independent.
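For completeness, the whole counterexample can be verified by enumerating the four equally likely outcomes (a small Python sketch; the numbers are exactly those used above):

```python
from fractions import Fraction

# Joint distribution of (X, Y): four points, each with probability 1/4
outcomes = [(0, 1), (1, 0), (-1, 0), (0, -1)]
p = Fraction(1, 4)

E_X = sum(x * p for x, _ in outcomes)       # 0
E_Y = sum(y * p for _, y in outcomes)       # 0
E_XY = sum(x * y * p for x, y in outcomes)  # 0, since XY = 0 on every outcome

print(E_XY == E_X * E_Y)  # True: uncorrelated

# But the joint pmf does not factor at (1, 1):
P_X1 = sum(p for x, _ in outcomes if x == 1)                # 1/4
P_Y1 = sum(p for _, y in outcomes if y == 1)                # 1/4
P_X1_Y1 = sum(p for x, y in outcomes if x == 1 and y == 1)  # 0
print(P_X1_Y1 == P_X1 * P_Y1)  # False: not independent
```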

We conclude that independence is a sufficient but not a necessary condition for uncorrelatedness.


$^1$ The only two exceptions I am aware of, where uncorrelatedness does imply independence, are jointly Gaussian random variables and pairs of Bernoulli random variables. Are there any others?
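For the Bernoulli exception mentioned above, here is a short sketch of why uncorrelatedness forces independence (my own filling-in of the claim; the footnote only states it). Let $X\sim\mathrm{Bernoulli}(p)$ and $Y\sim\mathrm{Bernoulli}(q)$. Since $XY=1$ exactly when $X=Y=1$, uncorrelatedness gives

$$P(X=1,Y=1) = E[XY] = E[X]E[Y] = pq.$$

The remaining three cells of the joint pmf then factor as well:

\begin{align} P(X=1,Y=0) &= p - pq = p(1-q)\\ P(X=0,Y=1) &= q - pq = (1-p)q\\ P(X=0,Y=0) &= 1-p-q+pq = (1-p)(1-q) \end{align}

so $p_{X,Y}(x,y) = p_X(x)p_Y(y)$ everywhere, and $X$ and $Y$ are independent.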