Is there a *simple* example showing that uncorrelated random variables need not be independent?
Is there a simple example showing that if $X,Y$ are uncorrelated (their covariance is zero), then $X,Y$ need not be independent?
I have looked up two references, however, I am dissatisfied with both.
In Reference $1$, $X,Y$ are taken to be independent uniform RVs on $(0,1)$, and one constructs $Z = X+Y$, $W = X - Y$; the claim is then that $Z,W$ are uncorrelated but not independent. Unfortunately, finding the joint PDF of $(Z,W)$ is not trivial.
In Reference $2$, $\phi$ is taken to be a uniform RV on $(0, 2\pi)$, and one constructs $X = \cos(\phi)$, $Y = \sin(\phi)$. The claim is then that $X,Y$ are uncorrelated but not independent. Unfortunately, the PDFs of $X$ and $Y$ take the form of the rarely mentioned arcsine distribution.
I just wish to have an example at hand that I can whip out to show that uncorrelated does not necessarily imply independent. Is this doable?
Here's a (perhaps) simpler example. Let $X$ be $N(0,1)$ and $Y = X^2.$ Then $$ E(XY) = E(X^3) = 0 = E(X)E(Y)$$ (the odd moments of a standard normal vanish), so $X$ and $Y$ are uncorrelated, but clearly they aren't independent (if you know $X$, then you know $Y$).
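If you like, you can see this numerically with a quick Monte Carlo sketch (not part of the argument, just a sanity check; the sample size and the conditioning event $|X|<1$ are arbitrary choices):

```python
import random

random.seed(0)
n = 100_000
xs = [random.gauss(0, 1) for _ in range(n)]
ys = [x * x for x in xs]  # Y = X^2

# Sample covariance E(XY) - E(X)E(Y): should be close to 0.
mean_x = sum(xs) / n
mean_y = sum(ys) / n
cov = sum(x * y for x, y in zip(xs, ys)) / n - mean_x * mean_y

# Dependence: P(Y < 1 | |X| < 1) = 1 exactly, while
# unconditionally P(Y < 1) = P(|X| < 1) is only about 0.68.
p_y_lt_1 = sum(y < 1 for y in ys) / n
print(cov, p_y_lt_1)
```

The empirical covariance hovers near zero, yet knowing $X$ pins down $Y$ completely, so the two variables are as far from independent as possible.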
Two fair coins are tossed independently; the first has sides labelled $0$ and $1,$ the second has sides labelled $1$ and $-1.$ Let $X$ be the number that comes up on the first coin, and let $Y$ be the product of the two numbers that come up.
The variables $X$ and $Y$ are uncorrelated: since $XY=Y,$ $$E(XY)=E(Y)=0=\frac12\cdot0=E(X)E(Y).$$ The variables $X$ and $Y$ are not independent: $$P(X=0,Y=0)=P(X=0)=\frac12\ne\frac12\cdot\frac12=P(X=0)P(Y=0).$$
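Since the sample space here has only four equally likely outcomes, the two displayed computations can be checked exactly by enumeration (a small sketch; the variable names are just illustrative):

```python
from itertools import product
from fractions import Fraction

# The four equally likely outcomes: first coin x in {0,1},
# second coin s in {1,-1}; Y is the product x*s.
p = Fraction(1, 4)
pts = [(x, x * s) for x, s in product([0, 1], [1, -1])]

ex = sum(p * x for x, _ in pts)        # E(X) = 1/2
ey = sum(p * y for _, y in pts)        # E(Y) = 0
exy = sum(p * x * y for x, y in pts)   # E(XY) = E(Y) = 0
cov = exy - ex * ey                    # 0: uncorrelated

p_x0 = sum(p for x, _ in pts if x == 0)                 # 1/2
p_y0 = sum(p for _, y in pts if y == 0)                 # 1/2
p_x0_y0 = sum(p for x, y in pts if x == 0 and y == 0)   # 1/2
print(cov, p_x0_y0, p_x0 * p_y0)
```

The enumeration reproduces $P(X=0,Y=0)=\frac12\ne\frac14=P(X=0)P(Y=0)$ while the covariance is exactly zero.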
How about $(X,Y)$ taking values $(1,0)$, $(-1,0)$, $(0,1)$ and $(0,-1)$ each with probability $1/4$? Then $E(X)=E(Y)=0$ and $XY=0$, so the covariance is zero, but $X$ and $Y$ are not independent.
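This four-point distribution is small enough to verify exactly as well (again just a sketch of the computation in the answer):

```python
from fractions import Fraction

# The four equally likely support points of (X, Y).
pts = [(1, 0), (-1, 0), (0, 1), (0, -1)]
p = Fraction(1, 4)

ex = sum(p * x for x, y in pts)        # E(X) = 0
ey = sum(p * y for x, y in pts)        # E(Y) = 0
exy = sum(p * x * y for x, y in pts)   # E(XY) = 0 since xy = 0 at every point
cov = exy - ex * ey                    # covariance is zero

p_x0 = sum(p for x, y in pts if x == 0)                 # 1/2
p_y0 = sum(p for x, y in pts if y == 0)                 # 1/2
p_x0_y0 = sum(p for x, y in pts if x == 0 and y == 0)   # 0
print(cov, p_x0_y0, p_x0 * p_y0)
```

Here $P(X=0,Y=0)=0$ while $P(X=0)P(Y=0)=\frac14$, so $X$ and $Y$ are dependent even though their covariance vanishes.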