Is there a counterexample showing that, for two random variables $X,Y$, we can have $E(X\mid Y)=E(X)$ while $X$ and $Y$ are not independent?
I already know that if $X,Y$ are independent, then $E(X\mid Y)=E(X)$.
Solution 1:
\begin{align} X & = \begin{cases} -1 \\ \phantom{-}0 & \text{each with probability } 1/3 \\ +1 \end{cases} \\[10pt] Y & = X^2 \end{align} Then $\operatorname E(X\mid Y=0) = 0$ and $\operatorname E(X\mid Y=1) = 0,$ so $\operatorname E(X\mid Y) = 0$ with probability $1.$ But $X$ and $Y$ are very far from independent.
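As a sanity check, here is a short Python enumeration of this example (my own addition, not part of the argument): it computes $E(X \mid Y = y)$ from the joint pmf and exhibits the failure of independence at the point $(X,Y)=(0,0)$.

```python
# Exact check of Solution 1 by enumerating the joint pmf of X and Y = X^2.
from fractions import Fraction

pmf = {(-1, 1): Fraction(1, 3), (0, 0): Fraction(1, 3), (1, 1): Fraction(1, 3)}

for y in (0, 1):
    p_y = sum(p for (_, yy), p in pmf.items() if yy == y)
    e = sum(xx * p for (xx, yy), p in pmf.items() if yy == y) / p_y
    print(f"E(X | Y={y}) = {e}")          # both are 0 = E(X)

# Dependence: P(X=0, Y=0) = 1/3, but P(X=0) * P(Y=0) = 1/9.
p_x0 = sum(p for (xx, _), p in pmf.items() if xx == 0)
p_y0 = sum(p for (_, yy), p in pmf.items() if yy == 0)
print(pmf[(0, 0)], p_x0 * p_y0)           # 1/3 vs 1/9
```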
Solution 2:
Let $(X,Y)$ have a uniform density over the diamond with vertices $(0,1),(1,0),(0,-1),(-1,0)$, or over the unit circle.
Then $X,Y$ are not independent [*], but $E(X \mid Y) = E(X) = 0$.
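Here is a quick Monte Carlo sketch of the diamond case (my own addition; the sample size, seed, and slice width are arbitrary choices), using rejection sampling from the enclosing square:

```python
# Rejection-sampling check for the uniform density on the diamond |x|+|y| <= 1.
import numpy as np

rng = np.random.default_rng(0)
pts = rng.uniform(-1, 1, size=(10**6, 2))
pts = pts[np.abs(pts[:, 0]) + np.abs(pts[:, 1]) <= 1]  # keep points in the diamond
x, y = pts[:, 0], pts[:, 1]

# The conditional mean of X is ~0 in every horizontal slice of Y,
# consistent with E(X | Y) = E(X) = 0.
for lo in np.arange(-1.0, 1.0, 0.5):
    sel = (y >= lo) & (y < lo + 0.5)
    print(f"E(X | {lo:+.1f} <= Y < {lo + 0.5:+.1f}) ~ {x[sel].mean():+.4f}")

# Dependence: P(|X| > 1/2, |Y| > 1/2) = 0 inside the diamond,
# while P(|X| > 1/2) * P(|Y| > 1/2) = 1/16 > 0.
print(np.mean((np.abs(x) > 0.5) & (np.abs(y) > 0.5)),
      np.mean(np.abs(x) > 0.5) * np.mean(np.abs(y) > 0.5))
```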
For another example, you could take the triangle with vertices $(-1,0),(0,1),(1,0)$. In this case $E(X \mid Y) = E(X)$, but $E(Y \mid X) \ne E(Y)$.
Notice, by the way, that $E(X \mid Y) = E(X) \implies E(XY) = E(X)E(Y)$ (i.e., $X$ and $Y$ are uncorrelated), but not conversely: by the tower property, $E(XY) = E\bigl(E(XY \mid Y)\bigr) = E\bigl(Y \, E(X \mid Y)\bigr) = E\bigl(Y \, E(X)\bigr) = E(X)E(Y)$.
[*] If the marginals have support on finite intervals, then the support of the joint density must be a rectangle for the variables to be independent. Put another way, if $f_{X,Y}(x_0,y_0)=0$ for some point while $f_X(x_0)>0$ and $f_Y(y_0)>0$, then the variables cannot be independent.
Solution 3:
Consider a two-component mixture Gaussian model with \begin{align} & X \mid Y = 0 \sim N(0, 1), \\ & X \mid Y = 1 \sim N(0, 4), \\ & P(Y = 0) = P(Y = 1) = 1/2. \end{align}
Clearly $E(X \mid Y = 0) = E(X \mid Y = 1) = E(X) = 0$. But $X$ and $Y$ are not independent: on one hand, $$P(-1 \leq X \leq 1, Y = 0) = P(-1 \leq X \leq 1 \mid Y = 0) P(Y = 0) = \Phi(1) - 0.5.$$ On the other hand, \begin{align} P(-1 \leq X \leq 1) & = P(-1 \leq X \leq 1 \mid Y = 0)P(Y = 0) + P(-1 \leq X \leq 1 \mid Y = 1)P(Y = 1) \\ &= \Phi(1) - 0.5 + \Phi(0.5) - 0.5 = \Phi(1) + \Phi(0.5) - 1. \end{align} Therefore $$P(-1 \leq X \leq 1)P(Y = 0) = 0.5\Phi(1) + 0.5\Phi(0.5) - 0.5 \neq \Phi(1) - 0.5 = P(-1 \leq X \leq 1, Y = 0),$$ i.e., $X$ and $Y$ are not independent.
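If a numerical confirmation helps, here is a small simulation sketch (my addition; the sample size and seed are arbitrary) that reproduces these values:

```python
# Simulation sketch of the two-component Gaussian mixture above.
import numpy as np
from math import erf, sqrt

Phi = lambda z: 0.5 * (1 + erf(z / sqrt(2)))        # standard normal CDF

rng = np.random.default_rng(0)
n = 10**6
y = rng.integers(0, 2, size=n)                      # P(Y=0) = P(Y=1) = 1/2
x = rng.normal(0.0, np.where(y == 0, 1.0, 2.0))     # sd 1 if Y=0, sd 2 if Y=1

print(x[y == 0].mean(), x[y == 1].mean())           # both near 0, so E(X|Y) ~ 0

in_band = (x >= -1) & (x <= 1)
print(np.mean(in_band & (y == 0)), Phi(1) - 0.5)    # ~0.3413 on both sides
print(np.mean(in_band) * np.mean(y == 0),
      0.5 * Phi(1) + 0.5 * Phi(0.5) - 0.5)          # ~0.2664, so not independent
```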
Per Clement's request, let me add more details to show that the random variable $E(X \mid Y)$ is actually the constant $0$. In fact, as a random variable, $Z(\omega) = E[X \mid Y](\omega)$ is constant on each of the sets $\{\omega: Y(\omega) = 1\}$ and $\{\omega: Y(\omega) = 0\}$, with values $E(X \mid Y = 1)$ and $E(X \mid Y = 0)$ respectively (this is a general result that holds for any conditional expectation of the form $E[X \mid Y]$; see, for example, Theorem 9.1.2 of *A Course in Probability Theory*). As both values are $0$ and the union of these two sets is the whole sample space $\Omega$, it follows that $E[X \mid Y] \equiv 0$.
To avoid the above technicalities, which require more advanced probability theory, you may construct a discrete counterexample using the same idea. How about a $2$-by-$4$ table? The columns are the possible values of $X$ and the rows are the possible values of $Y$:
\begin{array}{c|c c c c} & -1 & 1 & -2 & 2 \\ \hline 0 & 1/4 & 1/4 & 0 & 0 \\ 1 & 0 & 0 & 1/4 & 1/4 \end{array}
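And a direct check of this table in the same style (my addition; redundant with the hand computation, but it makes both claims concrete):

```python
# Direct check of the table: rows are Y in {0, 1}, columns are X in {-1, 1, -2, 2}.
from fractions import Fraction

q = Fraction(1, 4)
joint = {(0, -1): q, (0, 1): q, (1, -2): q, (1, 2): q}  # keys are (y, x)

for y in (0, 1):
    p_y = sum(p for (yy, _), p in joint.items() if yy == y)
    e = sum(xx * p for (yy, xx), p in joint.items() if yy == y) / p_y
    print(f"E(X | Y={y}) = {e}")          # both 0 = E(X)

# Dependence: P(X=-1, Y=1) = 0 but P(X=-1) * P(Y=1) = 1/8.
p_xm1 = sum(p for (_, xx), p in joint.items() if xx == -1)
p_y1 = sum(p for (yy, _), p in joint.items() if yy == 1)
print(joint.get((1, -1), Fraction(0)), p_xm1 * p_y1)
```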