Does "independence" of moments imply independence?

Suppose you have two random variables $X,Y$ and you are given that, for all nonnegative integers $m,n$,

$$E(X^n Y^m) = E(X^n)E(Y^m)$$

Does this imply that $X$ and $Y$ are independent? Can some condition on how fast the moments grow be added to make this work?

Attempt at solution: I know that if the characteristic functions split like $E(e^{i(X,Y)\cdot(s,t)}) = E(e^{iXs})E(e^{iYt})$ (where $\cdot$ is the dot product), then the random variables are independent. I would try to approximate this by the moments; however, I suspect a condition that the moments don't grow too fast is needed to make this work.


The answer is yes if the series $$\sum_{m=1}^{+\infty}\frac{t^m}{m!}\mathbb E|X|^m\quad\mbox{ and }\quad\sum_{n=1}^{+\infty}\frac{t^n}{n!}\mathbb E|Y|^n$$ converge for every $t$ (this follows from the dominated convergence theorem, which gives the splitting equality mentioned in the OP). This is in particular the case when $X$ and $Y$ are bounded.
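To make the splitting step explicit: expand the exponential and interchange expectation with the double sum (this interchange is exactly where dominated convergence and the convergence of the two series above are used), then apply the moment factorization term by term:

$$\mathbb E\, e^{i(sX+tY)} = \sum_{n,m\ge 0}\frac{(is)^n(it)^m}{n!\,m!}\,\mathbb E(X^nY^m) = \left(\sum_{n\ge 0}\frac{(is)^n}{n!}\,\mathbb E X^n\right)\left(\sum_{m\ge 0}\frac{(it)^m}{m!}\,\mathbb E Y^m\right) = \mathbb E\, e^{isX}\,\mathbb E\, e^{itY},$$

which is precisely the factorization of the characteristic functions that characterizes independence.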


The general answer to your question is no: "moment independence" does not imply independence of the random variables involved. Writing the condition in terms of densities (assumed here to exist), $$ E(X^n Y^m) = E(X^n)E(Y^m) \Rightarrow \int_{S_x}\int_{S_y}x^ny^mf_{XY}(x,y)dydx = \int_{S_x}x^nf_X(x)dx\int_{S_y}y^mf_Y(y)dy $$

$$\Rightarrow \int_{S_x}\int_{S_y}x^ny^m\left[f_{XY}(x,y)-f_X(x)f_Y(y)\right]dydx =0$$

In principle, this iterated integral can be zero while $f_{XY}(x,y)-f_X(x)f_Y(y)\neq 0$ at the same time.

The usual difficulty is to exhibit specific random variables that have such "independence" traits without actually being independent; still, whether or not a handy example is at hand does not change the result.
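A concrete counterexample is Heyde's classical lognormal construction: if $f$ is the standard lognormal density, then $\int_0^\infty x^n f(x)\sin(2\pi\ln x)\,dx = 0$ for every integer $n\ge 0$, so the joint density $f(x)f(y)\bigl(1+\sin(2\pi\ln x)\sin(2\pi\ln y)\bigr)$ has $E(X^nY^m)=E(X^n)E(Y^m)$ for all $n,m$, yet $X$ and $Y$ are dependent. A minimal numerical sanity check of the vanishing integral (pure Python; the helper name and the midpoint-rule parameters are my own choices):

```python
import math

# After the substitution x = e^u, the claimed-zero integral becomes
#     I(n) = ∫ e^{n u} φ(u) sin(2πu) du,
# where φ is the standard normal density. Heyde's observation is that
# I(n) = 0 exactly for every integer n, so the perturbed joint density
# f(x) f(y) (1 + sin(2π ln x) sin(2π ln y))  -- nonnegative, since the
# perturbation factor lies in [0, 2] -- leaves all mixed moments factored.

def perturbation_integral(n, lo=-12.0, hi=18.0, steps=600_000):
    """Midpoint-rule evaluation of ∫ e^{n u} φ(u) sin(2πu) du."""
    h = (hi - lo) / steps
    total = 0.0
    for k in range(steps):
        u = lo + (k + 0.5) * h
        total += math.exp(n * u - 0.5 * u * u) * math.sin(2 * math.pi * u)
    return total * h / math.sqrt(2 * math.pi)

for n in range(4):
    moment = math.exp(n * n / 2)  # n-th lognormal moment E[X^n]
    # Ratio of the perturbation integral to the n-th moment:
    # numerically zero, up to midpoint-rule error.
    print(n, perturbation_integral(n) / moment)
```

The integration window $[-12, 18]$ comfortably covers the Gaussian bump centered at $u=n$ for the small $n$ checked here; the ratios come out at roughly the $10^{-7}$ level or below, consistent with the exact value $0$.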