Is this condition enough to determine a random variable?

For two positive random variables $X,Y$, it is known that even if $E[X^r]=E[Y^r]$ holds for every $r\in\mathbb{N}$, the cdfs of $X$ and $Y$ can still be different.

Then it occurred to me: what about changing $r\in \mathbb{N}$ to $r\in \mathbb{R}_{>0}$?

Is this enough to determine an r.v.?


Your condition does indeed determine the distribution uniquely if we assume $E[X^{r_0}] < \infty$ for some $r_0 > 0$. Otherwise, it is easy to find counterexamples, since the condition $E[X^r] = \infty = E[Y^r]$ for all $r>0$ does not tell us much. Actually, as the proof below shows, we then only need $E[X^r] = E[Y^r]$ for $0<r<r_0$.
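As a concrete illustration (the specific example is mine, not part of the question or the proof): in Heyde's classical log-normal construction, the densities $f(x)$ and $f(x)\,\bigl(1+\varepsilon\sin(2\pi\ln x)\bigr)$ have exactly the same integer moments, while their $r$-th moments differ by $\varepsilon$ times $$ I(r)=\int_{\Bbb R} e^{rt}\,\varphi(t)\sin(2\pi t)\,{\rm d}t = e^{(r^2-4\pi^2)/2}\sin(2\pi r) $$ (with $\varphi$ the standard normal density; substitute $t = \ln x$), which vanishes for every $r\in\Bbb N$ but not, e.g., for $r = 1/4$. A short numerical check:

```python
import math
import numpy as np

# Perturbing the standard log-normal density f(x) by the factor
# 1 + eps*sin(2*pi*ln(x)) shifts the r-th moment by eps * I(r), where
#   I(r) = ∫ e^{r t} φ(t) sin(2π t) dt = exp((r² - 4π²)/2) * sin(2π r),
# with φ the standard normal density (substitute t = ln x to see this).
t = np.linspace(-15.0, 18.0, 400_001)
phi = np.exp(-t**2 / 2) / math.sqrt(2 * math.pi)

def moment_shift(r):
    """Trapezoidal-rule approximation of I(r)."""
    y = np.exp(r * t) * phi * np.sin(2 * np.pi * t)
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(t))

print(moment_shift(1))     # integer r: 0 (up to roundoff)
print(moment_shift(3))     # integer r: 0 (up to roundoff)
print(moment_shift(0.25))  # fractional r: nonzero, ≈ 2.8e-9
```

So the integer moments cannot distinguish the two laws, but the fractional moments can — which is exactly the gap that requiring $E[X^r]=E[Y^r]$ for real $r>0$ closes.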

Under the above assumption, define $$ \Phi : \{z \in \Bbb{C} \,\mid \, 0 < {\rm{Re}}(z) < r_0 /2 \} \to \Bbb{C}, z \mapsto E[X^{z}] = E[e^{z \cdot \ln X}]. $$

Using differentiation under the integral sign as in Difference of differentiation under integral sign between Lebesgue and Riemann, together with the estimate \begin{eqnarray*} \left|\frac{{\rm d}}{{\rm d}z}e^{z\cdot\ln X}\right| & = & \left|\ln X\cdot e^{z\cdot\ln X}\right|\\ & = & \left|\ln X\right|\cdot e^{{\rm Re}\left(z\right)\cdot\ln X}\\ & = & \left|\ln X\right|\cdot X^{{\rm Re}\left(z\right)}\\ & \leq & \begin{cases} \ln X\cdot X^{r_{0}/2}\leq C_{r_{0}}\cdot X^{r_{0}}, & \text{if }X\geq1\text{ (using }{\rm Re}(z)<r_{0}/2\text{)},\\ \left|\ln X\right|\cdot X^{1/n}=\left|f_{n}\left(X\right)\right|\leq C_{n}, & \text{if }X<1\text{ (using }{\rm Re}(z)>1/n\text{)}, \end{cases} \end{eqnarray*} which is valid on each of the sets $M_n := \{z \in \Bbb{C} \,\mid\, 1/n < {\rm Re}(z) < r_0/2\}$ and furnishes an integrable dominating function (recall $E[X^{r_0}] < \infty$), we see that $\Phi$ is holomorphic on each $M_n$, and hence on its domain, which equals $\bigcup_n M_n$.

In the estimate above, I used the bound $\ln x \leq C_{r_0}\, x^{r_0/2}$, which is valid for some constant $C_{r_0}$ and all $x \geq 1$, since the logarithm grows more slowly than any positive power of $x$. Furthermore, I used that the function $$ f_n : [0,1] \to \Bbb{R}, x \mapsto \begin{cases} x^{1/n} \cdot \ln x, & x > 0, \\ 0,& x=0\end{cases} $$ is continuous (and hence bounded by some constant $C_n$) by L'Hospital, since $x^{1/n} \cdot \ln x = \frac{\ln x}{x^{-1/n}}$ with $x^{-1/n} \to \infty$ and $\ln x \to -\infty$ as $x \downarrow 0$, and $$ \frac{\frac{{\rm d}}{{\rm d}x}\ln x}{\frac{{\rm d}}{{\rm d}x}x^{-1/n}}=\frac{\frac{1}{x}}{\left(-1/n\right)\cdot x^{-1-1/n}}=-n\cdot x^{1/n}\to0 \text{ as } x \downarrow 0. $$
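In fact, the constant can be made explicit: substituting $u = -\ln x$ turns $|f_n(x)| = x^{1/n}\,|\ln x|$ into $g(u) = u\,e^{-u/n}$ on $[0,\infty)$, which is maximized at $u = n$, so one may take $C_n = n/e$. A small numerical confirmation (the grid choice is mine):

```python
import math
import numpy as np

# Claim: sup_{0 < x <= 1} x**(1/n) * |ln x| = n/e, attained at x = exp(-n).
# With u = -ln(x), this is the maximum of g(u) = u * exp(-u/n) over u >= 0.
for n in range(1, 5):
    u = np.linspace(0.0, 20.0 * n, 2_000_001)
    g = u * np.exp(-u / n)
    print(n, g.max(), n / math.e)  # grid maximum vs. claimed value n/e
```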

Using the same arguments, we see that the analogous function with $Y$ in place of $X$ is also holomorphic. By the identity theorem (https://en.wikipedia.org/wiki/Identity_theorem) for holomorphic functions (note that the domain of $\Phi$ is connected and that the two functions coincide, by assumption, on the set $(0, r_0/2)$, which has an accumulation point in the domain), this implies $E[X^z] = E[Y^z]$ for all $z \in \Bbb{C}$ with $0 < {\rm Re}(z) < r_0 /2$.

By dominated convergence (with dominating function $\max\{1, X^{r_0/2}\}$, which is integrable since $X^{r_0/2} \leq 1 + X^{r_0}$), it is easy to see that this extends to ${\rm Re}(z) = 0$. Thus, $$ E[e^{2\pi i t \cdot \ln X}]=E[X^{2\pi i t}] = E[Y^{2\pi i t}] = E[e^{2 \pi i t \cdot \ln Y}] $$ holds for all $t \in \Bbb{R}$, so that the characteristic functions (Fourier transforms) of $\ln X$ and $\ln Y$ agree. Hence, the distributions of $\ln X$ and $\ln Y$ are identical, and since $X, Y > 0$, so are those of $X$ and $Y$.
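As a sanity check of this boundary identity in a case where everything is explicit (the log-normal example is my choice, not part of the argument): for $X = e^Z$ with $Z \sim \mathcal{N}(0,1)$, we have $E[X^{2\pi i t}] = E[e^{2\pi i t \cdot Z}] = e^{-(2\pi t)^2/2}$, the Gaussian characteristic function, which a direct numerical evaluation reproduces:

```python
import numpy as np

# For X = e^Z with Z ~ N(0,1): E[X^{2πit}] = E[e^{2πit·Z}] = exp(-(2πt)²/2).
z = np.linspace(-12.0, 12.0, 400_001)
phi = np.exp(-z**2 / 2) / np.sqrt(2 * np.pi)

def char_log_X(t):
    """Trapezoidal-rule approximation of E[exp(2πit * Z)]."""
    y = np.exp(2j * np.pi * t * z) * phi
    return np.sum(0.5 * (y[1:] + y[:-1]) * np.diff(z))

for t in (0.0, 0.05, 0.1):
    print(t, char_log_X(t).real, np.exp(-(2 * np.pi * t) ** 2 / 2))
```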