How can I show that the conditional expectation $E(X\mid X)=X$?
I tried to show that $E(X\mid X=x)=x$, which would lead me to get $E(X\mid X)=X$ but I am having trouble doing so. I know that the definition of conditional expectation (continuous case) is: $$E(X\mid Y=y)=\int_{-\infty}^{\infty}x f_{X\mid Y}(x\mid y)\,dx$$ This led me to write: $$E(X\mid X=x)=\int_{-\infty}^{\infty}x f_{X\mid X}(x\mid x)\,dx$$ but I don't know what $f_{X\mid X}(x\mid x)$ is (if that's even a valid notation...)
I should note, though, that I don't know anything about $\sigma$-fields or Borel sets, which someone once tried to explain to me. Is it possible to prove this in an "elementary" way?
Solution 1:
It is actually easier to define $E(X\mid Y)$ than $E(X\mid Y=y)$, so let us recall how this is done. By definition, $E(X\mid Y)$ is any random variable $Z$ such that:
$\qquad$ (1.) $Z$ is a measurable function of $Y$.
$\qquad$ (2.) For every bounded random variable $v(Y)$, $E(Zv(Y))=E(Xv(Y))$.
It happens that when $X$ is integrable, such a random variable $Z$ exists and is almost surely unique, in the sense that if $Z$ and $Z'$ both satisfy conditions (1.) and (2.), then $P(Z=Z')=1$.
When $X=Y$, things are quite simple: $Z=X$ satisfies (1.) because $X$ is trivially a measurable function of itself (take the identity function), and it satisfies (2.) because the two sides of $E(Zv(X))=E(Xv(X))$ are then literally the same quantity. Thus, $E(X\mid X)=X$.
Note finally that $Z=X$ always satisfies (2.) but does not satisfy (1.) in general, and, likewise, that $Z=E(X)$ always satisfies (1.) but does not satisfy (2.) in general.
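To see condition (2.) fail for $Z=E(X)$ in a concrete case, here is a minimal Monte Carlo sketch. The specific choices ($X$ standard normal, $Y=X$, and the bounded test function $v(y)=\tanh y$) are my own, picked only for illustration:

```python
import numpy as np

# Take X standard normal and condition on Y = X itself.
rng = np.random.default_rng(0)
x = rng.standard_normal(1_000_000)

# A bounded test function v: here v(Y) = tanh(X), with |v| <= 1.
v = np.tanh(x)

# Candidate Z = X: condition (2.) holds trivially, since E(Z v(Y))
# and E(X v(Y)) are the same expression.
lhs_X = np.mean(x * v)               # ~ E(X tanh X), which is positive

# Candidate Z = E(X) = 0 (a constant, so it satisfies (1.)): condition
# (2.) fails, because E(E(X) v(X)) = E(X) E(v(X)) = 0 while E(X v(X)) > 0,
# as X and tanh(X) are positively correlated.
lhs_const = np.mean(x) * np.mean(v)  # ~ E(X) E(tanh X), essentially 0

print(lhs_X)
print(lhs_const)
```

So the constant candidate is "measurable enough" but reproduces the wrong expectations against bounded functions of $Y$, which is exactly why both conditions are needed.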