Computing the expectation of conditional variance in 2 ways
Let $X$ be a square integrable random variable on $(\Omega,\mathcal{F},P)$. Let $\mathcal{G}$ be a sub-$\sigma$-algebra of $\mathcal{F}$. Define the conditional variance of $X$ given $\mathcal{G}$ by $$\operatorname{var}(X\mid\mathcal{G})=\Bbb{E}[(X-\Bbb{E}[X\mid\mathcal{G}])^2\mid\mathcal{G}].$$ Prove the formula
$$\operatorname{var}(X)=\operatorname{var}(\Bbb{E}[X\mid\mathcal{G}])+\Bbb{E}[\operatorname{var}(X\mid\mathcal{G})]$$
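As a quick sanity check, the identity can also be verified numerically; the concrete setup below (a discrete label $Z$ generating $\mathcal{G}$, Gaussian noise, and all variable names) is only an illustrative sketch, not part of the problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# Illustrative setup (my own choice): G is generated by a discrete label Z,
# and X = Z + Gaussian noise.
n = 1_000_000
z = rng.integers(0, 3, size=n)            # Z takes values 0, 1, 2
x = z.astype(float) + rng.standard_normal(n)

# E[X | G] and var(X | G) are functions of Z: the group-wise mean and variance.
group_mean = np.array([x[z == k].mean() for k in range(3)])
group_var = np.array([x[z == k].var() for k in range(3)])
cond_mean = group_mean[z]                 # E[X | G], evaluated pathwise
cond_var = group_var[z]                   # var(X | G), evaluated pathwise

lhs = x.var()                             # var(X)
rhs = cond_mean.var() + cond_var.mean()   # var(E[X|G]) + E[var(X|G)]
print(lhs, rhs)                           # the two numbers agree up to rounding
```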
1. I don't understand part of the presented solution.
2. Is my attempt also correct?
3. Alternate solutions?
1. I don't understand this part of the solution, wherein
$$\operatorname{var}(X\mid \mathcal{G})=\mathbb{E}[(X-\mathbb{E}[X\mid\mathcal{G}])^2 \mid \mathcal{G}]=\mathbb{E}[X^2\mid\mathcal{G}]-\mathbb{E}[X\mid\mathcal{G}]^2.$$
I think it is making use of the fact that $\mathbb{E}[X\mid\mathcal{G}]$ is $\mathcal{G}$-measurable in saying
$$\mathbb{E}\big[X\,\mathbb{E}[X\mid\mathcal{G}]\mid\mathcal{G}\big] = \mathbb{E}[X\mid\mathcal{G}]\,\mathbb{E}[X\mid\mathcal{G}] = \mathbb{E}[X\mid\mathcal{G}]^2,$$
but isn't $\mathbb{E}[X\mid\mathcal{G}]$ supposed to be bounded for that? How can this be shown?
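(Spelled out, the step in question is the expansion
$$\mathbb{E}[(X-\mathbb{E}[X\mid\mathcal{G}])^2\mid\mathcal{G}]=\mathbb{E}[X^2\mid\mathcal{G}]-2\,\mathbb{E}\big[X\,\mathbb{E}[X\mid\mathcal{G}]\mid\mathcal{G}\big]+\mathbb{E}[X\mid\mathcal{G}]^2,$$
which uses only linearity and the fact that $\mathbb{E}[X\mid\mathcal{G}]^2$ is $\mathcal{G}$-measurable; the middle term is where the question arises.)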
My attempt:
$E(X^2) < \infty$
$\to E(X) < \infty$
$\to E(X\mid\mathcal G) < \infty$, but I don't think this step is correct.
2. My own attempt: I tried evaluating $$\mathbb{E}(\operatorname{var}(X\mid\mathcal{G}))=\mathbb{E}\big(\mathbb{E}[(X-\mathbb{E}[X\mid\mathcal{G}])^2\mid\mathcal{G}]\big).$$
Is it correct to say that $\mathbb{E}\big(\mathbb{E}[(X-\mathbb{E}[X\mid\mathcal{G}])^2\mid\mathcal{G}]\big) = \mathbb{E}\big[(X-\mathbb{E}[X\mid\mathcal{G}])^2\big]$?
After expanding the square I encounter $\mathbb{E}\big[X\,\mathbb{E}[X\mid\mathcal G]\big]$ and rewrite
$$\begin{align*}
\mathbb{E}\big[X\,\mathbb{E}[X\mid\mathcal G]\big] &= \mathbb{E}\big[\mathbb{E}[X\,\mathbb{E}[X\mid\mathcal G]\mid\mathcal G]\big] \\
&= \mathbb{E}\big[\mathbb{E}[X\mid\mathcal G]\,\mathbb{E}[X\mid\mathcal G]\big] \quad (*) \\
&= \mathbb{E}\big[\mathbb{E}[X\mid\mathcal G]^2\big].
\end{align*}$$
(*) Assuming this is all correct, I am still unsure about this step: is $\mathbb{E}[X\mid\mathcal G]$ really bounded?
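For reference, granting the step (*), the computation after expanding the square reads
$$\mathbb{E}\big[(X-\mathbb{E}[X\mid\mathcal{G}])^2\big]=\mathbb{E}[X^2]-2\,\mathbb{E}\big[X\,\mathbb{E}[X\mid\mathcal{G}]\big]+\mathbb{E}\big[\mathbb{E}[X\mid\mathcal{G}]^2\big]=\mathbb{E}[X^2]-\mathbb{E}\big[\mathbb{E}[X\mid\mathcal{G}]^2\big].$$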
3. Is there a way to go about this without assuming $\mathbb{E}[X\mid\mathcal G]$ is bounded?
4. Is the presented solution wrong? It's from Stochastic Calculus class notes, though the problem is for an Advanced Probability class (and, oddly, the notes don't say anything about boundedness).
Solution 1:
- What is used in this solution is called the pull-out property, and it works for any integrable random variable (it does not need to be bounded):
Let $X \in L^1$ and $Y \in L^1(\mathcal{G})$ (i.e. integrable and $\mathcal{G}$-measurable) such that $X \cdot Y \in L^1$. Then $$\mathbb{E}(X Y \mid \mathcal{G}) = Y \cdot \mathbb{E}(X \mid \mathcal{G}).$$
Proof: Let us assume that we know the result for bounded $Y$, i.e. if there exists $n$ such that $|Y(\omega)| \leq n$ for all $\omega \in \Omega$. For $Y \in L^1(\mathcal{G})$, we define approximations $$Y_n := (-n) \vee Y \wedge n.$$ Then, by definition, $Y_n$ is $\mathcal{G}$-measurable and bounded. Moreover, $Y_n \to Y$ almost surely and $|XY_n| \leq |XY|$. Therefore, it follows from the (conditional) dominated convergence theorem that $$\begin{align*} \mathbb{E}(X \cdot Y \mid \mathcal{G}) &= \lim_{n \to \infty} \mathbb{E}(X Y_n \mid \mathcal{G}) \\ &= \lim_{n \to \infty} Y_n \mathbb{E}(X \mid \mathcal{G}) \\ &= Y \mathbb{E}(X \mid \mathcal{G}).\end{align*}$$ This finishes the proof.

Regarding your own thoughts: $\mathbb{E}(X^2) < \infty$ implies that $\mathbb{E}(X \mid \mathcal{G}) < \infty$ almost surely, but this is something different from $\mathbb{E}(X \mid \mathcal{G})$ being bounded. $\mathbb{E}(X \mid \mathcal{G})$ is bounded if, and only if, we can find $n \in \mathbb{N}$ such that $$|\mathbb{E}(X \mid \mathcal{G})(\omega)| \leq n$$ for all $\omega \in \Omega$ (the bound does not depend on $\omega$).
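For instance, if $X$ is standard normal and $\mathcal{G} = \sigma(X)$, then $\mathbb{E}(X \mid \mathcal{G}) = X$ almost surely: square integrable and finite everywhere, yet not bounded by any constant $n$.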
- First of all: Yes, the equality $$\mathbb{E}\big( \mathbb{E}(Y^2 \mid \mathcal{G}) \big) = \mathbb{E}(Y^2)$$ holds for any square integrable random variable $Y$ (by the tower property). Nevertheless, your attempt fails: You are trying to show that the expectations of the random variables $\operatorname{var}(X \mid \mathcal{G})$ and $\operatorname{var}(\mathbb{E}(X \mid \mathcal{G}))+\mathbb{E}(\operatorname{var}(X \mid \mathcal{G}))$ coincide. But this does not imply that the random variables are the same.
- Yes, see the first part.