Conditional expectation for a sum of iid random variables: $E(\xi\mid\xi+\eta)=E(\eta\mid\xi+\eta)=\frac{\xi+\eta}{2}$
I don't really know how to start proving this statement.
Let $\xi$ and $\eta$ be independent, identically distributed random variables with $E(|\xi|)$ finite.
Show that $E(\xi\mid\xi+\eta)=E(\eta\mid\xi+\eta)=\frac{\xi+\eta}{2}$.
Does anyone have an idea for how to get started?
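Not a proof, but it can help to see the claim numerically first. The following Python sketch (the exponential distribution is an arbitrary choice; any integrable distribution should behave the same) bins on the value of $\xi+\eta$ and compares the conditional average of $\xi$ with half the sum:

```python
import numpy as np

# Numerical sanity check of the claim (not a proof): draw iid
# exponential xi, eta, bin on s = xi + eta, and compare the average
# of xi within each bin against s/2.
rng = np.random.default_rng(0)
n = 10**6
xi = rng.exponential(size=n)
eta = rng.exponential(size=n)
s = xi + eta

bins = np.linspace(0, 6, 13)      # bins for the value of xi + eta
idx = np.digitize(s, bins)
for k in range(1, len(bins)):
    mask = idx == k
    if mask.sum() > 1000:         # skip sparsely populated bins
        print(f"s in [{bins[k-1]:.1f}, {bins[k]:.1f}): "
              f"E[xi | s] ~ {xi[mask].mean():.3f}, "
              f"s/2 ~ {s[mask].mean()/2:.3f}")
```

Within each bin the two columns agree to within sampling error, which is exactly the statement to be proved.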
Solution 1:
There's a subtle point here, which bothered me the first time I saw this problem.
Henry's answer has the essential idea, which is to use symmetry. User Did's comment points out that the symmetry comes from the fact that $(\xi, \eta)$ and $(\eta, \xi)$ are identically distributed. But, straight from the definition of conditional expectation, it isn't clear that symmetry in the joint distributions is enough to get the result. I ended up having to prove the following lemma:
Lemma. Let $X,Y$ be random variables. There is a measurable function $f$ such that $E[X\mid Y] = f(Y)$ a.s. Moreover, if $(X', Y')$ is identically distributed to $(X,Y)$, then $E[X' \mid Y'] = f(Y')$ a.s. for the same function $f$.
Proof. The existence of $f$ is a consequence of the Doob-Dynkin Lemma. For the second part, we use the definition of conditional expectation. $f(Y')$ is clearly $\sigma(Y')$-measurable, so it remains to show that for any $A \in \sigma(Y')$, we have $E[1_A f(Y')] = E[1_A X']$. Since $A \in \sigma(Y')$, $A = (Y')^{-1}(B)$ for some Borel set $B$ (this fact is part of the proof of Doob-Dynkin). But since $(X',Y')$ has the same distribution as $(X,Y)$, we get $$\begin{align*} E[1_A f(Y')] &= E[1_B(Y') f(Y')] \\ &= E[1_B(Y) f(Y)] \\ &= E[1_B(Y) E[X \mid Y]] \\ &= E[1_B(Y) X] && \text{since $1_B(Y)$ is $\sigma(Y)$-measurable}\\ &= E[1_B(Y') X'] \\ &= E[1_A X'] \end{align*}$$ as desired.
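To make the lemma concrete, here is a small finite-space check (a toy example of my own, not part of the proof): $(X', Y')$ is obtained from $(X, Y)$ by permuting six equally likely outcomes, so the two pairs have the same joint distribution while living on different outcomes, and the same function $f$ recovers both conditional expectations.

```python
import numpy as np

# Omega has 6 equally likely outcomes. (X', Y') is (X, Y) composed
# with a permutation of Omega, so the joint distributions agree.
X = np.array([0, 0, 1, 1, 2, 2])
Y = np.array([1, 1, 1, 2, 2, 2])
perm = np.array([3, 5, 0, 2, 4, 1])   # any permutation preserves P
Xp, Yp = X[perm], Y[perm]

# f(y) = E[X | Y = y], computed from the joint distribution of (X, Y).
f = {y: X[Y == y].mean() for y in set(Y)}

# E[X' | Y' = y], computed directly from (X', Y'), agrees with f(y).
for y in set(Yp):
    direct = Xp[Yp == y].mean()
    print(y, direct, f[y], np.isclose(direct, f[y]))
```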
It is worth noting that the function $f$ is generally not unique. In particular, we could modify $f$ almost arbitrarily on any set $C \subset \mathbb{R}$ such that $P(Y \in C)=0$.
Also, to address the point in kkk's comment: Just knowing that $\xi, \eta$ are identically distributed is not sufficient. Here is a counterexample. Let $\Omega = \{a,b,c\}$ have three outcomes, each with probability $1/3$ (and $\mathcal{F} = 2^\Omega$). Let $X(a) = 0$, $X(b)=1$, $X(c)=2$; and $Y(a)=1$, $Y(b)=2$, $Y(c)=0$. Thus $X$ is uniformly distributed on $\{0,1,2\}$, and $Y = X + 1 \bmod 3$, so $Y$ is also uniformly distributed on $\{0,1,2\}$ (but $X$ and $Y$ are not independent).
Now we have $(X+Y)(a) = 1$, $(X+Y)(b)=3$, $(X+Y)(c)=2$. So $X+Y$ is a 1-1 function on $\Omega$ and thus $\sigma(X+Y) = \mathcal{F}$, so both $X,Y$ are $\sigma(X+Y)$-measurable. Thus $E[X\mid X+Y]=X$, $E[Y\mid X+Y]=Y$. However, $X$, $Y$, and $\frac{X+Y}{2}$ are all different.
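The counterexample is small enough to check directly; this sketch computes $E[X \mid X+Y]$ and $E[Y \mid X+Y]$ by conditioning on each value of the sum:

```python
import numpy as np

# Direct check of the three-point counterexample. Outcomes a, b, c
# are indices 0, 1, 2, each with probability 1/3.
X = np.array([0, 1, 2])
Y = np.array([1, 2, 0])
S = X + Y                       # takes values 1, 3, 2: injective

# S is 1-1 on Omega, so conditioning on S recovers X and Y exactly,
# and neither equals S/2.
for s in S:
    mask = S == s
    print(f"S={s}: E[X|S]={X[mask].mean()}, "
          f"E[Y|S]={Y[mask].mean()}, S/2={s/2}")
```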
Solution 2:
I state the result in full and prove it in detail.
Proposition: Let $(\Omega,\mathcal{F},P)$ be a probability space. Let $X,Y$ be i.i.d. random variables with $E\left[|X|\right]<\infty$. Let $\mathcal{G}=\sigma(X+Y)$. Then $\operatorname{E} \left[X \mid \mathcal{G}\right] = \operatorname{E} \left[Y \mid \mathcal{G}\right]=\frac{1}{2}(X+Y)$.
Proof: Let $\mu_{XY}$ be the joint distribution measure on $\mathbb{R}^{2}$ induced by $(X,Y)$. That is, $\mu_{XY}(B)=P\left(\left\{ \omega \mid (X(\omega), Y(\omega)) \in B\right\} \right)$. Let $\mu_X$ and $\mu_Y$ be the distribution measures on $\mathbb{R}$ induced by $X$ and $Y$ respectively. Since $X$ and $Y$ are independent, we have $\mu_{XY}=\mu_X\times\mu_Y$. Moreover, since $X$ and $Y$ are identically distributed, $\mu_X=\mu_Y$. We denote $\mu=\mu_X=\mu_Y$.
Let $A\in\mathcal{G}$ be arbitrary. There exists a Borel set $B\subseteq\mathbb{R}$ such that $A=(X+Y)^{-1}(B)$. Hence $1_{A}(\omega)=1_{B}(X(\omega)+Y(\omega))$ for any $\omega\in\Omega$.
We have \begin{align} & \int_A \operatorname{E}\left[X\mid\mathcal{G}\right]\,dP = \int_A X\,dP=\int 1_B(X+Y)X \, dP = \int 1_B(x+y)x\,d\mu_{XY}(x,y) \\[10pt] = {} & \iint1_{B}(x+y)x\,d\mu_{X}(x) \, d\mu_Y(y) = \iint 1_B(x+y)x \, d\mu(x) \, d\mu(y), \end{align} where the first equality is the defining property of conditional expectation and the iterated integral is justified by Fubini's theorem, since $\operatorname{E}\left[|X|\right]<\infty$. By the same argument, $$ \int_A \operatorname{E}\left[Y\mid\mathcal{G}\right]\,dP=\iint1_{B}(x+y)y \, d\mu(x) \, d\mu(y). $$ Swapping the names of the integration variables $x$ and $y$ (again by Fubini's theorem) turns one double integral into the other, because $1_B(x+y)$ is symmetric in $x$ and $y$ and both integrals are taken against the same product measure $\mu\times\mu$. Hence $$ \int_A \operatorname{E}\left[X\mid\mathcal{G}\right]\,dP=\int_A \operatorname{E} \left[Y\mid\mathcal{G}\right] \,dP $$ for every $A\in\mathcal{G}$, and since both conditional expectations are $\mathcal{G}$-measurable, $\operatorname{E} \left[X \mid \mathcal{G}\right] = \operatorname{E}\left[Y \mid \mathcal{G}\right]$ a.s. Lastly, $\operatorname{E}\left[X+Y\mid\mathcal{G}\right]=X+Y$, since $X+Y$ is $\mathcal{G}$-measurable. By linearity, $2\operatorname{E}\left[X\mid\mathcal{G}\right]=X+Y$, that is, $\operatorname{E}\left[X\mid\mathcal{G}\right]=\operatorname{E} \left[Y \mid \mathcal{G} \right]=\frac 1 2 (X+Y)$.
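The crucial step is the equality of the two double integrals under the swap of $x$ and $y$. A small numerical check with a discrete $\mu$ (the support points, weights, and test set $B$ below are arbitrary choices of mine):

```python
import numpy as np

# Check the key symmetry step: for a discrete mu and a set B,
#   I_x = integral integral 1_B(x+y) x dmu(x) dmu(y)
#   I_y = integral integral 1_B(x+y) y dmu(x) dmu(y)
# coincide, since swapping the names of x and y maps one integral to
# the other while 1_B(x+y) and the product measure are symmetric.
vals = np.array([0.0, 1.0, 2.5, 4.0])    # support of mu (arbitrary)
w = np.array([0.1, 0.4, 0.3, 0.2])       # weights of mu, summing to 1
in_B = lambda s: (1.0 <= s) & (s < 4.0)  # an arbitrary Borel set B

xx, yy = np.meshgrid(vals, vals, indexing="ij")
ww = np.outer(w, w)                      # product measure mu x mu
I_x = np.sum(in_B(xx + yy) * xx * ww)
I_y = np.sum(in_B(xx + yy) * yy * ww)
print(I_x, I_y, np.isclose(I_x, I_y))
```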