Conditional expectation of random variable given a sum

Let $(X_i)_{i\geq1}$ be i.i.d. random variables in $\mathcal{L}^1(\Omega,\mathcal{F},\mathbb{P})$. Is it true that

$$E\Bigl(X_j \,\Big|\, \sum_{i=1}^n X_i\Bigr)=\frac{1}{n}\sum_{i=1}^n X_i$$

for each $j$ with $1\leq j \leq n$?

I think it is true: given only the information in the sum, the best forecast for $X_j$ is the mean value $\frac{1}{n}\sum_{i=1}^n X_i$.

I wonder whether this can be proven formally using the defining relation of conditional expectation, i.e. $E(E(X\mid\mathcal{G})I_A)=E(XI_A)$ for every $A\in\mathcal{G}$, where $\mathcal{G}$ is a sub-$\sigma$-algebra of $\mathcal{F}$.
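
Before any formal proof, a quick Monte Carlo sanity check (not a proof; the exponential distribution, the sample sizes, the bin count, and all variable names below are my own choices, assumed purely for illustration) suggests the identity holds:

```python
import numpy as np

rng = np.random.default_rng(0)
n, trials = 5, 200_000

# i.i.d. sample; any L^1 distribution works, exponential chosen arbitrarily
X = rng.exponential(scale=1.0, size=(trials, n))
S = X.sum(axis=1)

# Estimate E(X_1 | S_n) by averaging X_1 over narrow quantile bins of S_n
edges = np.quantile(S, np.linspace(0, 1, 51))
which = np.digitize(S, edges[1:-1])  # bin index 0..49 for each trial

for b in (0, 12, 25, 37, 49):
    mask = which == b
    print(f"bin {b:2d}:  mean X_1 = {X[mask, 0].mean():.3f}   "
          f"mean S/n = {S[mask].mean() / n:.3f}")
```

Within each bin the two columns agree up to sampling noise, as the claimed identity predicts.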


Solution 1:

Let $S_n = \sum\limits_{i=1}^n X_i$.

Since the random variables $X_1,\dots,X_n$ are independent and identically distributed, the pairs $(X_j,S_n)$ all have the same joint distribution, and hence the conditional expectations $\mathsf E(X_j\mid S_n)$ all coincide almost surely. It is a matter of symmetry. $$\begin{align}\mathsf E(X_j\mid S_n) &= \tfrac 1n\sum_{i=1}^n\mathsf E(X_i\mid S_n) &&\text{symmetry, }\forall j\in\{1,\dots,n\}\\[1ex] & = \tfrac 1n\mathsf E\Bigl(\sum_{i=1}^n X_i\;\Big|\; S_n\Bigr) && \text{linearity of conditional expectation}\\[1ex] &= \tfrac 1n \mathsf E(S_n\mid S_n) && \text{by definition of } S_n\\[1ex] & = \tfrac 1n S_n &&S_n\text{ is }\sigma(S_n)\text{-measurable} \\[2ex]\therefore\quad\mathsf E\Bigl(X_j\;\Big|\;\sum_{i=1}^n X_i\Bigr) & = \tfrac 1n\sum_{i=1}^n X_i&&\text{when }{(X_j)}_{j\in\{1,\dots,n\}}\text{ are i.i.d.} \end{align}$$ That is all you need.
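
Since the question explicitly asks about the defining relation $E(E(X\mid\mathcal G)I_A)=E(XI_A)$, here is the same symmetry argument restated in that language (my reformulation, not part of the original answer). For $A=\{S_n\in B\}\in\sigma(S_n)$, exchangeability of i.i.d. variables gives $(X_j,S_n)\overset{d}{=}(X_1,S_n)$, so $\mathsf E(X_j I_A)$ does not depend on $j$, and therefore

$$\mathsf E(X_j I_A)=\frac1n\sum_{i=1}^n \mathsf E(X_i I_A)=\frac1n\,\mathsf E(S_n I_A)=\mathsf E\Bigl(\tfrac1n S_n\, I_A\Bigr).$$

Since $\tfrac1n S_n$ is $\sigma(S_n)$-measurable, this is exactly the defining relation, and $\mathsf E(X_j\mid S_n)=\tfrac1n S_n$ almost surely.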

Solution 2:

Suppose $X_1,X_2,\ldots,X_n$ have a joint probability density function $P(x_1,x_2,\ldots,x_n)$. Then the conditional expectation is

$$E_j(S)\equiv\mathrm{E}(X_j|\textstyle\sum_{i\!}X_i=S)=\frac{\displaystyle\int x_jP(x_1,x_2,\ldots,x_n)\delta(x_1+\cdots+x_n-S)d^nx}{\displaystyle\int P(x_1,x_2,\ldots,x_n)\delta(x_1+\cdots+x_n-S)d^nx}.$$

Since the $X_i$ are i.i.d., $P$ is invariant under cyclic permutations of its arguments, $P(x_1,x_2,\ldots,x_n)=P(x_2,x_3,\ldots,x_n,x_1)$. Relabeling the integration variables accordingly leads to

$$E_1(S)=E_2(S)=\cdots=E_n(S).$$

Also, since the delta function restricts the integrals to the hyperplane $x_1+\cdots+x_n=S$, we have

$$\sum_jE_j(S)=S.$$
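
Spelling that step out directly from the displayed formula: on the support of the delta function, $x_1+\cdots+x_n=S$, so

$$\sum_j E_j(S)=\frac{\displaystyle\int\Bigl(\sum_j x_j\Bigr)P(x_1,\ldots,x_n)\,\delta(x_1+\cdots+x_n-S)\,d^nx}{\displaystyle\int P(x_1,\ldots,x_n)\,\delta(x_1+\cdots+x_n-S)\,d^nx}=S\cdot\frac{\displaystyle\int P\,\delta\,d^nx}{\displaystyle\int P\,\delta\,d^nx}=S.$$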

Therefore $E_1(S)=E_2(S)=\cdots=E_n(S)=S/n.$
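
As a concrete sanity check of the delta-function formula (a minimal sketch; the choice $n=2$ with standard exponential marginals is my own assumption, purely for illustration), a short symbolic computation with sympy reproduces $E_1(S)=S/2$:

```python
import sympy as sp

x, s = sp.symbols('x s', positive=True)

# Joint density of two i.i.d. Exp(1) variables, evaluated on the slice
# x_1 = x, x_2 = s - x that the delta function picks out (0 < x < s).
joint = sp.exp(-x) * sp.exp(-(s - x))   # simplifies to exp(-s) on the slice

num = sp.integrate(x * joint, (x, 0, s))   # numerator of E_1(S)
den = sp.integrate(joint, (x, 0, s))       # normalising denominator

print(sp.simplify(num / den))   # prints s/2, i.e. S/n with n = 2
```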