Given $Y = X+N$, what is ${\rm E}(X|Y=y)$?

$Y = X+N$, where $X$ and $N$ are independent and normally distributed with zero mean.

I know that $Y$ is also normally distributed with zero mean and variance equal to the sum of the variances.

I don't know how to find the conditional density $f_{X|Y}(x|y)$ in order to calculate ${\rm E}(X|Y=y)$.


Solution 1:

You can find ${\rm E}(X|Y=y)$ immediately as follows. If $X \sim {\rm N}(0,\sigma_1^2)$ and $N \sim {\rm N}(0,\sigma_2^2)$ (and $X$ and $N$ are independent), then $$ {\rm E}(X|X + N) = \frac{{\sigma _1^2 }}{{\sigma _1^2 + \sigma _2^2 }}(X + N), $$ so, in particular, $$ {\rm E}(X|X + N = y) = \frac{{\sigma _1^2 }}{{\sigma _1^2 + \sigma _2^2 }}y. $$ The first formula follows because $(X, X+N)$ is jointly normal with zero means, so ${\rm E}(X|X+N)$ is linear in $X+N$ with coefficient $${\rm Cov}(X, X+N)/{\rm Var}(X+N) = \sigma_1^2/(\sigma_1^2+\sigma_2^2).$$
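As a quick sanity check (a sketch assuming NumPy; the parameter values $\sigma_1 = 2$, $\sigma_2 = 1$ are arbitrary choices), a Monte Carlo simulation recovers the coefficient: since $(X, X+N)$ is jointly Gaussian with zero means, the least-squares slope of $X$ on $X+N$ through the origin estimates ${\rm E}(X|X+N)$'s linear coefficient.

```python
import numpy as np

rng = np.random.default_rng(0)
s1, s2 = 2.0, 1.0        # sigma_1, sigma_2 (arbitrary choices for this check)
n = 1_000_000
x = rng.normal(0.0, s1, n)
y = x + rng.normal(0.0, s2, n)

# (X, X+N) is jointly Gaussian with zero means, so E(X | X+N) is linear
# in X+N; the least-squares slope through the origin recovers it.
slope = np.dot(x, y) / np.dot(y, y)
print(slope)                        # close to 0.8
print(s1**2 / (s1**2 + s2**2))      # exact coefficient: 0.8
```

With $\sigma_1^2 = 4$ and $\sigma_2^2 = 1$, both numbers should agree to about three decimal places.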

The point is that, in some respect, the normal distribution does not play a special role in this result. For example, if $X$ and $N$ are independent Poisson random variables with means $\lambda_1$ and $\lambda_2$, respectively, then $$ {\rm E}(X|X + N) = \frac{{\lambda _1 }}{{\lambda_1 + \lambda_2 }}(X+N) $$ (here, given $X+N=n$, $X$ is binomial with parameters $n$ and $\lambda_1/(\lambda_1+\lambda_2)$, whose mean is $n\lambda_1/(\lambda_1+\lambda_2)$).
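The Poisson case can be checked the same way (a sketch assuming NumPy; $\lambda_1 = 3$, $\lambda_2 = 2$, and the conditioning value $5$ are arbitrary choices): simulate, then average $X$ over the samples where the sum takes one particular value.

```python
import numpy as np

rng = np.random.default_rng(1)
l1, l2 = 3.0, 2.0        # lambda_1, lambda_2 (arbitrary choices)
n = 1_000_000
x = rng.poisson(l1, n)
s = x + rng.poisson(l2, n)

# Condition on one particular value of the sum, say X + N = 5:
emp = x[s == 5].mean()
print(emp)                      # close to 3.0
print(5 * l1 / (l1 + l2))       # exact: 5 * lambda_1/(lambda_1+lambda_2) = 3.0
```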

What is common to the normal distribution and the Poisson distribution in this context is that both are infinitely divisible. More details later on.

Solution 2:

For some later purposes, I give here the complete solution based on svenkatr's approach (thus also verifying my previous answer). In what follows, I use a different (somewhat more natural) notation, so don't get confused.

Suppose that $X$ and $Y$ are independent ${\rm N}(0,\sigma_1^2)$ and ${\rm N}(0,\sigma_2^2)$ variables, respectively. We can find ${\rm E}(X|X+Y=z)$ as follows. $$ {\rm E}(X|X + Y = z) = \int_{ - \infty }^\infty {xf_{X|X + Y} (x|z)\,{\rm d}x} = \int_{ - \infty }^\infty {x\frac{{f_X (x)f_{X + Y|X} (z|x)}}{{f_{X + Y} (z)}}\, {\rm d}x} $$ (the notation should be clear from the context). Noting that $f_{X+Y|X}(\cdot|x)$ is the ${\rm N}(x,\sigma_2^2)$ density function, some algebra shows that the right-hand side integral is equal to $$ \int_{ - \infty }^\infty {x\sqrt {\frac{{\sigma _1^2 + \sigma _2^2 }}{{\sigma _1^2 \sigma _2^2}}} \frac{1}{{\sqrt {2\pi } }}\exp \bigg\{ - \frac{{[x - z \sigma _1^2 /(\sigma _1^2 + \sigma _2^2 )]^2 }}{{2[\sigma _1^2 \sigma _2^2 /(\sigma _1^2 + \sigma _2^2 )]}}} \bigg \} \,{\rm d}x. $$ Now, from $\int_{ - \infty }^\infty {x\frac{1}{{\sqrt {2\pi \sigma ^2 } }}{\rm e}^{ - (x - \mu )^2 /(2\sigma ^2 )} \,{\rm d}x} = \mu $ (expectation of a ${\rm N}(\mu,\sigma^2)$ random variable), we can find that $$ {\rm E}(X|X + Y = z) = \frac{{\sigma _1^2 }}{{\sigma _1^2 + \sigma _2^2 }}z. $$

Solution 3:

\begin{equation} p(x|y) = \frac{p(y|x)p(x)}{p(y)} \end{equation}

You know $p(y)$ and $p(x)$: both are zero-mean normal densities, with variances $\sigma_X^2 + \sigma_N^2$ and $\sigma_X^2$, respectively.

\begin{equation} p(y|x) = \frac{1}{\sqrt{2 \pi \sigma_N^2}} e^{\frac{-(y-x)^2}{2\sigma_N^2}} \end{equation}

Using these expressions, you can easily get both $p(x|y)$ and $E(X|Y=y)$.
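This Bayes-rule recipe is easy to carry out numerically (a sketch assuming NumPy; $\sigma_X = 1$, $\sigma_N = 0.5$, $y = 2$ are arbitrary choices): form the unnormalised posterior $p(y|x)p(x)$ on a grid, normalise, and take its mean.

```python
import numpy as np

sx, sn, y = 1.0, 0.5, 2.0   # sigma_X, sigma_N and the observed y (arbitrary)

xs = np.linspace(-10.0, 10.0, 100_001)
dx = xs[1] - xs[0]

# Bayes: p(x|y) is proportional to p(y|x) p(x); constants cancel on normalising.
post = np.exp(-(y - xs)**2 / (2 * sn**2)) * np.exp(-xs**2 / (2 * sx**2))
post /= post.sum() * dx                  # normalise on the grid

mean = np.sum(xs * post) * dx
print(mean)                              # close to 1.6
print(y * sx**2 / (sx**2 + sn**2))       # exact: 2 * 1/1.25 = 1.6
```

The grid mean matches the coefficient $\sigma_X^2/(\sigma_X^2+\sigma_N^2)$ applied to $y$, in agreement with Solutions 1 and 2.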