Distribution of the difference of two normal random variables.

If $U$ and $V$ are independent identically distributed standard normal, what is the distribution of their difference?

I will present my answer here. I am hoping to know if I am right or wrong.

Using the method of moment generating functions, we have

\begin{align*} M_{U-V}(t)&=E\left[e^{t(U-V)}\right]\\ &=E\left[e^{tU}\right]E\left[e^{tV}\right]\\ &=M_U(t)M_V(t)\\ &=\left(M_U(t)\right)^2\\ &=\left(e^{\mu t+\frac{1}{2}t^2\sigma ^2}\right)^2\\ &=e^{2\mu t+t^2\sigma ^2}\\ \end{align*} The last expression is the moment generating function of a normally distributed random variable with mean $2\mu$ and variance $2\sigma ^2$. Thus $U-V\sim N(2\mu,2\sigma ^2)$.

The third line from the bottom follows from the fact that $U$ and $V$ have identical moment generating functions.

Thanks for your input.

EDIT: Oh, I already see that I made a mistake, since the random variables are distributed STANDARD normal. I will change my answer to $U-V\sim N(0,2)$.
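As a quick sanity check of the corrected answer $U-V\sim N(0,2)$ (not part of the original post), here is a small Monte Carlo sketch: drawing i.i.d. standard normal samples and comparing the empirical mean and variance of their difference against $0$ and $2$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# U, V i.i.d. standard normal
u = rng.standard_normal(n)
v = rng.standard_normal(n)
d = u - v

print(d.mean())  # ≈ 0
print(d.var())   # ≈ 2
```

With a million samples the empirical moments agree with $N(0,2)$ to two decimal places or so.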


Solution 1:

The currently upvoted answer is wrong, and the author rejected attempts to edit it despite 6 reviewers' approval. So here it is: if one knows the rules about sums and linear transformations of normal distributions, then the distribution of $U-V$ is $$ U-V\ \sim\ U + aV\ \sim\ \mathcal{N}\big( \mu_U + a\mu_V,\ \sigma_U^2 + a^2\sigma_V^2 \big) = \mathcal{N}\big( \mu_U - \mu_V,\ \sigma_U^2 + \sigma_V^2 \big) $$ where $a=-1$, and $(\mu,\sigma)$ denote the mean and standard deviation of each variable.
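The stated rule can be checked symbolically (a sketch of my own, not part of the original answer) with `sympy.stats`, which treats distinct random symbols as independent:

```python
import sympy as sp
from sympy.stats import Normal, E, variance

mu_U, mu_V = sp.symbols('mu_U mu_V', real=True)
s_U, s_V = sp.symbols('sigma_U sigma_V', positive=True)

U = Normal('U', mu_U, s_U)
V = Normal('V', mu_V, s_V)

# Mean and variance of U - V (U and V are independent by construction here)
mean = sp.simplify(E(U - V))
var = sp.simplify(variance(U - V))

print(mean)  # mu_U - mu_V
print(var)   # sigma_U**2 + sigma_V**2
```

This reproduces $\mathcal{N}(\mu_U - \mu_V,\ \sigma_U^2 + \sigma_V^2)$ in fully symbolic form.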

Solution 2:

In addition to the solution by the OP using the moment generating function, I'll provide a (nearly trivial) solution when the rules about the sum and linear transformations of normal distributions are known.

The distribution of $U-V$ is identical to that of $U+a \cdot V$ with $a=-1$. So from the cited rules we know that $$U+a\cdot V \sim N\big(\mu_U + a\cdot \mu_V,~\sigma_U^2 + a^2 \cdot \sigma_V^2\big) = N\big(\mu_U - \mu_V,~\sigma_U^2 + \sigma_V^2\big) \quad \text{(for $a = -1$)},$$ which for standard normal distributed variables is $N(0,~2)$.
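To illustrate the general (non-standard-normal) case numerically, here is a hedged sketch with arbitrarily chosen parameters (the values $\mu_U=1$, $\sigma_U=2$, $\mu_V=3$, $\sigma_V=0.5$ are hypothetical, picked only for illustration):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Hypothetical parameters, chosen only for illustration
mu_u, sigma_u = 1.0, 2.0
mu_v, sigma_v = 3.0, 0.5

u = rng.normal(mu_u, sigma_u, n)
v = rng.normal(mu_v, sigma_v, n)
d = u - v

print(d.mean())  # ≈ mu_u - mu_v = -2
print(d.var())   # ≈ sigma_u**2 + sigma_v**2 = 4.25
```

The empirical moments match $N(\mu_U - \mu_V,~\sigma_U^2 + \sigma_V^2)$ as the rule predicts.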


Edit 2017-11-20: After I rejected, several times, the corrections proposed by @Sheljohn of the variance and one typo, he wrote them in a comment, so I finally saw them. Thank you @Sheljohn!

Solution 3:

With the convolution formula: \begin{align} f_{Z}(z) &= \frac{dF_Z(z)}{dz} = \frac{d}{dz}P(Z<z) = \frac{d}{dz}P(X<Y+z) = \frac{d}{dz}\int_{-\infty}^{\infty}\Phi_{X}(y+z)\,\varphi_Y(y)\,dy \\ &= \frac{1}{2 \pi}\int_{-\infty}^{\infty}e^{-\frac{(z+y)^2}{2}}e^{-\frac{y^2}{2}}\,dy = \frac{1}{2 \pi}\int_{-\infty}^{\infty}e^{-\left(y+\frac{z}{2}\right)^2}e^{-\frac{z^2}{4}}\,dy = \frac{1}{\sqrt{2\pi\cdot 2}}e^{-\frac{z^2}{2 \cdot 2}}, \end{align} which is the density of $Z \sim N(0,2)$. The interchange of derivative and integral is possible because the integration bounds do not depend on $z$; after differentiating, I completed the square and used the Gaussian integral $\int_{-\infty}^{\infty}e^{-u^2}\,du = \sqrt{\pi}$. The integration bounds are the same as the support of each random variable, namely $(-\infty,\infty)$.
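The convolution integral above can be verified numerically (a sketch of my own, not part of the original answer): evaluate $\int \varphi(z+y)\,\varphi(y)\,dy$ on a grid and compare it with the $N(0,2)$ density.

```python
import numpy as np
from scipy.stats import norm

# Fine grid for the y-integration; Gaussians are negligible beyond |y| = 10
y = np.linspace(-10, 10, 4001)
dy = y[1] - y[0]
z = np.linspace(-4, 4, 9)

# f_Z(z) = ∫ φ(z + y) φ(y) dy, approximated by a Riemann sum
f_num = np.array([(norm.pdf(zi + y) * norm.pdf(y)).sum() * dy for zi in z])

# Reference: density of N(0, 2), i.e. scale = sqrt(2)
f_ref = norm.pdf(z, loc=0.0, scale=np.sqrt(2))

print(np.max(np.abs(f_num - f_ref)))  # close to 0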