I'm having a hard time with this topic in probability theory, and I was wondering if someone has any tricks, tips, or anything else useful to help me understand it. My notes say that $X\sim$Uniform$[0,1]$ and $Y\sim$Uniform$[0,1]$. My professor wrote:

$f_{X+Y}(Z) = \int_0^Z f_X(x) f_Y(Z-x)\;dx \neq Z$

Then somehow he came up with cases based on where $Z$ falls, and used those to build the pdf. I'm just not sure how he did this, or how he also came to find $P(X_1+X_2\le 2)$. Can someone help me fill in the blanks? I went to him directly, but he just said the same thing as in lecture and wasn't helpful.


Solution 1:

Let $Z = X+Y$. Since $X$ and $Y$ take values in $[0,1]$, we have $0 \leq Z \leq 2$, and so the cumulative distribution function $F_Z(z) = P\{Z \leq z\}$ satisfies $F_Z(z) = 0$ for $z < 0$ and $F_Z(z) = 1$ for $z \geq 2$. More generally, for any fixed value of $z$,
$$F_Z(z) = P\{Z \leq z\} = P\{X+Y \leq z\} = \int_{-\infty}^{\infty}\left[ \int_{-\infty}^{z-x} f_{X,Y}(x,y)\,\mathrm dy\right]\,\mathrm dx,$$
and so, using the rule for differentiating under the integral sign,
$$\begin{align*} f_Z(z) &= \frac{\partial}{\partial z}F_Z(z)\\ &= \frac{\partial}{\partial z}\int_{-\infty}^{\infty}\left[ \int_{-\infty}^{z-x} f_{X,Y}(x,y)\,\mathrm dy\right] \,\mathrm dx\\ &= \int_{-\infty}^{\infty}\frac{\partial}{\partial z}\left[ \int_{-\infty}^{z-x} f_{X,Y}(x,y)\,\mathrm dy\right]\,\mathrm dx\\ &= \int_{-\infty}^{\infty} f_{X,Y}(x,z-x)\,\mathrm dx. \end{align*}$$

When $X$ and $Y$ are independent random variables, the joint density is the product of the marginal densities and we get the convolution formula
$$f_{X+Y}(z) = \int_{-\infty}^{\infty} f_{X}(x)f_Y(z-x)\,\mathrm dx ~~ \text{for independent random variables} ~X~\text{and}~Y.$$

When $X$ and $Y$ take values in $[0,1]$, we have $f_X(x) = 0$ for $x<0$ and $x>1$, and so
$$f_{X+Y}(z) = \int_{0}^{1} f_{X}(x)f_Y(z-x)\,\mathrm dx.$$
Furthermore, for fixed $z$ with $0 \leq z \leq 1$, as $x$ sweeps from $0$ to $1$, $f_Y(z-x)$ becomes $0$ as soon as $x$ exceeds $z$, and so
$$f_{X+Y}(z) = \int_{0}^{z} f_{X}(x)f_Y(z-x)\,\mathrm dx, ~~ 0 \leq z \leq 1.$$
Similarly, if $z \in [1,2]$, then $f_Y(z-x) = 0$ as long as $x < z-1$, and so
$$f_{X+Y}(z) = \int_{z-1}^{1} f_{X}(x)f_Y(z-x)\,\mathrm dx, ~~ 1 \leq z \leq 2.$$
Finally, when $X$ and $Y$ are uniformly distributed on $[0,1]$, the integrands above equal $1$ and we get
$$f_{X+Y}(z) = \begin{cases} z, & 0\leq z \leq 1,\\ 2-z, & 1 \leq z \leq 2,\\ 0, &\text{otherwise.}\end{cases}$$
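If it helps to see the result rather than just derive it, here is a small numerical sanity check in Python (NumPy). It is not part of the derivation; the sample size, bin count, and function name `triangular_pdf` are arbitrary choices of mine. The sketch just compares a histogram of simulated values of $X+Y$ with the triangular density obtained above.

```python
import numpy as np

rng = np.random.default_rng(0)
n_samples = 1_000_000  # illustrative sample size, not from the answer

# Simulate X, Y ~ Uniform[0,1] independently and form Z = X + Y.
x = rng.uniform(0.0, 1.0, n_samples)
y = rng.uniform(0.0, 1.0, n_samples)
z = x + y

def triangular_pdf(t):
    """Density derived above: t on [0,1], 2 - t on [1,2], 0 elsewhere."""
    return np.where(t <= 1.0, t, 2.0 - t) * ((t >= 0.0) & (t <= 2.0))

# Empirical density (normalized histogram) vs. the closed-form density.
hist, edges = np.histogram(z, bins=50, range=(0.0, 2.0), density=True)
centers = 0.5 * (edges[:-1] + edges[1:])
print(np.max(np.abs(hist - triangular_pdf(centers))))  # small, roughly O(1e-2)
```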

Solution 2:

As for the mix of upper- and lower-case letters: I suspect what you saw was $$f_{X+Y}(z)=\int_0^z f_X(x)f_Y(z-x)\,dx,$$ the convolution formula for two non-negative random variables.

First, with $Y$ Uniform on $[0,2]$ (as in the original version of the question):

Taking account of where $f_X$ and $f_Y$ are non-zero (here $f_X=1$ on $[0,1]$ and $f_Y=\tfrac12$ on $[0,2]$), I would write $$f_{X+Y}(z)=\int_{\max(0,z-2)}^{\min(1,z)} f_X(x)f_Y(z-x)\,dx = \int_{\max(0,z-2)}^{\min(1,z)} \frac{1}{2}\, dx, $$ which leads to

$$f_{X+Y}(z) = \begin{cases} z/2 &\mbox{if } 0 \le z \le 1 \\ 1/2 &\mbox{if } 1 \le z \le 2 \\ (3-z)/2 &\mbox{if } 2 \le z \le 3 \\ 0 &\mbox{otherwise.} \end{cases} $$

You then get $$P(X_1+X_2\le 2) = \int_0^2 f_{X+Y}(z)dz = \int_0^1 f_{X+Y}(z)dz + \int_1^2 f_{X+Y}(z)dz = \frac14+\frac12 =\frac34.$$
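If you want to double-check the $3/4$ numerically, a quick Monte Carlo sketch along the following lines should land close to $0.75$. The sample size and seed are arbitrary, and `x1`, `x2` simply stand in for the two summands.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000  # illustrative sample size

x1 = rng.uniform(0.0, 1.0, n)  # first summand ~ Uniform[0,1]
x2 = rng.uniform(0.0, 2.0, n)  # second summand ~ Uniform[0,2]

# Empirical estimate of P(X1 + X2 <= 2); should be close to 3/4.
print(np.mean(x1 + x2 <= 2.0))
```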


With the question changed so that $Y$ is Uniform on $[0,1]$, the above becomes:

Taking account of where $f_X$ and $f_Y$ are non-zero (now $f_X = f_Y = 1$ on $[0,1]$), I would write $$f_{X+Y}(z)=\int_{\max(0,z-1)}^{\min(1,z)} f_X(x)f_Y(z-x)\,dx = \int_{\max(0,z-1)}^{\min(1,z)} 1 \, dx, $$ which leads to

$$f_{X+Y}(z) = \begin{cases} z &\mbox{if } 0 \le z \le 1 \\ 2-z &\mbox{if } 1 \le z \le 2 \\ 0 &\mbox{otherwise.} \end{cases} $$

You then get $$P(X_1+X_2\le 2) = \int_0^2 f_{X+Y}(z)\,dz = \int_0^1 f_{X+Y}(z)\,dz + \int_1^2 f_{X+Y}(z)\,dz = \frac12+\frac12 =1,$$ which is in fact obvious, since $X_1+X_2$ can never exceed $2$.
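For completeness, here is a similar numerical check (the grid size is an arbitrary choice of mine) that the derived density has total mass $1$ on $[0,2]$, which is exactly the statement $P(X_1+X_2\le 2)=1$.

```python
import numpy as np

# Piecewise density of X1 + X2 for X1, X2 ~ Uniform[0,1], as derived above.
def f_sum(t):
    return np.where(t <= 1.0, t, 2.0 - t) * ((t >= 0.0) & (t <= 2.0))

# Midpoint-rule integration over [0, 2]; the total mass should be 1.
n = 200_000
edges = np.linspace(0.0, 2.0, n + 1)
mids = 0.5 * (edges[:-1] + edges[1:])
dz = edges[1] - edges[0]
print(np.sum(f_sum(mids)) * dz)  # ~1.0
```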