Sum of discrete and continuous random variables with uniform distribution

Solution 1:

You can extend the convolution method for summing independent continuous random variables if you identify the "density" of a discrete variable as a sum of Dirac deltas. Here the density of $X$ is $1_{[0,1]}$, while the "density" of $Y$ is $\frac{1}{3} \sum_{i=-1}^1 \delta_i$. So you need to compute the convolution:

$$g(x) = \frac{1}{3} \sum_{i=-1}^1 \int_{-\infty}^\infty \delta_i(y) \, 1_{[0,1]}(x-y) \, dy = \frac{1}{3} \sum_{i=-1}^1 \int_{x-1}^x \delta_i(y) \, dy.$$

By the defining property of the Dirac delta, the $i$th summand is $1$ if $i \in (x-1,x)$ and $0$ otherwise. For every non-integer $x \in (-1,2)$ exactly one of $i = -1, 0, 1$ lies in $(x-1,x)$, and for $x$ outside $[-1,2]$ none does, so the density is $\frac{1}{3} 1_{(-1,2)}$; that is, $Z$ is uniform on $(-1,2)$.
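As a quick sanity check (not part of the derivation above), a short Monte Carlo simulation can confirm that $Z = X + Y$ looks uniform on $(-1,2)$, with each unit interval carrying probability $\frac{1}{3}$; the variable names below are just illustrative:

```python
import random

random.seed(0)

N = 200_000
counts = {-1: 0, 0: 0, 1: 0}  # bucket Z by the unit interval [i, i+1) it lands in

for _ in range(N):
    x = random.random()            # X ~ Uniform[0, 1)
    y = random.choice([-1, 0, 1])  # Y uniform on {-1, 0, 1}
    z = x + y                      # Z always lands in [-1, 2)
    counts[int(z // 1)] += 1       # floor of z picks the bucket -1, 0 or 1

for i in (-1, 0, 1):
    print(i, counts[i] / N)        # each frequency should be close to 1/3
```

With this sample size the three empirical frequencies agree with $\frac{1}{3}$ to within a few tenths of a percent.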

In this case the result is pretty much the same as in the other answer; the advantage of this method shows in more complicated examples.

Solution 2:

Notice that the events $Z \in [-1,0)$, $Z \in [0,1)$ and $Z \in [1,2)$ are pairwise disjoint. But according to your solution: $$P(Z \in [-1,0)) + P(Z \in [0,1)) + P(Z \in [1,2)) > 1,$$ which clearly cannot be correct.

So let's compute the desired probabilities once again, using the fact that $X, Y$ are independent: \begin{align} P(Z \in [-1,0)) &= P(X \in [0,1))P(Y=-1) = 1\times\frac{1}{3}=\frac{1}{3}, \\ P(Z \in [0,1)) &= P(X \in [0,1))P(Y=0) + P(X = 1)P(Y=-1) = 1\times\frac{1}{3} + 0\times\frac{1}{3} = \frac{1}{3}, \\ P(Z \in [1,2)) &= P(X \in [0,1))P(Y=1) + P(X = 1)P(Y=0) = 1\times\frac{1}{3} + 0\times\frac{1}{3} = \frac{1}{3}. \end{align} Of course $P(Z < - 1)= 0$, and we have already accounted for $$P(Z \geq 1) = P(Z \in [1,2)) + P(Z \geq 2) = \frac{1}{3} + 0.$$
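The same three probabilities can be verified exactly with rational arithmetic. The helper `p_interval` below is an ad hoc sketch for this problem: it computes $P(Z \in [a,b))$ by conditioning on $Y = y$, which turns the event into $X \in [a-y, b-y)$, whose probability is the length of that interval intersected with $[0,1)$:

```python
from fractions import Fraction

def p_interval(a, b):
    """P(Z in [a, b)) for Z = X + Y, X ~ Uniform[0, 1), Y uniform on {-1, 0, 1}.

    Conditioning on Y = y, the event becomes X in [a - y, b - y); its
    probability is the length of that interval intersected with [0, 1).
    """
    total = Fraction(0)
    for y in (-1, 0, 1):
        lo = max(Fraction(a) - y, Fraction(0))
        hi = min(Fraction(b) - y, Fraction(1))
        if hi > lo:
            total += Fraction(1, 3) * (hi - lo)
    return total

print(p_interval(-1, 0))  # 1/3
print(p_interval(0, 1))   # 1/3
print(p_interval(1, 2))   # 1/3
```

As expected, each unit interval gets probability $\frac{1}{3}$, and `p_interval(-1, 2)` returns $1$, so the three events exhaust all the probability.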