How to explain why the probability that a continuous random variable equals a specific value is 0?

Solution 1:

A continuous random variable can take uncountably many real values within its support -- just as a line segment contains uncountably many points.

So we have uncountably many values whose probabilities must sum to one. Each individual probability must therefore be infinitesimally small -- the next best thing to being exactly zero. We say that $X$ almost surely does not equal any particular value $x$:

$$\Pr(X=x) = 0 \text{ a.s.}$$

(To have a sensible measure of the magnitude of these infinitesimal quantities, we use the concept of probability density, which yields a probability mass when integrated over an interval. This is, of course, analogous to the concepts of mass and density of materials.)

$$f_X(x) = \frac{\mathrm d}{\mathrm d x}\Pr(X\leq x)$$
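To make the mass/density analogy concrete, here is a small numerical sketch (my own illustration, not part of the original answer), using `scipy.stats.norm` for a standard normal $X$: the density at a point is strictly positive, yet the probability of hitting that exact point is zero, and the mass of a shrinking interval around the point behaves like density times width.

```python
# Sketch: density vs. probability mass for a standard normal X (assumed example).
from scipy.stats import norm

x = 1.0
print(norm.pdf(x))                # density f_X(1.0) > 0 (about 0.242)
print(norm.cdf(x) - norm.cdf(x))  # Pr(X = 1.0) = F(1.0) - F(1.0) = 0.0

# Shrinking intervals around x: the probability mass vanishes, and
# mass / width approaches the density f_X(x), as the analogy suggests.
for eps in (1e-1, 1e-3, 1e-6):
    mass = norm.cdf(x + eps) - norm.cdf(x - eps)
    print(eps, mass, mass / (2 * eps))
```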


"For the non-uniform case, I can make some of the probabilities zero and others non-zero and still, in theory, get them to sum to 1 over all the possible values."

You are describing a random variable whose probability distribution is a mixture of discrete atoms (points carrying positive probability mass) and continuously distributed intervals. Such a distribution has step discontinuities in its cumulative distribution function at the atoms, for example:

$$\Pr(X\leq x) = \begin{cases} 0 & x < 0 \\ 0.25+x/4 & 0\leq x< 1/2 \\ 0.5+x/4 & 1/2\leq x< 1 \\ 1 & x\geq 1\end{cases}$$

$$\Pr(X=x) = \begin{cases} 0 & x<0 \text{ or } x> 1 \\ 0.25 & x=0 \text{ or } x=1/2 \text{ or } x=1 \\ 0\text{ a.s.} & 0<x<1/2 \text{ or } 1/2<x< 1 \end{cases}$$
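As a sanity check (my own addition, not part of the original answer), the following sketch implements this mixed CDF and recovers $\Pr(X = x)$ as the size of the jump $F(x) - F(x^-)$, which is $0.25$ at the atoms and $0$ everywhere else.

```python
# Sketch: the mixed CDF above, with point masses recovered as CDF jumps.
def F(x):
    if x < 0:
        return 0.0
    if x < 0.5:
        return 0.25 + x / 4   # atom of 0.25 at 0, then density 1/4
    if x < 1:
        return 0.5 + x / 4    # atom of 0.25 at 1/2, then density 1/4
    return 1.0                # atom of 0.25 at 1

def point_mass(x, eps=1e-12):
    """Approximate Pr(X = x) as the jump of F at x."""
    return F(x) - F(x - eps)

for x in (0.0, 0.3, 0.5, 0.7, 1.0):
    print(x, round(point_mass(x), 6))   # 0.25 at 0, 1/2, 1; (essentially) 0 elsewhere
```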

Solution 2:

The situation is easier to formalize if we first show that there is no uniform distribution on a countably infinite set $S$. Such a distribution would assign the same probability $c$ to every $x \in S$. If $c = 0$, countable additivity gives $P(X \in S) = 0$ rather than $1$; and if $c > 0$, then

$$P(X \in S)=\sum_{x \in S} c = +\infty.$$

This follows from the property of countable additivity of probability, which is usually treated as an axiom. I think that at least finite additivity of probability should be intuitively obvious, and countable additivity is a straightforward extension.
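To see how quickly this breaks down (an illustration of my own, not part of the answer): if every point of a countably infinite set carried the same mass $c > 0$, then a finite subset of only $\lceil 1/c \rceil$ points would already have total probability at least $1$, and the remaining (infinitely many) points push the total past $1$, so finite additivity alone already rules it out.

```python
# Sketch: a constant point mass c > 0 accumulates total probability >= 1
# after finitely many points, whatever the value of c.
import math

for c in (0.1, 0.003, 1e-9):
    n = math.ceil(1 / c)  # smallest n with n * c >= 1
    print(f"c = {c}: {n} points already carry total mass {n * c:.6g}")
```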

In the uncountable case we need one more result, which can be proven from the countable additivity axiom: if $A \subset B$ then $P(A) \leq P(B)$. Now suppose $S$ is uncountable and we try to give each of its points the same probability $c > 0$. We can extract a countably infinite subset $C \subset S$, and then

$$P(X \in S) \geq P(X \in C) = \sum_{x \in C} c = +\infty$$

as before.

Solution 3:

Use the fact that if $A \subset B$, then $p(A) \le p(B)$.

If $X$ is a continuous random variable, its cdf is continuous; that is, $F(\alpha) = p(\{\omega \mid X(\omega) \le \alpha \})$ is a continuous function of $\alpha$.

It follows that $p((a,b]) = F(b)-F(a)$.

Now suppose $a < x \le b$. Then $\{x\} \subset (a,b]$.

Hence $p(\{x\}) \le p((a,b]) = F(b)-F(a)$ for all such $a,b$. Letting $a \uparrow x$ and $b \downarrow x$, continuity of $F$ gives $F(b)-F(a) \to F(x)-F(x) = 0$, so $p(\{x\})=0$.
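Here is a small numerical sketch of the same squeeze (my own addition), using the cdf of a Uniform(0, 1) variable as a concrete continuous $F$: the bound $F(b) - F(a)$ on $p(\{x\})$ shrinks to $0$ as the interval $(a, b]$ closes in on $x$.

```python
# Sketch: the squeeze p({x}) <= F(b) - F(a) for a concrete continuous cdf.
def F(t):
    """CDF of a Uniform(0, 1) random variable."""
    return min(max(t, 0.0), 1.0)

x = 0.3
for eps in (1e-1, 1e-3, 1e-6, 1e-9):
    a, b = x - eps, x + eps        # a < x <= b, so {x} lies inside (a, b]
    print(eps, F(b) - F(a))        # upper bound on p({x}) -> 0
```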