Independence of a random variable $X$ from itself

In our lecture on probability, my professor remarked that "a random variable $X$ is not independent of itself." (Here he was speaking specifically about discrete random variables.) I asked him why that was true. My intuition suggests two counterexamples: $X \equiv 0$, and $X$ with probability mass function $$m_X(x) = \begin{cases}1, &\text{ if } x = x_0\\ 0, &\text{ if }x \neq x_0.\end{cases}$$

In these cases, it seems that $\mathbb{P}(X \leq x_1, X \leq x_2) = \mathbb{P}(X \leq x_1) \cdot \mathbb{P}(X \leq x_2)$ for all $x_1, x_2$.
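To spell out why that factorization holds in the constant case (a quick check I am adding, using only the pmf above): the CDF is
$$F_X(x) = \mathbb{P}(X \leq x) = \begin{cases}1, &\text{ if } x \geq x_0\\ 0, &\text{ if } x < x_0,\end{cases}$$
so every factor $\mathbb{P}(X \leq x_i)$ is either $0$ or $1$. Since $F_X$ is nondecreasing,
$$\mathbb{P}(X \leq x_1, X \leq x_2) = F_X(\min\{x_1, x_2\}) = \min\{F_X(x_1), F_X(x_2)\} = F_X(x_1) \cdot F_X(x_2),$$
where the last equality uses $\min\{a, b\} = ab$ for $a, b \in \{0, 1\}$.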

My professor's response was, "The independence from or dependence of $X$ on itself depends on the definition of the joint distribution function $m_{X,X}$, which is essentially arbitrary."

Can someone help me to understand this?


Solution 1:

The only events that are independent of themselves are those with probability either $0$ or $1$. That follows from the fact that a number is its own square if and only if it's either $0$ or $1$: taking both events in the definition of independence to be the same event $E$ gives $\Pr(E) = \Pr(E \cap E) = \Pr(E)^2$. The only way a random variable $X$ can be independent of itself is if for every measurable set $A$, either $\Pr(X\in A)=1$ or $\Pr(X\in A)=0$. That happens if and only if $X$ is essentially constant, meaning there is some value $x_0$ such that $\Pr(X=x_0)=1$.
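
Here is a minimal numerical illustration of that claim (a sketch I am adding, not part of the original answer; the example pmfs are made up for the demo): for a discrete $X$ with finite support we can enumerate every subset $A$ of the support and test whether $\Pr(X \in A) = \Pr(X \in A)^2$, which is exactly the self-independence condition since $\{X \in A\} \cap \{X \in A\} = \{X \in A\}$.

```python
from itertools import chain, combinations

def powerset(support):
    """All subsets of a finite support."""
    s = list(support)
    return chain.from_iterable(combinations(s, r) for r in range(len(s) + 1))

def self_independent(pmf):
    """Return True iff P(X in A) equals P(X in A)^2 for every subset A
    of the support, i.e. every event {X in A} has probability 0 or 1."""
    for A in powerset(pmf):
        p = sum(pmf[x] for x in A)      # P(X in A)
        if abs(p - p * p) > 1e-12:      # p must equal its own square
            return False
    return True

print(self_independent({0: 0.5, 1: 0.5}))  # fair coin: False
print(self_independent({7: 1.0}))          # essentially constant: True
```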

Solution 2:

Suppose $A$ is an event in a sample space $S$ (in the setting of the question, think of $A = \{X \in B\}$ for some set $B$), and suppose that $A$ is independent of itself. This means that $P(A \mid A) = P(A)$. But by the multiplication rule,

$$P(A \cap A) = P(A)\,P(A \mid A) = P(A)\,P(A) = P(A)^2,$$

while $P(A \cap A) = P(A)$, since $A \cap A = A$. Therefore $P(A) = P(A \cap A) = P(A)^2$.

This can occur only when $P(A) = 1$ or $P(A) = 0$. In conclusion, an event $A$ in a sample space $S$ can be independent of itself only if $P(A) = 1$ or $P(A) = 0$.
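
For completeness, the algebra behind that last step (added here for clarity):

$$P(A) = P(A)^2 \iff P(A)\bigl(1 - P(A)\bigr) = 0 \iff P(A) \in \{0, 1\}.$$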