Let $X$ and $Y$ be independent, identically distributed exponential random variables with rate $1$, and let $U=\min\{X,Y\}$ and $V=\max\{X,Y\}$. Show that $V-U$ is independent of $U$.

Solution 1:

Since $X$ and $Y$ are independent and identically exponentially distributed (with rate $1$), the Law of Total Probability gives the following:

$$\def\P{\operatorname{\mathsf P}}\begin{align}\P(U<u, V-U<w) ~=~&{ \P(X<Y,X<u,Y-X<w)+\P(Y\leqslant X,Y<u,X-Y<w)}\\[1ex] =~& \P(X<Y<w+X, X<u)+\P(Y\leqslant X<w+Y, Y< u) \\[1ex] =~& \int_0^u e^{-x}\P(x<Y<w+x\mid X=x)\operatorname d x+\int_0^u e^{-y}\P(y\leqslant X<w+y\mid Y=y)\operatorname d y\\[3ex] \overset{\text{iid}}=~& 2\int_0^u e^{-x}(e^{-x}-e^{-(w+x)})\operatorname d x\\[1ex] =~& 2(1-e^{-w})\int_0^u e^{-2x}\operatorname d x \\[1ex] =~& (1-e^{-w})(1-e^{-2u})\end{align}$$

Computing $\P(U<u)$ and $\P(V-U<w)$ in the same way shows that the joint CDF above is the product of the two marginal CDFs, which is exactly the claimed independence.
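For completeness, here is a sketch of those two marginal computations (the second is the same conditioning argument as above, with $u\to\infty$):

$$\P(U<u)=1-\P(X\geqslant u)\P(Y\geqslant u)=1-e^{-2u},$$

$$\P(V-U<w)=\P(X<Y<w+X)+\P(Y\leqslant X<w+Y)=2\int_0^\infty e^{-x}\left(e^{-x}-e^{-(w+x)}\right)\operatorname d x=1-e^{-w},$$

so indeed $\P(U<u,\,V-U<w)=\P(U<u)\,\P(V-U<w)$ for all $u,w>0$.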

Solution 2:

This follows from properties of the Poisson process. Start two independent Poisson processes, both with rate $1$, at time $0$. Then $X$ is the waiting time until the first event of process 1, and $Y$ is the waiting time until the first event of process 2. Viewed together, the superposition of the two processes is a single Poisson process with rate $2$.

Then $U=\min(X,Y)$ is the waiting time until the first event of the combined process, and $V-U$ is the interarrival time between the first and second events of the combined process. Distinct interarrival times of a Poisson process are independent, which gives the result without any calculations.
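If you want to convince yourself numerically, a quick Monte Carlo sketch along the following lines works (this is just an optional sanity check, assuming NumPy is available; the sample size and the particular checks are arbitrary choices). It shows a near-zero correlation and essentially the same distribution of $V-U$ whether $U$ is small or large:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 1_000_000

# Two independent Exp(1) samples
x = rng.exponential(1.0, n)
y = rng.exponential(1.0, n)

u = np.minimum(x, y)      # U = min(X, Y)
w = np.maximum(x, y) - u  # V - U

# Correlation should be close to 0 (necessary, though not sufficient, for independence)
print("corr(U, V-U):", np.corrcoef(u, w)[0, 1])

# Quantiles of V-U given U small vs. U large should both match the Exp(1) quantile -log(1-q)
small, large = w[u < 0.2], w[u > 1.0]
for q in (0.25, 0.5, 0.75):
    print(q, np.quantile(small, q), np.quantile(large, q), -np.log(1 - q))
```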

Solution 3:

This is related to a characterization of the Exponential distribution: for two absolutely continuous random variables $X$ and $Y$, $\min(X,Y)$ and $X-Y$ are independent iff $X$ and $Y$ are independent Exponential random variables with the same location parameter. Here is the relevant article (for a similar characterization of the Geometric distribution, see this and this).

The joint density of $(X,Y)$ is given by $f_{X,Y}(x,y)=e^{-(x+y)}\mathbf1_{x>0,y>0}$.

We transform $(X,Y)\to(X_1,X_2)$ where $X_1=\min(X,Y)$ and $X_2=X-Y$.
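Explicitly, on the region $\{x\geqslant y\}$ the inverse map is $(x,y)=(x_1+x_2,\,x_1)$, while on $\{x<y\}$ it is $(x,y)=(x_1,\,x_1-x_2)$, with Jacobian matrices

$$\frac{\partial(x,y)}{\partial(x_1,x_2)}=\begin{pmatrix}1&1\\1&0\end{pmatrix}\quad\text{and}\quad\begin{pmatrix}1&0\\1&-1\end{pmatrix}\text{ respectively.}$$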

For each of the cases $x<y$ and $x\geqslant y$, the absolute value of the Jacobian of the transformation turns out to be $1$. From this we obtain the joint density of $(X_1,X_2)$, namely

$$f_{X_1,X_2}(x_1,x_2)=\begin{cases}\frac{1}{2}e^{-x_2}\cdot2e^{-2x_1}&,\text{ if }x_2\geqslant0,x_1\geqslant0\\\frac{1}{2}e^{x_2}\cdot2e^{-2x_1}&,\text{ if }x_2<0,x_1\geqslant0\\0&,\text{ otherwise } \end{cases}$$

$$=2e^{-2x_1}\cdot\frac{1}{2}e^{-|x_2|}\mathbf1_{x_1\geqslant0\,,\, x_2\in\mathbb{R}}=f_{X_1}(x_1)f_{X_2}(x_2)$$

This shows that $X_1$ and $X_2$ are independent, with $X_1\sim\text{Exp}$ with mean $1/2$ and $X_2\sim\text{Laplace}(0,1)$.

Now we transform $(X_1,X_2)\to(U,V)$ where $U=X_1$ and $V=|X_2|$.

(Note that my notation here differs from that in the question.)

This is a $2$-to-$1$ mapping, since each $v>0$ has two preimages, $x_2=v$ and $x_2=-v$. In either case, the absolute value of the Jacobian equals $1$. Hence we obtain the joint pdf of $(U,V)$ as

$$f_{U,V}(u,v)=f_{X_1,X_2}(u,v)\cdot1+f_{X_1,X_2}(u,-v)\cdot1$$

$$=2e^{-2u}\mathbf1_{u\geqslant0}\cdot e^{-v}\mathbf1_{v\geqslant0}=f_U(u)f_V(v)$$

This proves the independence of $U=\min(X,Y)$ and $V=|X-Y|=\max(X,Y)-\min(X,Y)$.


Solution 4:

I think the easiest way to show this analytically is to use order statistics to find the joint density of $X_{(1)}=\min(X,Y)$ and $X_{(2)}=\max(X,Y)$, and then use the change of variables $(X_{(1)},X_{(2)})\to(U=X_{(1)},V=X_{(2)}-X_{(1)})$. We arrive at the result in no time.
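To sketch that computation: the joint density of the order statistics is $f_{X_{(1)},X_{(2)}}(x,y)=2e^{-(x+y)}\mathbf 1_{0<x<y}$, and the change of variables $u=x,\ v=y-x$ has a Jacobian of absolute value $1$, so

$$f_{U,V}(u,v)=2e^{-(u+(u+v))}\mathbf 1_{u>0,\,v>0}=2e^{-2u}\mathbf 1_{u>0}\cdot e^{-v}\mathbf 1_{v>0},$$

which factors into the same marginals as in Solution 3, proving the independence.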