Finding a random vector with a pre-specified distribution

Consider a bivariate probability distribution $G$ such that:

(a) $G$ has full support.

(b) The marginals of $G$ are identical.

(c) The marginals of $G$ are symmetric around 0.

Can we always find a random vector $(X_1,X_2,X_3)$ such that $$ (1) \quad \begin{pmatrix} X_1-X_3\\ X_1-X_2 \end{pmatrix}\sim \begin{pmatrix} X_2-X_3\\ X_2-X_1 \end{pmatrix}\sim \begin{pmatrix} X_3-X_1\\ X_3-X_2 \end{pmatrix}\sim G \quad ? $$ The symbol "$\sim$" means "distributed as".


Let me give you an example. Suppose $G = \mathcal{N}\Big(\begin{pmatrix} 0\\0 \end{pmatrix}, \begin{pmatrix} 2 & 1\\ 1 & 2 \end{pmatrix}\Big)$. Observe that $G$ satisfies (a)-(c). For such a $G$, we can find $(X_1,X_2,X_3)$ satisfying (1). For instance, we can take $(X_1,X_2,X_3)\sim \mathcal{N}\Big(\begin{pmatrix} 0\\0\\0 \end{pmatrix}, \begin{pmatrix} 1& 0 & 0\\ 0 & 1 & 0\\ 0 & 0 & 1\end{pmatrix}\Big)$, i.e. three i.i.d. standard Normals: each difference vector in (1) then has mean zero, component variances $\operatorname{Var}(X_i - X_j) = 2$, and cross-covariance $\operatorname{Var}(X_i) = 1$.
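As a quick numerical sanity check of this example (not needed for the argument), here is a short numpy sketch; the sample size and seed are arbitrary:

```python
import numpy as np

# With X1, X2, X3 i.i.d. N(0, 1), the vector (X1 - X3, X1 - X2) should
# have mean 0 and covariance [[2, 1], [1, 2]], i.e. be distributed as G.
rng = np.random.default_rng(0)
n = 500_000
X1, X2, X3 = rng.standard_normal((3, n))

V = np.stack([X1 - X3, X1 - X2])   # first term of (1); the others are symmetric
print(V.mean(axis=1))              # ~ [0, 0]
print(np.cov(V))                   # ~ [[2, 1], [1, 2]]
```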

However, I'm wondering whether there may be other $G$ satisfying (a)-(c) (not necessarily Normal, as in my example) for which no $(X_1,X_2,X_3)$ satisfying (1) exists. Can we rule this out?


Solution 1:

Here is a counter-example.

  • Let $G\sim \mathcal{N}\Big(\begin{pmatrix} 0\\0 \end{pmatrix}, \begin{pmatrix} 2 & 1\\ 1 & 2 \end{pmatrix}\Big)$.

  • Let $H$ be as follows: $Prob(H = \begin{pmatrix} 1\\1 \end{pmatrix}) = Prob(H = \begin{pmatrix} -1\\-1 \end{pmatrix}) = \frac12$.

  • Let $C$ be a coin toss, independent of $(G, H)$, with $Prob(C = Head) = 0.9$ and $Prob(C = Tail) = 0.1$.

  • Finally, let $W$ be the mixture of $G$ and $H$:

    • if $C = Head$, then $W = H$

    • if $C = Tail$, then $W = G$.

Clearly, $W$ satisfies (a)-(c). Also, the two components of $W$, denoted $W_1$ and $W_2$, are equal with probability exactly $0.9$: they coincide on the $H$-branch, while the Normal $G$ puts zero mass on the diagonal $\{w_1 = w_2\}$.
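A minimal simulation sketch of this construction (assuming numpy; sample size and seed are arbitrary):

```python
import numpy as np

# Draw W: with probability 0.9 (coin C) take H, whose coordinates are
# equal (+1 or -1 with prob 1/2 each); otherwise take G = N(0, [[2,1],[1,2]]).
rng = np.random.default_rng(0)
n = 500_000

heads = rng.random(n) < 0.9
s = rng.choice([-1.0, 1.0], size=n)
H = np.stack([s, s], axis=1)                                  # atoms (1,1) and (-1,-1)
G = rng.multivariate_normal([0, 0], [[2, 1], [1, 2]], size=n)
W = np.where(heads[:, None], H, G)

print(np.mean(W[:, 0] == W[:, 1]))   # ~ 0.9: G puts zero mass on the diagonal
```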

Now assume, toward a contradiction, that (1) can be satisfied for $W$. From the first term, we have

$$0.9 = Prob(W_1 = W_2) = Prob(X_1 - X_3 = X_1 - X_2) = Prob(X_2 = X_3)$$

Similarly, from the second and third terms, we have $Prob(X_3 = X_1) = Prob(X_1 = X_2) = 0.9$.

So $Prob(X_1 \neq X_2) = 0.1$, and likewise for the other two pairs. From the union bound:

$$Prob(X_1 \neq X_2 \,\cup\, X_2 \neq X_3 \,\cup\, X_3 \neq X_1) \le 0.3$$

Taking the complement, we have:

$$Prob(X_1 = X_2 = X_3) \ge 1 - 0.3 = 0.7$$

But then each of the three terms in (1) equals the zero vector with probability at least $0.7$, whereas $Prob\Big(W = \begin{pmatrix} 0\\0 \end{pmatrix}\Big) = 0$ ($H$ only takes the values $(1,1)$ and $(-1,-1)$, and $G$ is continuous). So none of the three terms can be $\sim W$, a contradiction.


As requested, a smoothed example:

  • Pick some $\sigma \ll 1$.

  • Let $A \sim \mathcal{N}\Big(\begin{pmatrix} 1\\1 \end{pmatrix}, \begin{pmatrix} \sigma^2 & 0\\ 0 & \sigma^2 \end{pmatrix}\Big)$.

  • Let $B\sim \mathcal{N}\Big(\begin{pmatrix} -1\\-1 \end{pmatrix}, \begin{pmatrix} \sigma^2 & 0\\ 0 & \sigma^2 \end{pmatrix}\Big)$.

  • Let $D$ be the 50-50 mixture of $A$ and $B$. Let $D_1, D_2$ be the two components of $D$.

Clearly $D_1, D_2$ are "close" with high probability: if each component is within $K\sigma$ of the conditional mean (either $1$ or $-1$, depending on whether $D = A$ or $D = B$), then their difference is within $2K\sigma$ by the triangle inequality. For convenience, define the events:

  • $close(P,Q)$ as $(|P - Q| < 2K\sigma)$, and $far(P,Q)$ as the complement

we have:

$$Prob(close(D_1,D_2)) \ge Prob\big(|D_1 - \mu| < K\sigma \,\cap\, |D_2 - \mu| < K\sigma\big) = 1 - \epsilon$$

$$Prob(far(D_1,D_2)) \le \epsilon$$

where $\mu \in \{1, -1\}$ is the conditional mean, for some small $\epsilon$ whose value can be computed easily from standard Normal tables.
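For a concrete value instead of Z-tables: conditional on the mixture label, the two components are independent (the covariances of $A$ and $B$ are diagonal), so $\epsilon$ has a closed form. A minimal sketch, assuming scipy is available:

```python
from scipy.stats import norm

# Conditional on the label, D1 and D2 are independent N(mu, sigma^2), so
#   Prob(|D1 - mu| < K*sigma and |D2 - mu| < K*sigma) = (1 - tail)**2,
# where tail = Prob(|N(0,1)| >= K) = 2 * Phi(-K).  Hence
#   epsilon = 1 - (1 - tail)**2 <= 2 * tail,
# independent of sigma; the union bound 2*tail is numerically stable for large K.
def epsilon_bound(K: float) -> float:
    tail = 2 * norm.sf(K)        # Prob(|N(0,1)| >= K)
    return 2 * tail

for K in (3, 5, 10):
    print(K, epsilon_bound(K))   # K = 10 gives ~ 3e-23
```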

Next, assume toward a contradiction that each of the three terms in (1) is $\sim D$. From the first term we have $Prob(close(D_1,D_2)) = Prob(close(X_1 - X_3, X_1 - X_2)) = Prob(close(X_2, X_3))$, and similarly for the second and third terms. From the union bound:

$$Prob(far(X_1, X_2) \cup far(X_2, X_3) \cup far(X_3, X_1)) \le 3\epsilon$$

Finally, taking the complement:

$$Prob(close(X_1, X_2) \cap close(X_2, X_3) \cap close(X_3, X_1)) \ge 1 - 3\epsilon$$

Recalling the definition of $close()$ we have:

$$Prob(|X_1 - X_2| < 2K\sigma) \ge 1 - 3\epsilon$$

and likewise for the other two pairwise differences. It remains to choose the parameters. Under $D$, each component lies within $K\sigma$ of $\pm 1$ with probability at least $1 - \epsilon$, so whenever $2K\sigma < 1 - K\sigma$ we get $Prob(|D_2| < 2K\sigma) \le \epsilon$. But if the first term were $\sim D$, its second component $X_1 - X_2$ would have to satisfy both this bound and the display above, which is impossible once $\epsilon < 1/4$. E.g. $\sigma = 0.001$, $K = 10$ gives $2K\sigma = 0.02 < 0.99 = 1 - K\sigma$ and a tiny $\epsilon$, so none of the three terms can be $\sim D$.
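A small numerical illustration of this incompatibility with the suggested parameters, again a sanity-check sketch rather than part of the proof (numpy assumed; sample size and seed arbitrary):

```python
import numpy as np

# sigma = 0.001, K = 10: close() means |difference| < 2*K*sigma = 0.02.
rng = np.random.default_rng(0)
sigma, K, n = 0.001, 10, 500_000

mu = rng.choice([-1.0, 1.0], size=n)                  # 50-50 mixture label
D = mu[:, None] + sigma * rng.standard_normal((n, 2))

print(np.mean(np.abs(D[:, 0] - D[:, 1]) < 2 * K * sigma))  # ~ 1: close(D1, D2)
print(np.mean(np.abs(D[:, 1]) < 2 * K * sigma))            # ~ 0: |D2| sits near 1
```

The first probability is what (1) would force on each pairwise difference (up to $3\epsilon$), while the second shows a component of $D$ is essentially never that small; that gap is exactly the contradiction.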