Justifying the existence of dependent random variables $X,Y$ with $P(X = x_i \cap Y = y_j) = \delta_{ij}p_i$

Recently, I had to check the following inequality: $$M = \left(\sum_ip_ix_i^2-\left(\sum_i p_ix_i\right)^2\right)\left(\sum_ip_iy_i^2-\left(\sum_i p_iy_i\right)^2\right) - \left( \sum_ip_ix_i\sum_ip_iy_i - \sum_ip_ix_iy_i\right)^2\geq 0$$ for $p_i\in[0,1]$ with $\sum_ip_i = 1$ and $x_i,y_i\in\mathbb{R}$, for given $n$ and $i=1,2,\dots,n$. This turns out to be just the Cauchy–Schwarz inequality after some rearranging of double indices:

$$M = \sum_{i<j}p_ip_j(x_i-x_j)^2\sum_{i<j}p_ip_j(y_i-y_j)^2 - \left(\sum_{i<j}p_ip_j(x_i-x_j)(y_j-y_i)\right)^2\geq 0.$$
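For the record, the passage to double indices rests on two standard identities, both using $\sum_ip_i = 1$: $$\sum_ip_ix_i^2-\left(\sum_i p_ix_i\right)^2 = \frac{1}{2}\sum_{i,j}p_ip_j(x_i-x_j)^2 = \sum_{i<j}p_ip_j(x_i-x_j)^2$$ and $$\sum_ip_ix_i\sum_ip_iy_i - \sum_ip_ix_iy_i = \sum_{i<j}p_ip_j(x_i-x_j)(y_j-y_i),$$ so $M\geq 0$ is Cauchy–Schwarz applied to the vectors, indexed by pairs $i<j$, with entries $\sqrt{p_ip_j}(x_i-x_j)$ and $\sqrt{p_ip_j}(y_j-y_i)$.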

But the given form of $M$ is practically begging for a probabilistic interpretation in which $X,Y$ are discrete random variables taking the values $x_i,y_i$ respectively with probability $p_i$; with this interpretation, $M$ looks like:

$$M = \left(E[X^2]-E[X]^2\right)\left(E[Y^2]-E[Y]^2\right) - \left(E[X]E[Y]-E[XY]\right)^2=\text{Var}(X)\text{Var}(Y) - \text{Cov}(X,Y)^2\geq 0.$$

Except that this is slightly flawed, because the term $E[XY]$ is actually: $$E[XY] = \sum_{i,j} x_iy_j\mathbb{P}(X = x_i \cap Y = y_j )\neq \sum_i p_ix_iy_i$$ in general. So naturally, I am trying to salvage this by saying $X,Y$ are random variables such that: $$\mathbb{P}(X = x_i \cap Y = y_j ) = \delta_{ij}p_i=\delta_{ij}p_j. \tag{1}$$
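Note that $(1)$ is exactly what makes the interpretation work: if it holds, then $$E[XY] = \sum_{i,j} x_iy_j\,\delta_{ij}p_i = \sum_i p_ix_iy_i,$$ and the marginals are $\mathbb{P}(X = x_i) = \sum_j \delta_{ij}p_i = p_i$ and $\mathbb{P}(Y = y_j) = p_j$, so every term of $M$ is matched by the corresponding moment.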

So these $X$ and $Y$ that I am trying to construct are dependent on each other in the sense that $X = x_i \iff Y = y_i$ (assuming the $x_i$ are distinct, and likewise the $y_i$).

But my probability foundations are a bit shaky, and I keep wondering whether this existence claim requires further justification. So my question is: how do I formally establish that there exist $X,Y$ such that $(1)$ holds? Or, if not, is there an easy counterexample?


If $\mu$ is a probability measure on some sample space $S$, then the identity function $\operatorname{Id}_S: S \rightarrow S$ can be seen as a random variable with distribution $\mu$. In other words, to check whether there exist random variables $X$ and $Y$ with joint distribution $$\mathbb{P}(X=x_i ,Y=y_j) = \delta_{ij} p_i,$$ it suffices to check that this is a valid probability distribution on the set of pairs $S = \{(x_i,y_j) : i,j\}$; $X$ and $Y$ are then the two coordinate projections on $S$. It clearly is a valid probability distribution, since $\delta_{ij}p_i \geq 0$ for all $i,j$ and $$\sum_{i,j} \delta_{ij}p_i = \sum_{i}p_i = 1.$$
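As a quick numerical sanity check (not a proof), here is a minimal sketch assuming NumPy; the variable names are only illustrative. Since under $(1)$ every moment of $(X,Y)$ reduces to a sum $\sum_i p_i(\cdot)$, the script computes $M$ both from the moments and from the double-index form and checks nonnegativity:

```python
import numpy as np

rng = np.random.default_rng(0)

n = 5
p = rng.random(n)
p /= p.sum()               # p_i in [0, 1] with sum p_i = 1
x = rng.normal(size=n)     # arbitrary reals x_i
y = rng.normal(size=n)     # arbitrary reals y_i

# M from the moments; under (1), E[XY] = sum_i p_i x_i y_i
var_x = p @ x**2 - (p @ x) ** 2
var_y = p @ y**2 - (p @ y) ** 2
cross = (p @ x) * (p @ y) - p @ (x * y)
M_moments = var_x * var_y - cross**2

# M from the double-index (Cauchy-Schwarz) form
i, j = np.triu_indices(n, k=1)   # all pairs with i < j
w = p[i] * p[j]
A = (w * (x[i] - x[j]) ** 2).sum()
B = (w * (y[i] - y[j]) ** 2).sum()
C = (w * (x[i] - x[j]) * (y[j] - y[i])).sum()
M_pairs = A * B - C**2

assert np.isclose(M_moments, M_pairs)  # the two forms agree
assert M_moments >= -1e-12             # and M >= 0 as claimed
print(M_moments)
```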