The role of the "hidden" probability space on which random variables are defined

Solution 1:

The marginal distribution of each $X_i$ and each $Y_i$ is far from sufficient to determine anything about the joint behavior of the families $(X_i)_i$ and $(Y_i)_i$.

Assume for instance that $X_1$, $X_2$, $Y_1$ and $Y_2$ are all uniform $\pm1$ Bernoulli random variables and that $X_1=X_2=Y_1=-Y_2$. Then the event $[X_1=1\ \text{or}\ X_2=1]$ has probability $\frac12$, since $X_1$ and $X_2$ always agree, while the event $[Y_1=1\ \text{or}\ Y_2=1]$ has probability $1$, since $Y_1$ and $Y_2$ always disagree, so exactly one of them equals $1$.
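As a quick sanity check, here is a small Monte Carlo sketch in Python (the variable names and sample size are my own illustrative choices, not part of the argument) that simulates this coupling and estimates both probabilities:

```python
import numpy as np

rng = np.random.default_rng(0)
n = 100_000

# One fair +/-1 coin drives everything: X1 = X2 = Y1 = -Y2.
X1 = rng.choice([-1, 1], size=n)
X2 = X1.copy()
Y1 = X1.copy()
Y2 = -X1

# P[X1 = 1 or X2 = 1]: the X's always agree, so this is just P[X1 = 1] = 1/2.
p_x = np.mean((X1 == 1) | (X2 == 1))

# P[Y1 = 1 or Y2 = 1]: Y1 and Y2 always disagree, so exactly one of them is 1.
p_y = np.mean((Y1 == 1) | (Y2 == 1))

print(f"P[X1=1 or X2=1] ~ {p_x:.3f}  (theory: 0.5)")
print(f"P[Y1=1 or Y2=1] ~ {p_y:.3f}  (theory: 1.0)")
```

All four variables have the same marginal law, yet the two events have different probabilities; only the joint distribution of the family decides such questions.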

Solution 2:

The question is not very clear, but here is an idea. Let $U$ be a uniform $[0,1]$ random variable, and define a random process $X=\{X_t: t \in [0,1]\}$ by $X_t = \mathbf{1}(t=U)$, where $\mathbf{1}$ is the indicator function. Then $X$ is identical in law to the zero process $Y$ defined by $Y_t = 0$ for all $t \in [0,1]$; that is, ${\rm P}[X_{t_1} = 0, X_{t_2}=0,\ldots,X_{t_n}=0]=1$ for any choice of $n \geq 1$ and $0 \leq t_1 < \cdots < t_n \leq 1$, because $U$ has a continuous distribution and hence ${\rm P}(U \in \{t_1,\ldots,t_n\})=0$. However, $X$ and $Y$ are quite different: $X$ is not continuous, its supremum equals $1$ (indeed $X_U=1$ on every sample path), etc.
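To make the distinction concrete, here is a short Python sketch (grid size and number of runs are illustrative choices of mine): sampling $X$ at any finite set of times almost surely returns all zeros, exactly as for $Y$, yet every realization of $X$ attains the value $1$ at the random time $U$.

```python
import numpy as np

rng = np.random.default_rng(1)

def sample_X_at(times, u):
    # X_t = 1 if t == U, else 0 (indicator of the event {t = U}).
    return (times == u).astype(int)

# Evaluate X on many independent copies, each at a fresh finite grid of times.
hits = 0
for _ in range(10_000):
    u = rng.uniform(0.0, 1.0)
    times = rng.uniform(0.0, 1.0, size=50)   # any finite set of sample times
    hits += sample_X_at(times, u).any()

# A finite grid almost surely misses U, so X is indistinguishable from Y ...
print(f"fraction of runs with some X_t != 0: {hits / 10_000}")  # ~ 0.0

# ... yet every single path of X has supremum 1, attained at t = U.
u = rng.uniform(0.0, 1.0)
print("X_U =", sample_X_at(np.array([u]), u)[0])  # always 1
```

This is exactly why finite-dimensional distributions cannot detect path properties such as continuity or the value of the supremum: those depend on uncountably many coordinates at once.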