If you have two envelopes, and ...
Suppose you're given two envelopes. Both envelopes have money in them, and you're told that one envelope has twice as much money as the other. Suppose you pick one of the envelopes. Should you switch to the other one?
Intuitively, you don't know anything about either envelope, so it'd be ridiculous to say that you should switch to the other envelope to maximize your expected money.
However, consider this argument. Let $x$ be the amount of money in the envelope you picked. If $y$ is the amount of money in the other envelope, then the expected value of $y$ is
$$E(y) = \frac{1}{2}\left(\frac{1}{2}x\right) + \frac{1}{2}\left(2x\right) = \frac{5}{4} x$$
But $5x/4 > x$, so you should switch!
The Wikipedia article says that $x$ stands for two different things, so this reasoning doesn't work. I say this is not a valid resolution.
Consider opening up the envelope that you pick, and finding $\$10$ inside. Then you can run the expected value calculation to get $$E(y) = \frac{1}{2} \cdot \$5+\frac{1}{2} \cdot \$20 = \$12.50$$
This means that if you open one of the envelopes and find $\$10$, you should switch to the other envelope. The $\$10$ doesn't stand for two different things; it literally just means $\$10$.
But you don't have to open up the envelope to run this calculation, you can just imagine what's inside, and run the calculation based on that. This is what "Let $x$ be the amount in the envelope" means. The problem with the argument is not that $x$ stands for two different things.
So what is the problem?
Previous questions on Stack Exchange have given the resolution that I just said I wasn't satisfied by, so please don't mark this as a duplicate. I want a different resolution, or a more satisfying explanation of why $x$ does stand for two different things.
Apparently there is still research being published about this problem - maybe it isn't so obvious?
I think there's something subtly wrong with the premise. Because there's no uniform probability distribution on $\mathbb{R}$, statements like "random real number" are not well-defined. Likewise, I think "one envelope has twice as much money as the other" assumes some probability distribution on $\mathbb{R}$, and perhaps our expected value calculation assumes that this distribution is uniform, which it cannot be ...
The unrealistic assumption is that there is a $\frac12$ chance that the other envelope contains twice the money. Realistically, there is an underlying distribution of values, and that distribution dictates the probability that a given amount is the smaller one.
1. Analysis of the Two Envelope Paradox
Let the value of a pair of envelopes (POE) be the smaller of the values of the envelopes. Let the pdf of the value of the POE be $f(a)$. That is, the probability that the value of a POE is between $a$ and $a+\mathrm{d}a$ is $f(a)\,\mathrm{d}a$.
The expected value of a randomly chosen envelope is $$ \begin{align} E &=\frac12\int f(a)\,a\,\mathrm{d}a+\frac12\int f(a)\,2a\,\mathrm{d}a\\ &=\frac32\int f(a)\,a\,\mathrm{d}a\tag{1} \end{align} $$ We will assume that this integral exists.
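As a quick numerical check of $(1)$, here is a sketch that assumes, purely for illustration, that the value of the POE is exponentially distributed with mean $1$ (so $\int f(a)\,a\,\mathrm{d}a=1$ and $(1)$ gives $E=\frac32$):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Assumed prior, for illustration only: POE value ~ Exponential(mean 1),
# so the integral of f(a)*a da equals 1 and (1) predicts E = 3/2.
poe = rng.exponential(scale=1.0, size=n)      # smaller amount of each pair
take_larger = rng.random(n) < 0.5             # pick either envelope with probability 1/2
chosen = np.where(take_larger, 2 * poe, poe)  # value of the randomly chosen envelope

print(chosen.mean())                          # ~1.5, matching (1)
```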
2. Conditional Probabilities
The probability that the value of the POE was between $\frac a2$ and $\frac a2+\frac{\mathrm{d}a}2$ and that we chose the larger is $\frac12f\left(\frac a2\right)\frac{\mathrm{d}a}2$. The probability that the value of the POE was between $a$ and $a+\mathrm{d}a$ and that we chose the smaller is $\frac12f(a)\,\mathrm{d}a$. Thus, the probability that the value of the envelope we chose is between $a$ and $a+\mathrm{d}a$ is $\frac14\left(f\left(\frac a2\right) + 2 f(a)\right)\mathrm{d}a$. Therefore, we define $$ P(a)=\frac14\left[f\left(\frac a2\right) + 2 f(a)\right]\tag{2} $$

Furthermore, given that the value of the envelope we chose was between $a$ and $a+\mathrm{d}a$, the probability that we chose the envelope with the larger value is $$ L(a)=\frac{f\left(\frac a2\right)}{f\left(\frac a2\right)+2f(a)}\tag{3} $$ and the probability that we chose the envelope with the smaller value is $$ S(a)=\frac{2f(a)}{f\left(\frac a2\right)+2f(a)}\tag{4} $$

This is where the unrealistic assumption falls apart. Without knowledge of $f$, we cannot know the conditional probabilities $L$ and $S$; they are usually not $\frac12$ and $\frac12$.
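To see concretely that $L$ and $S$ are usually not $\frac12$, here is a small sketch evaluating $(3)$ under the same illustrative exponential prior; the particular $f$ is an assumption for the example, not part of the argument:

```python
import numpy as np

def f(a):
    # Illustrative prior only: POE value ~ Exponential(1), i.e. f(a) = exp(-a) for a >= 0
    return np.exp(-a)

def L(a):
    # Probability that the chosen envelope is the larger one, given its value is a  (eq. 3)
    return f(a / 2) / (f(a / 2) + 2 * f(a))

for a in [0.5, 1.0, 2.0, 4.0]:
    print(a, L(a), 1 - L(a))  # L(a) rises toward 1 as a grows; it is rarely exactly 1/2
```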
3. Strategies
Always Switch
Suppose we switch all the time. Then our expected value is
$$
\begin{align}
&\int\left[L(a)\frac a2+S(a)2a\right]P(a)\,\mathrm{d}a\\
&=\frac14\int\left[f\left(\frac a2\right)\frac a2+2f(a)\,2a\right]\,\mathrm{d}a\\
&=\frac32\int f(a)\,a\,\mathrm{d}a\\[8pt]
&=E\tag{5}
\end{align}
$$
Always Stay
Suppose we stay all the time. Then our expected value is
$$
\begin{align}
&\int\left[\vphantom{\int}L(a)\,a+S(a)\,a\right]P(a)\,\mathrm{d}a\\
&=\frac14\int\left[f\left(\frac a2\right)\,a+2f(a)\,a\right]\,\mathrm{d}a\\
&=\frac32\int f(a)\,a\,\mathrm{d}a\\[8pt]
&=E\tag{6}
\end{align}
$$
Therefore, the expected value is $E$ whether we always switch or always stay. This is comforting, since intuition says that switching should not help.
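A Monte Carlo sanity check of $(5)$ and $(6)$, again with an exponential prior standing in for the unknown $f$:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10**6

poe = rng.exponential(1.0, n)                 # smaller amount of each pair (assumed prior)
took_larger = rng.random(n) < 0.5
chosen = np.where(took_larger, 2 * poe, poe)  # value if we stay
other = np.where(took_larger, poe, 2 * poe)   # value if we switch

print("always stay  :", chosen.mean())        # ~1.5 = E
print("always switch:", other.mean())         # ~1.5 = E as well
```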
Better Strategy
However, there is a strategy that does give us a better expected value. Choose any function $k:[0,\infty)\to[0,1]$ such that $k(2a)\gt k(a)$ for $a\gt0$; a strictly increasing function, for example. If an envelope has value $a$, keep it with probability $k(a)$ and switch otherwise. Then the expected value is
$$
\begin{align}
&\int L(a)\left[k(a)a+(1-k(a))\frac a2\right]P(a)\,\mathrm{d}a\\
&+\int S(a)\left[\vphantom{\int}k(a)a+(1-k(a))2a\right]P(a)\,\mathrm{d}a\\[3pt]
&=\frac14\int f\left(\frac a2\right)\left[k(a)a+(1-k(a))\frac a2\right]\,\mathrm{d}a\\
&+\frac14\int2f(a)\left[\vphantom{\int}k(a)a+(1-k(a))2a\right]\,\mathrm{d}a\\[3pt]
&=\frac32\int f(a)\,a\,\mathrm{d}a
+\frac12\int f(a)\left[\vphantom{\int}k(2a)-k(a)\right]a\,\mathrm{d}a\\[3pt]
&=E+\frac12\int f(a)\left[\vphantom{\int}k(2a)-k(a)\right]a\,\mathrm{d}a\tag{7}
\end{align}
$$
which, if $k(2a)\gt k(a)$, is better than $E$. If $k(a)$ is constant, as it is in the previous strategies, the expected value is $E$.
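Here is a sketch of the gain in $(7)$, with the arbitrary illustrative choice $k(a)=1-e^{-a}$ (strictly increasing, taking values in $[0,1]$) and the same assumed exponential prior:

```python
import numpy as np

rng = np.random.default_rng(2)
n = 10**6

def k(a):
    # Any k with k(2a) > k(a) works; this particular choice is only an example.
    return 1 - np.exp(-a)

poe = rng.exponential(1.0, n)                 # assumed prior, for illustration only
took_larger = rng.random(n) < 0.5
chosen = np.where(took_larger, 2 * poe, poe)
other = np.where(took_larger, poe, 2 * poe)

keep = rng.random(n) < k(chosen)              # keep the envelope with probability k(its value)
payoff = np.where(keep, chosen, other)

print(payoff.mean())                          # about 1.57 here, above E = 1.5, as (7) predicts
```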
My Opinion: You aren't using the notion of random variables correctly. You can't have $E[Y]=\frac54X$ if these are random variables: $E[Y]$ is a fixed number, not a random variable. So your reasoning is incorrect. And, I would think, it really does come down to how you define what $X$ and $Y$ are. You can't be vague about that and expect to apply mathematical techniques to an ill-defined quantity and get meaningful results.
I have read countless explanations about this game and so far none of them has been satisfactory. So I wrote my own explanation.
Consider this different game: the host offers you $x$. You can take it, or you can flip a coin. If you get heads you receive $2x$; if you get tails you receive $x/2$. What should you do? You should obviously always flip the coin, because the expected value of each single wager is exactly $5x/4$.
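A quick simulation of this alternative game, just to confirm the $5x/4$ figure (the numbers here are only an illustrative sketch):

```python
import random

random.seed(0)
x = 100          # whatever the host offers; any positive amount works
trials = 10**6

total = sum(2 * x if random.random() < 0.5 else x / 2 for _ in range(trials))
print(total / trials)   # ~125 = 5x/4, so flipping beats taking the sure x
```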
So how does the envelope paradox differ from this game? The difference is that in the envelope paradox there is a maximum and a minimum. When you choose an envelope, open it, and see $100$ dollars inside, the probabilities are not $1/2$ for this envelope being the smaller or the larger one. If the distribution of envelopes were uniform over amounts, meaning that for every $100$ dollar envelope there were an equal number of $200$ dollar envelopes and $50$ dollar envelopes, then switching every time would indeed get you more money on average. But that would require the same number of $400$ dollar envelopes, hence the same number of $800$ dollar envelopes, hence the same number of $2^n\cdot100$ dollar envelopes for every integer $n$. That is impossible, since there is only a finite amount of money in the world on one hand, and no money smaller than $1$ cent on the other.
So the best possible scenario is that you have equally many of each tuple $(x,2x),(2x,4x),\dots,(2^{n-1}x,2^nx)$ from the same sequence $x,2x,4x,\dots,2^nx$, possibly with several such sequences for different values of $x$; then every amount except the largest and the smallest is equally likely to be the smaller or the larger of its pair. Example:
You have an equal number of $(1,2)$, $(2,4)$, $(4,8)$, $(8,16)$, $(16,32)$, $(32,64)$, and $(64,128)$ tuples. Now whenever you open an envelope and see $2$, half the time the other envelope is $1$ and half the time it's $4$. But this is not true for $1$ and $128$: if you see a $1$, the other envelope is always $2$, and if you see a $128$, the other envelope is always $64$. This is the whole reason the paradox arises. If you always switch or never switch, you get the exact same amount on average, because the $5x/4$ expected value you get from the amounts in the middle of the distribution is canceled out by the lower-than-$x$ expected value you get from the lowest and highest envelopes.
The exact calculation is as follows. Suppose you don't know the minimum and the maximum amounts and you always switch. For the envelopes containing $2,4,8,\dots,64$ dollars, switching wins you $5x/4$ on average instead of the $x$ in the envelope. So, totalling the middle values, never switching gets you exactly $2+2+4+4+8+8+16+16+32+32+64+64=252$, while always switching gets you $1+4+2+8+4+16+8+32+16+64+32+128=315$, which is exactly $252\cdot\frac54$.
Now for the extremes: when you get the $1$ envelope and switch, you receive $2$, and when you get the $128$ envelope and switch, you receive $64$. So on the extremes, the total from not switching is $1+128=129$, while the total from switching is $2+64=66$. Adding everything up, $252+129=381=315+66$.
If you somehow knew that the minimum amount in this game is $1$ and the maximum is $128$, then by always switching except when you see $128$, your average winnings are much better than from never switching.
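Here is a short sketch that enumerates the seven equally likely tuples above and verifies the totals $252+129=315+66=381$, as well as the improvement from switching everywhere except at $128$:

```python
from itertools import chain

# The seven equally likely tuples from the example above
pairs = [(1, 2), (2, 4), (4, 8), (8, 16), (16, 32), (32, 64), (64, 128)]

# Each of the 14 envelopes is an equally likely pick: (what you see, what the other holds)
picks = list(chain.from_iterable([(a, b), (b, a)] for a, b in pairs))

stay = sum(seen for seen, _ in picks)                                 # 252 + 129 = 381
switch = sum(other for _, other in picks)                             # 315 +  66 = 381
smart = sum(seen if seen == 128 else other for seen, other in picks)  # switch except at 128

print(stay, switch, smart)                    # 381 381 445
print(stay / 14, switch / 14, smart / 14)     # ~27.2  ~27.2  ~31.8 on average per pick
```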
This means that if you have in mind a minimum and a maximum amount of money you want to get out of this game, say $m$ and $M$, and the smallest amount in any envelope is $a_{\min}$ and the largest is $a_{\max}$, with $2a_{\min} \leqslant m < M \leqslant a_{\max}/2$, then you definitely win more by switching whenever the amount you see lies in that range than by never switching.
Now if you assume that there is no limit to the amounts of money in the envelopes at either the low end or the high end, and they are uniformly distributed, then always switching does win you $5x/4$ instead of $x$, so there is nothing wrong with that calculation. But such a game is impossible in the real world, so it's similar to the St. Petersburg game in this respect: the trouble with the expected value arises in both of them because we assume the total amount of money in the world is infinite.
And obviously the finite version of the game does not have to consist of multiples of the same $x$ in a single sequence $x,2x,4x,8x,\dots,2^nx$; you can partition the game into several such sequences for different values of $x$, as long as the distribution is uniform over the tuples within each sequence. For example, you can have an equal number of $(1,2),\dots,(64,128)$ tuples and an equal number of $(3,6),\dots,(96,192)$ tuples. As long as all the tuples are equally likely, the same explanation works.
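A sketch of the mixed pool as well: with the $(1,2),\dots,(64,128)$ tuples and the $(3,6),\dots,(96,192)$ tuples all equally likely, always switching and never switching still give the same total:

```python
pairs = [(2**i, 2**(i + 1)) for i in range(7)]            # (1,2), ..., (64,128)
pairs += [(3 * 2**i, 3 * 2**(i + 1)) for i in range(6)]   # (3,6), ..., (96,192)

picks = [(a, b) for a, b in pairs] + [(b, a) for a, b in pairs]

print(sum(seen for seen, _ in picks))    # total from never switching
print(sum(other for _, other in picks))  # total from always switching: identical
```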