A winning wager that loses over time
Here are all the possible outcomes if you stop after ten tosses of the coin. I have rounded the numbers for display, but the calculations were at the full precision of the software on which they were computed. In the table, $H$ is a random variable equal to the number of heads in the sequence of flips, and $X$ is a random variable equal to the amount of your money on the table at the end of the sequence. The last number in the fifth or seventh column is the sum of the numbers in the column above it, so the last number in the fifth column is the expected value of the money on the table at the end of the game, computed in the usual way.
\begin{array}{ccccccc} h & P(H=h) & P(H\leq h) & X & XP(H=h) & \log_{10}X & (\log_{10}X)P(H=h) \\ \phantom{0} 0 & 0.000977 & 0.000977 &\phantom{0} 15.52 &\phantom{00} 0.02 & 1.1908 & 0.0012 \\ \phantom{0} 1 & 0.009766 & 0.010742 &\phantom{0} 22.43 &\phantom{00} 0.22 & 1.3509 & 0.0132 \\ \phantom{0} 2 & 0.043945 & 0.054688 &\phantom{0} 32.43 &\phantom{00} 1.43 & 1.5110 & 0.0664 \\ \phantom{0} 3 & 0.117188 & 0.171875 &\phantom{0} 46.89 &\phantom{00} 5.50 & 1.6711 & 0.1958 \\ \phantom{0} 4 & 0.205078 & 0.376953 &\phantom{0} 67.79 &\phantom{0} 13.90 & 1.8312 & 0.3755 \\ \phantom{0} 5 & 0.246094 & 0.623047 &\phantom{0} 98.02 &\phantom{0} 24.12 & 1.9913 & 0.4900 \\ \phantom{0} 6 & 0.205078 & 0.828125 & 141.71 &\phantom{0} 29.06 & 2.1514 & 0.4412 \\ \phantom{0} 7 & 0.117188 & 0.945313 & 204.88 &\phantom{0} 24.01 & 2.3115 & 0.2709 \\ \phantom{0} 8 & 0.043945 & 0.989258 & 296.21 &\phantom{0} 13.02 & 2.4716 & 0.1086 \\ \phantom{0} 9 & 0.009766 & 0.999023 & 428.26 &\phantom{00} 4.18 & 2.6317 & 0.0257 \\ 10 & 0.000977 & 1.000000 & 619.17 &\phantom{00} 0.60 & 2.7918 & 0.0027 \\ & & & & 116.05 & & 1.9913 \end{array}
So we see you have about a $62\%$ chance of losing money, although the chance of losing more than two dollars is less than $38\%$. On the other hand you could win over $\$500$; the chance of winning at least $\$100$ is over $17\%$, while the chance of losing $\$100$ is zero. Add up the ordinary expected value of the game, and it comes out to an expected gain of about $\$16.05.$
But if we take the expected base-ten logarithm of the amount of your money on the table after ten tosses (the number at the bottom of the last column), we see that it is only about $1.9913,$ whereas the base-ten logarithm of your starting amount is exactly $2.$
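If you want to check these numbers yourself, here is a minimal Python sketch (assuming, as in the table, a $\$100$ starting stake and multipliers of $1.2$ for heads and $0.83$ for tails; the variable names are mine):

```python
from math import comb, log10

START = 100.0          # dollars placed on the table
WIN, LOSS = 1.2, 0.83  # multipliers for heads and tails
N = 10                 # number of tosses

expected_dollars = 0.0   # E[X]
expected_log10 = 0.0     # E[log10 X]
for h in range(N + 1):
    p = comb(N, h) * 0.5**N             # P(H = h) for a fair coin
    x = START * WIN**h * LOSS**(N - h)  # money on the table after h heads
    expected_dollars += x * p
    expected_log10 += log10(x) * p

print(expected_dollars)  # about 116.05
print(expected_log10)    # about 1.9913
```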
What does this mean? It means that when you measure your outcomes according to a utility function that is not necessarily identical to the raw outcomes (the number of dollars on the table), the expected utility of the game can be positive or negative, depending on your utility function. If your utility function is $f(x) = x$, this game has positive expected utility, but if your utility function is $f(x) = \log_{10} x,$ this game has negative expected utility. That is, $E(X - 100) > 0$ but $E(\log_{10} X - \log_{10} 100) < 0.$
Note that using a different base for the logarithm merely rescales the utility, so the expected change in utility is still negative. I used base $10$ because it gives a nice round number for the initial utility, making it easy to see when you gain and when you lose.
Note that if we apply logarithmic utility in this way to a game where you put $\$100$ on the table and then flip a fair coin once, double or nothing, you have a $\frac12$ chance to increase your utility by about $0.3$ and a $\frac12$ chance to decrease your utility by $\infty.$ This is an absurd result, and is due to the absurdity of using "logarithm of money on the table" as a utility function. A somewhat more reasonable utility function for a casino game is to assume that you have some reserves of some kind somewhere that you do not put on the table, and take the logarithm of the sum of your reserves plus the money on the table. Suppose we only count the money in your bank account as these "reserves", and assume you had only $\$1000$ at the start of the day and withdrew $\$100$ in order to play the game. Then the utility of your initial state is $\log_{10}1000 = 3,$ and the utility of your outcome is $\log_{10}(900 + X).$ We get the following results:
\begin{array}{cccccc} h & P(H=h) & X & 900 + X & \log_{10}(900+X) & (\log_{10}(900+X))P(H=h) \\ \phantom{0} 0 & 0.000977 & \phantom{0} 15.52 &\phantom{0}915.52 & 2.9617 & 0.0029 \\ \phantom{0} 1 & 0.009766 & \phantom{0} 22.43 &\phantom{0}922.43 & 2.9649 & 0.0290 \\ \phantom{0} 2 & 0.043945 & \phantom{0} 32.43 &\phantom{0}932.43 & 2.9696 & 0.1305 \\ \phantom{0} 3 & 0.117188 & \phantom{0} 46.89 &\phantom{0}946.89 & 2.9763 & 0.3488 \\ \phantom{0} 4 & 0.205078 & \phantom{0} 67.79 &\phantom{0}967.79 & 2.9858 & 0.6123 \\ \phantom{0} 5 & 0.246094 & \phantom{0} 98.02 &\phantom{0}998.02 & 2.9991 & 0.7381 \\ \phantom{0} 6 & 0.205078 & 141.71 & 1041.71 & 3.0177 & 0.6189 \\ \phantom{0} 7 & 0.117188 & 204.88 & 1104.88 & 3.0433 & 0.3566 \\ \phantom{0} 8 & 0.043945 & 296.21 & 1196.21 & 3.0778 & 0.1353 \\ \phantom{0} 9 & 0.009766 & 428.26 & 1328.26 & 3.1233 & 0.0305 \\ 10 & 0.000977 & 619.17 & 1519.17 & 3.1816 & 0.0031 \\ & & & & & 3.0059 \end{array}
Since your starting utility was $3,$ this is an expected gain.
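The same kind of computation reproduces this number; here is a sketch under the same assumptions as before, with the $\$900$ of reserves added in before taking the logarithm:

```python
from math import comb, log10

RESERVES = 900.0       # the $1000 in the bank minus the $100 taken to the table
START = 100.0
WIN, LOSS = 1.2, 0.83
N = 10

expected_utility = 0.0   # E[log10(900 + X)]
for h in range(N + 1):
    p = comb(N, h) * 0.5**N
    x = START * WIN**h * LOSS**(N - h)
    expected_utility += log10(RESERVES + x) * p

print(expected_utility)  # about 3.0059, versus log10(1000) = 3 before playing
```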
Note that one of the assumptions of the research on which the Scientific American article is based is that you're not just taking $\$100$ out of your bank account to play the game; you are effectively putting your entire wealth on the table. That's what makes the losses in that game so devastating. Compare this to a lottery where you have a $0.1\%$ chance to win a large multiple of the ticket price and a $99.9\%$ chance to win nothing. A lottery like this where the price of a single ticket is "everything you own" is a very different matter from a lottery where a ticket costs a dollar.
If you start with an initial bet $b$, the expected value for the amount on the table after $n$ fair coin tosses is
$$B_n=b\left(p^n+{n\choose1}p^{n-1}q+{n\choose2}p^{n-2}q^2+\cdots+{n\choose n-1}pq^{n-1}+q^n \right)$$
where $p=1.2/2$ and $q=0.83/2.$ By the Binomial Theorem, this is
$$B_n=b(p+q)^n=b(2.03/2)^n=(1.015)^nb$$
which clearly goes to infinity as $n\to\infty$. Note, however, that
$$\begin{aligned} (2p)^{n-k}(2q)^k=(1.2)^{n-k}(0.83)^k\ge1 &\iff(n-k)\log(1.2)+k\log(0.83)\ge0\\ &\iff k\le {n\log(1.2)\over\log(1.2)-\log(0.83)}\approx0.49456\ldots n \end{aligned}$$
which means that the probability that a player winds up with a profit is at most one half; for the ten-toss game above it is about $0.38$. (The number of losing tosses $k$ is binomial with mean $n/2$, so the condition $k\le0.49456\ldots n$ asks for a below-average number of losses.) This accords with what the Scientific American article is actually about, namely the "inevitability" of wealth inequality: in this game, most people lose money, some make a little money, and a few make a lot of money. The OP's confusion seems to stem from reading a discussion of the article where people incorrectly paraphrased what the article says, rather than the article itself. The Scientific American article uses the image of a casino only to describe the rules of the game; it doesn't claim the game is a money maker for the casino.
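To see both effects side by side, here is a small sketch that computes the expected multiplier $(1.015)^n$ and the probability of ending with a profit for a few values of $n$ (the profit condition is just the inequality above, rewritten in terms of the number of heads $h = n-k$):

```python
from math import comb, log

WIN, LOSS = 1.2, 0.83

def expected_multiplier(n):
    """E[(money on the table) / (initial bet)] after n fair tosses."""
    return ((WIN + LOSS) / 2) ** n          # = 1.015**n

def prob_profit(n):
    """P(1.2**h * 0.83**(n-h) > 1) for h ~ Binomial(n, 1/2)."""
    favorable = sum(comb(n, h) for h in range(n + 1)
                    if h * log(WIN) + (n - h) * log(LOSS) > 0)
    return favorable / 2**n

for n in (10, 100, 1000):
    print(n, expected_multiplier(n), prob_profit(n))
# the expected multiplier grows without bound, while the probability of
# a profit stays at or below one half (roughly 0.38, 0.46, 0.36 here)
```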
In your second approach, you are only comparing games that end with an equal number of winning and losing tosses. That's not entirely fair.
For instance, consider playing the game with two flips. Winning twice (a total gain of $\$44$) counts for a lot more than losing twice (a total loss of about $\$31$), and this difference more than makes up for the tiny loss in the case where you get one win and one loss (a total loss of $\$0.40$).
This holds up as the number of tosses increases. For instance, with ten flips, six wins and four losses yield a net gain of about $\$41.71$, while six losses and four wins cost you only about $\$32.21$. The corresponding differences between the 7-3, 8-2, 9-1 and 10-0 results are even larger.
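A quick sketch to tabulate these paired outcomes (same assumptions as above: a $\$100$ stake, multiply by $1.2$ per win and $0.83$ per loss, order irrelevant):

```python
def net(wins, losses, stake=100.0):
    """Net dollar result of the given number of wins and losses."""
    return stake * 1.2**wins * 0.83**losses - stake

print(net(2, 0), net(1, 1), net(0, 2))   # about +44.00, -0.40, -31.11
print(net(6, 4), net(5, 5), net(4, 6))   # about +41.71, -1.98, -32.21
```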
To answer "should you play this game" you should calculate the probability of each result. All that matters is the number of wins and losses. As you observe, if you win an equal number of times as you lose you will come out behind. With $10$ flips you will be down about $2\%$. The expected value is positive because you will be ahead much more if you get more than $5$ heads than you will be behind if you get less than $5$ heads.
The casino will lose money over time. For every two dollars bet, on average they will pay out $\$0.20$ and take in $\$0.17$, for a loss of $\$0.03$.
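Spelled out per dollar wagered on a single toss, the casino's expected loss is $$\tfrac12(0.20)-\tfrac12(0.17)=0.015$$ dollars, which is the $\$0.03$ per two dollars mentioned above.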
People's reactions to winning and losing money are not linear. If the stake were one dollar I would find it inconsequential; even if you got ten heads you would only win a little over $\$5$. Whether to play depends on whether you enjoy it; the money is not the thing. As the stake rises you get into a regime where you are attracted by the positive expectation while not being scared off too much by the possibility of losing. When it rises further, you can lose enough to matter to your life, and you probably become risk averse and decline.