If the gambler's fallacy is false, how do notions of "expected number" of events work?

Imagine there is a fair spinner that could land on any number $1$ through $100$. I understand that the chance of any number appearing on the next spin is $\frac{1}{100}$, and if you spin the spinner $100$ times then the expected number of $5$'s, for example, is $1$.

However, I have the following questions:

  1. If the spinner does not land on $5$ in the first spin then this does not make the chance of getting a $5$ on the second spin any more or less likely (to assume otherwise would imply the gambler's fallacy). This means that we now must spin the spinner a total of $101$ times in order to expect exactly one $5$. Doesn't this cause some kind of infinite regress? If we never expect a $5$ on any given spin, and no $5$'s have come up thus far in the first $n$ spins, then won't it take $n+100$ spins to expect a 5? I know that at some point if we examine a group of spins then we can expect something with low probability like spinning a $5$ to occur, but it seems strange and counter-intuitive to me that though we expect a $5$ to come up in a sufficiently large group of spins, on each individual spin we do not expect a $5$. Furthermore, we do not expect any number to come up on the first spin (in that the probability of any particular number appearing is low), and yet we know for certain that some number will still come up.
  2. For there to be a greater than $50\%$ chance of spinning a $5$, we must spin the spinner $69$ times according to the following calculation: $$P(\text{no } 5 \text{ in } n \text{ spins})=\left(\frac{99}{100}\right)^n, \qquad \left(\frac{99}{100}\right)^n < 0.5 \iff n > \log_{0.99}0.5=68.967\ldots $$ Hence, it must be spun $69$ times for there to be a greater than $50\%$ chance of seeing a $5$. Why is this not $50$ spins, since we expect $0.5$ $5$'s to come up in that many spins? Also, I have the same worry as above: once a $5$ fails to come up, don't we have to spin it $70$ times for there to be a greater than $50\%$ probability, and can't this cause the same infinite regress as described above?
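Checking the two boundary values numerically confirms that the threshold is indeed $69$:
$$1-\left(\frac{99}{100}\right)^{68}\approx 0.4951<0.5<0.5002\approx 1-\left(\frac{99}{100}\right)^{69}.$$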

But we do expect a 5 on any single spin. Or, at least, we expect $1\%$ of a 5 (if that even makes sense).

More seriously, though, on any single spin the probability of getting a 5 is low. However, that is exactly balanced by how much shorter the wait for the first 5 becomes in the cases where you do get a 5 on that spin.

In $99\%$ of cases, you will get a not-5 on the first spin, and in those cases you expect to spin a total of 101 times before you see your first 5 (including the spin that just failed). However, in $1\%$ of cases, you spin a 5 right away, and in those cases the total is just 1 spin. These average out to the overall 100 expected spins: $0.99\cdot 101+0.01\cdot 1=100$.
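If it helps to see this numerically, here is a minimal Monte Carlo sketch (Python, assuming the spinner is modelled by `random.randint(1, 100)`; `spins_until_five` is just a throwaway helper name):

```python
import random

def spins_until_five():
    """Spin a fair 1-100 spinner until a 5 appears; return how many spins it took."""
    spins = 0
    while True:
        spins += 1
        if random.randint(1, 100) == 5:
            return spins

trials = 100_000
results = [spins_until_five() for _ in range(trials)]

# Overall expected wait: hovers around 100.
print(sum(results) / trials)

# Conditioning on the first spin reproduces the two cases above:
missed_first = [r for r in results if r > 1]   # first spin was not a 5 (~99% of runs)
print(sum(missed_first) / len(missed_first))   # roughly 101
print(results.count(1) / trials)               # roughly 0.01: first spin was a 5
```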

As for question 2, that's because two or more 5's can appear. Since part of the expected count of 0.5 5's in the first 50 spins comes from outcomes with two or more 5's, the probability of getting any 5 at all must be less than $0.5$: $$ 0.5=\text{Expected number of 5's}=1\cdot P(\text{exactly one 5})+2\cdot P(\text{exactly two 5's})+\cdots\\ \implies P(\text{at least one 5})=P(\text{exactly one 5})+P(\text{exactly two 5's})+\cdots<0.5 $$
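Concretely, for the first $50$ spins the complement rule gives
$$P(\text{at least one }5)=1-\left(\frac{99}{100}\right)^{50}\approx 0.395,$$
even though the expected number of $5$'s in those $50$ spins is exactly $0.5$.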


It's worth noting that, as you spin the spinner, you gain information that you didn't have before - which changes your expectations. A simpler example, without expectation, would be that if you flip a fair coin twice, there is a $1/4$ chance that both flips are heads. However, we can think about what happens after the first flip:

If our first flip lands heads, then there is now a $1/2$ chance that both flips will be heads. If not, there is a $0$ chance of that. Prior to the first flip, we know that there is a $1/2$ chance of landing in either of these two cases, so the total probability is $1/2\cdot 1/2 + 1/2\cdot 0=1/4$.

A similar thing happens with your example: suppose we let $X$ be the number of times $5$ comes up in $100$ spins of the spinner. Most of the time - $99/100$ of the time, to be precise - the first spin is not a $5$, and, given this, we only have $99$ spins left, so we now expect $X$ to be $99/100$. On the other hand, if a $5$ does come up, we now expect $X$ to be $1+99/100$ - and averaging these two cases with their probabilities does indeed show that we expect $X$ to be $1$ overall.
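Written out, that weighted average of the two cases is
$$\frac{99}{100}\cdot\frac{99}{100}+\frac{1}{100}\cdot\left(1+\frac{99}{100}\right)=\frac{9801+199}{10000}=1.$$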

Basically, you see that if you fail to get a $5$ on the first spin, then your odds of seeing a $5$ have shifted downwards - but this is perfectly balanced by the less likely event that you do see a $5$. The same is true in your second example with probabilities: yes, as soon as we see that we didn't get a $5$, we still think we need the same number of further spins, but if we do get a $5$, we used only $1$ spin, which is far fewer than we thought we'd need - and that balances things out.

It's worth noting that expectation is a precise mathematical term that may not perfectly align with what you'd like it to mean intuitively. It does not say anything about the most likely outcome - for instance, if you flip a fair coin once, the expected number of heads is $1/2$, which is not even a possible outcome. Expectation just says "look at this value over all the ways things could play out, and average them, weighted according to probability."

This also tells us why the probabilities are not the expectations: if we make $50$ trials, the expected number of $5$'s being $1/2$ could equally well mean "there is a $99/100$ chance that there were no $5$'s, but a $1/100$ chance that all $50$ spins were $5$'s" or "there is a $1/2$ chance that there were no $5$'s and a $1/2$ chance that there was exactly one $5$" - with the truth in this case lying in between those two somewhat absurd cases. Basically, cases where there are lots of $5$'s get counted disproportionately by expectation, whereas probability would count them equally to the case where there is just one $5$.
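For comparison, here is a short sketch (Python, just applying the binomial formula for $50$ spins with success probability $1/100$) showing where the truth actually sits between those two extremes:

```python
from math import comb

n, p = 50, 1 / 100  # 50 spins, each a 5 with probability 1/100

def pmf(k):
    """Probability of exactly k fives in the 50 spins (binomial formula)."""
    return comb(n, k) * p**k * (1 - p) ** (n - k)

print(pmf(0))                                  # ~0.605: no 5 at all
print(pmf(1))                                  # ~0.306: exactly one 5
print(1 - pmf(0) - pmf(1))                     # ~0.089: two or more 5's
print(sum(k * pmf(k) for k in range(n + 1)))   # the expectation: exactly 0.5
```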

As for the paradox that no particular number is likely, yet some number always comes up, the same question of what you already know arises for probability: here are two variants of a game you might play:

Guess a number. Spin the wheel. You win if they are equal.

In this game, you will only win with probability $1/100$, because you have no information about what the spin will be. That is what the low probability is measuring. A related game is the following:

Spin the wheel. Guess a number. You win if they are equal.

You can always win this game, because you can just read off whatever number was spun! The relevant probability here is more like "what's the probability you spun a $5$, given that you spun a $5$?" - which is $1$. You just need to be careful about exactly what you already know when you're dealing with probabilities - otherwise seemingly paradoxical results start to appear.
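If you want to watch the difference between the two games play out, here is a tiny sketch (Python, again assuming `random.randint(1, 100)` as the spinner):

```python
import random

trials = 100_000

# Game 1: guess first (say, always 5), then spin.
guess_first_wins = sum(random.randint(1, 100) == 5 for _ in range(trials))

# Game 2: spin first, then guess: the guess can simply copy the spin.
spin_first_wins = 0
for _ in range(trials):
    spin = random.randint(1, 100)
    guess = spin                      # the extra information makes winning certain
    spin_first_wins += (guess == spin)

print(guess_first_wins / trials)  # about 0.01
print(spin_first_wins / trials)   # exactly 1.0
```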