Why do we say "almost surely" in Probability Theory?

Solution 1:

In terms of the sample space $\Omega$, an event $E$ happens almost surely if $P(E) = 1$, whereas it happens surely if $E = \Omega$.
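In particular, every sure event is almost sure (since $P(\Omega) = 1$), but the converse can fail; schematically,

$$E = \Omega \;\Longrightarrow\; P(E) = P(\Omega) = 1, \qquad P(E) = 1 \;\not\Longrightarrow\; E = \Omega.$$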

An example: suppose we are (independently) flipping a (fair) coin infinitely many times. The event

$$\{ \text{I will get heads infinitely often}\}$$ is an almost sure event (because it is possible to get only a finite number of heads... but how likely is that? A rigorous proof uses the second Borel–Cantelli lemma, if you are interested).
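In outline, the Borel–Cantelli argument goes as follows: let $A_n$ be the event that the $n$th flip is heads. The flips are independent and $P(A_n) = \tfrac{1}{2}$, so

$$\sum_{n=1}^{\infty} P(A_n) = \sum_{n=1}^{\infty} \frac{1}{2} = \infty,$$

and the second Borel–Cantelli lemma gives $P\left(\limsup_{n\to\infty} A_n\right) = P(\text{heads infinitely often}) = 1$.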

In contrast, $$\{\text{ I will get heads or tails on my 16th flip} \}$$ must happen. This is a sure event.

Solution 2:

There is a difference between "almost surely" and "surely."

Consider choosing a real number uniformly at random from the interval $[0,1]$. The event "$1/2$ will not be chosen" has probability $1$, yet it is not sure: it is still possible that exactly $1/2$ is chosen, just with probability $0$.
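To see why a single point gets probability $0$ under the uniform distribution, note that $\{1/2\}$ is contained in arbitrarily small intervals:

$$P(\{1/2\}) \le P\!\left(\left[\tfrac{1}{2}-\varepsilon,\, \tfrac{1}{2}+\varepsilon\right]\right) = 2\varepsilon \quad \text{for every } 0 < \varepsilon \le \tfrac{1}{2},$$

so $P(\{1/2\}) = 0$, and hence $P(1/2 \text{ is not chosen}) = 1$.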

I recommend reading the relevant Wikipedia article, which I found very clarifying when I was learning probability.

Solution 3:

There is a subtle difference between the two.

If an event is sure to occur, then it has probability measure $1$. But if an event has probability $1$, it need not be sure to occur: the event that it does not occur has probability measure $0$, yet may still be possible.

Note that probability is not about what has already happened; if it were, we would not need probability theory. Probability is about what may happen.

So it is not that a simple symmetric random walk (on $\mathbb{Z}$, say) must go back to the origin, but that it will go back to the origin with probability $1$.
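As a quick sanity check (not part of the argument above), here is a minimal Python sketch that estimates the return probability of a simple symmetric walk on $\mathbb{Z}$. The step cap `max_steps` and the number of trials are arbitrary choices for this sketch, so the estimate falls slightly short of $1$: with probability $1$ the walk returns eventually, but possibly only after more steps than the cap allows.

```python
import random

def returns_to_origin(max_steps=10_000):
    """Simulate a simple symmetric random walk on the integers starting at 0.

    Returns True if the walk revisits 0 within max_steps steps.
    (The cap is an arbitrary choice for this sketch; the walk returns
    eventually with probability 1, but possibly after more steps.)
    """
    position = 0
    for _ in range(max_steps):
        position += random.choice((-1, 1))
        if position == 0:
            return True
    return False

# Estimate the return probability from many independent walks.
trials = 2_000
estimate = sum(returns_to_origin() for _ in range(trials)) / trials
print(f"Estimated probability of returning to 0: {estimate:.3f}")
```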

You may also be referred to the original axiomatic treatment of probability by A. Kolmogorov.