Numerical phenomenon. Who can explain?

I was doing some software engineering and wanted a thread to do something in the background, basically just burning CPU time, for a certain test.

While I could have done something really boring like for(i < 10000000) { j = 2 * i }, I ended up having the program start at $1$ and then, for a million steps, multiply the running result by a random real number $r$ drawn uniformly from the interval $[0,R]$ at each step.

  • When $R = 2$, it converged to $0$.
  • When $R = 3$, it exploded to infinity.
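
For concreteness, the experiment looks roughly like this (a minimal Python sketch, not my exact script; double-precision floats underflow to $0$ or overflow to infinity, which is exactly the behavior I observed):

```python
import random

def waste_cpu(R, steps=1_000_000):
    """Start at 1 and repeatedly multiply by a fresh Uniform(0, R) draw."""
    result = 1.0
    for _ in range(steps):
        result *= random.uniform(0.0, R)
    return result

print(waste_cpu(2.0))  # converges to 0.0
print(waste_cpu(3.0))  # explodes to inf
```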

So of course, the question anyone with a modicum of curiosity would ask: for what $R$ does the transition happen? I then tried the first number between $2$ and $3$ that we would all think of, Euler's number $e$, and sure enough, that guess was right. I would love to see a proof of this.

Now, when I should be working, I'm instead wondering about the behavior of this script.

Ironically, rather than wasting my CPU's time, I'm wasting my own time. But it's a beautiful phenomenon. I don't regret it. $\ddot\smile$


Solution 1:

EDIT: I saw that you solved it yourself. Congrats! I'm posting this anyway because I was most of the way through typing it when your answer hit.

Infinite products are hard, in general; infinite sums are better, because we have lots of tools at our disposal for handling them. Fortunately, we can always turn a product into a sum via a logarithm.

Let $X_i \sim \operatorname{Uniform}(0, r)$, and let $Y_n = \prod_{i=1}^{n} X_i$. Note that $\log(Y_n) = \sum_{i=1}^n \log(X_i)$. It is already somewhat clear that $e$ will eventually emerge as important, even though we haven't really done anything yet.

The more useful formulation here is that $\frac{\log(Y_n)}{n} = \frac 1 n \sum \log(X_i)$, because we know from the Strong Law of Large Numbers that the right side converges almost surely to $\mathbb E[\log(X_i)]$. We have $$\mathbb E \log(X_i) = \int_0^r \log(x) \cdot \frac 1 r \, \textrm d x = \frac 1 r [x \log(x) - x] \bigg|_0^r = \log(r) - 1.$$

If $r < e$, then $\log(Y_n) / n \to c < 0$, which implies that $\log(Y_n) \to -\infty$, hence $Y_n \to 0$. Similarly, if $r > e$, then $\log(Y_n) / n \to c > 0$, whence $Y_n \to \infty$. The fun case is: what happens when $r = e$?
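
If you want to watch the Strong Law of Large Numbers do its work, here is a quick Monte Carlo sanity check (a Python sketch, not part of the argument; the function name is mine) estimating the drift $\mathbb E[\log(X_i)] = \log(r) - 1$ for a few values of $r$:

```python
import math
import random

def average_log_increment(r, n=1_000_000):
    # Sample mean of log(X_i); by the SLLN this approaches log(r) - 1.
    return sum(math.log(random.uniform(0.0, r)) for _ in range(n)) / n

for r in (2.0, math.e, 3.0):
    print(r, average_log_increment(r), math.log(r) - 1)
```

The estimated drift comes out negative for $r = 2$, roughly zero for $r = e$, and positive for $r = 3$, matching the three cases above.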

Solution 2:

I found the answer! One starts with the uniform distribution on $ [0,R] $. The natural logarithm pushes this distribution forward to a distribution on $ (-\infty, \ln(R) ] $ with density function given by $ p(y) = e^y / R, y \in (-\infty, \ln(R)] $. The expected value of this distribution is $$ \int_{-\infty}^{\ln(R)}\frac{y e^y}{R} \,\mathrm dy = \ln(R) - 1 .$$ Setting $ \ln(R) - 1 = 0 $ gives $ R = e $, the answer to the riddle! Love it!
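
For completeness, the two steps left implicit above (the change of variables and the antiderivative) can be spelled out: for $ y \le \ln(R) $,
$$ \mathbb P(\ln X \le y) = \mathbb P(X \le e^y) = \frac{e^y}{R}, \qquad \text{so} \qquad p(y) = \frac{\mathrm d}{\mathrm dy} \frac{e^y}{R} = \frac{e^y}{R}, $$
and integration by parts gives
$$ \int_{-\infty}^{\ln(R)} \frac{y e^y}{R} \,\mathrm dy = \frac{1}{R} \Big[ (y-1)e^y \Big]_{-\infty}^{\ln(R)} = \ln(R) - 1 . $$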