Rolling a die with n sides to get a cumulative score of n

Solution 1:

If we divide every roll by $n$, the rescaled roll approximates a sample from the uniform distribution on $[0,1]$, and the approximation becomes exact as $n \to \infty$. We are then looking for the expected number of samples from a uniform distribution on $[0,1]$ required for the sum to exceed $1$.

For any $k \in \mathbb{N}$, let $X_1, \ldots , X_k, \ldots$ be the random variables in question. Then $$ \mathbb{P}[X_1 + \ldots + X_k \leq 1] = \frac{1}{k!}, $$ as $\frac{1}{k!}$ is the volume of the $k$-dimensional simplex defined by $X_1 + \ldots + X_k \leq 1$. Now, let $p_k$ be the probability that it takes exactly $k$ rolls to get above $1$. Since the event $\{X_1 + \ldots + X_k \leq 1\}$ is contained in $\{X_1 + \ldots + X_{k-1} \leq 1\}$, this is $$ p_k = \mathbb{P}[X_1 + \ldots + X_{k-1} \leq 1] - \mathbb{P}[X_1 + \ldots + X_{k} \leq 1] = \frac{1}{(k-1)!} - \frac{1}{k!},$$ as we need the first $k-1$ samples to sum to at most $1$ and the first $k$ to sum above $1$. Thus, the expected number of rolls required is \begin{align} \mathbb{E}[\text{Number of Rolls Required}] &= \sum\limits_{k = 1}^\infty k\cdot p_k \\ &= \sum\limits_{k = 1}^\infty k \left( \frac{1}{(k-1)!} - \frac{1}{k!}\right) \\ &= \sum\limits_{k = 2}^\infty k \left( \frac{1}{(k-1)!} - \frac{1}{k!}\right) &\text{ as the }k=1\text{ case contributes }0 \\ &= \sum\limits_{k = 2}^\infty \frac{1}{(k - 2)!}\\ &= e. \end{align}
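The limiting answer is easy to sanity-check numerically. Here is a quick Monte Carlo sketch (the function names are my own): it counts uniform $[0,1]$ samples until the running sum exceeds $1$ and averages that count over many trials.

```python
import random

def rolls_to_exceed_one():
    """Count uniform(0,1) draws until the running sum exceeds 1."""
    total, count = 0.0, 0
    while total <= 1.0:
        total += random.random()
        count += 1
    return count

def estimate_expected_rolls(trials=200_000, seed=0):
    """Monte Carlo estimate of the expected count; should be close to e."""
    random.seed(seed)
    return sum(rolls_to_exceed_one() for _ in range(trials)) / trials
```

With a couple hundred thousand trials the estimate typically lands within a few thousandths of $e \approx 2.71828$.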

Solution 2:

No real answer, but a way to approach the problem and also to find solutions for small $n$.

Let $\mu_{k}$, for $k\in\mathbb{Z}$, denote the expected number of rolls needed to arrive at a score of at least $k$.

Then:

  • $\mu_{k}=0$ if $k\leq0$

  • $\mu_{k}=\frac{1}{n}\left(1+\mu_{k-1}\right)+\frac{1}{n}\left(1+\mu_{k-2}\right)+\cdots+\frac{1}{n}\left(1+\mu_{k-n}\right)=1+\frac{1}{n}\sum_{i=1}^{n}\mu_{k-i}$ if $k>0$.

What we want is $\mu_n$, and the recurrence above lets us compute it.

Edit:

Based on this recurrence it can be shown that $\mu_k=\left(1+\frac1{n}\right)^{k-1}$ for $0<k\leq n$, hence $\mu_n=\left(1+\frac1{n}\right)^{n-1}\rightarrow e$ as $n\to\infty$. See the comments on this answer for the argument.
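A minimal sketch (my own code, not from the comments) that evaluates the recurrence directly; its output can be compared against the closed form $\left(1+\frac1{n}\right)^{n-1}$:

```python
def expected_rolls(n):
    """Expected rolls of a fair n-sided die for the cumulative score to
    reach at least n, via mu_k = 1 + (1/n) * sum_{i=1}^{n} mu_{k-i},
    with mu_k = 0 for k <= 0."""
    mu = [0.0] * (n + 1)  # mu[0..n]; terms with index <= 0 contribute 0
    for k in range(1, n + 1):
        mu[k] = 1.0 + sum(mu[k - i] for i in range(1, n + 1) if k - i > 0) / n
    return mu[n]
```

For example, `expected_rolls(6)` matches $(7/6)^5 \approx 2.1615$ for an ordinary die, and the values approach $e$ as $n$ grows.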

Credit for that goes to Byron, and Marcus spared me some work too.

Solution 3:

I hope that this is valid (my memory of math is faded from 20 years ago):

Let $f(x)$ be the average number of uniformly distributed reals between $0$ and $1$ needed to add up to at least $x$, where $0 \leq x \leq 1$. The problem then reduces to finding $f(1)$.

Let's try to determine $f(x)$. Since the distribution is uniform, there is a $1-x$ chance of reaching $x$ on the first try. To account for the case where the first draw $y$ falls short, we integrate $\int_{0}^{x}(1+f(x-y))\,dy$: inside the integral, $1$ is the try we just consumed and $f(x-y)$ is the average number of further tries needed to cover the remaining $x-y$.

So, taken together, we can determine that

$$ f(x) = (1-x) + \int_{0}^{x}(1+f(x-y))dy $$

By substituting $z = x-y$, the integral becomes $x + \int_{0}^{x}f(z)dz$; the $x$ cancels against the $-x$ in the first term, simplifying the right-hand side to

$$f(x) = 1 + \int_{0}^{x}f(z)dz$$

Differentiating both sides gives us $f'(x) = f(x)$ and with a boundary condition of $f(0) = 1$, we find that $$f(x) = e^x$$

So it takes an average of $e^x$ tries to add up to any $x$ between $0$ and $1$ inclusive. In particular $f(1) = e^1 = e$, so it takes an average of $e$ tries to reach a sum of $1$.
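As a cross-check (again my own code, not part of the original answer), $f(x)$ can be estimated by simulation for any $x$ in $[0,1]$ and compared against $e^x$:

```python
import random

def draws_to_reach(x):
    """Uniform(0,1) draws needed for the running sum to reach at least x."""
    total, count = 0.0, 0
    while total < x:
        total += random.random()
        count += 1
    return count

def simulate_f(x, trials=100_000, seed=1):
    """Monte Carlo estimate of f(x); the derivation predicts e**x."""
    random.seed(seed)
    return sum(draws_to_reach(x) for _ in range(trials)) / trials
```

For instance, `simulate_f(0.5)` should land near $e^{0.5} \approx 1.649$, and `simulate_f(1.0)` near $e \approx 2.718$.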