What is the expected number of rolls of a fair six-sided die needed to get a specific number (say, a 6)?
Solution 1:
One "trick" that often lets you avoid issues of convergence when solving probability problems is to use a recursive argument.
You have a 1/6 probability of rolling a 6 right away, and a 5/6 chance of rolling something else and starting the process over (but with one additional roll under your belt).
Let $E$ be the expected number of rolls needed to get a 6; by the reasoning above, we have:
$E = (1)(1/6) + (E + 1)(5/6)$
Solving for $E$ yields $E = 6$.
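This is easy to sanity-check by simulation. Below is a quick Monte Carlo sketch (not part of the original argument); the function name `rolls_until_six` and the trial count are arbitrary choices.

```python
import random

def rolls_until_six():
    """Roll a fair six-sided die until a 6 appears; return the number of rolls."""
    count = 0
    while True:
        count += 1
        if random.randint(1, 6) == 6:
            return count

trials = 100_000
average = sum(rolls_until_six() for _ in range(trials)) / trials
print(f"Estimated expected number of rolls: {average:.3f}")  # should be close to 6
```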
An alternative approach is to use the generating function. The generating function $G(t)$ for a probability distribution that only takes on nonnegative integer values is defined as:
$G(t) = \sum_{i = 0}^{\infty} p_i t^i$
The reason the generating function comes in handy is that $G'(1)$ gives the expected value and $G''(1) + G'(1) - (G'(1))^2$ gives the variance; one can check this directly.
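To see why, write $X$ for a random variable with distribution $\{p_i\}$ and differentiate the series term by term: $$G'(1) = \sum_{i=1}^{\infty} i\, p_i = \text{E}[X], \qquad G''(1) = \sum_{i=2}^{\infty} i(i-1)\, p_i = \text{E}[X^2] - \text{E}[X],$$ so that $\text{Var}(X) = \text{E}[X^2] - (\text{E}[X])^2 = G''(1) + G'(1) - (G'(1))^2$.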
In our case, the generating function is:
$G(t) = (1/6)t + (1/6)(5/6)t^2 + (1/6)(5/6)^2t^3 + \ldots$
We can rewrite this as follows:
$G(t) = (1/5)(5t/6 + (5t/6)^2 + (5t/6)^3 + \ldots)$
Summing the geometric series gives $G(t) = t/(6-5t)$; from here, we can calculate $G'(t)$ and $G''(t)$, plug in $t = 1$, and use the above expressions to extract both the expected value and the variance (6 and 30, respectively).
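As a cross-check (not part of the original answer), the same derivatives can be computed symbolically, for example with SymPy:

```python
import sympy as sp

t = sp.symbols('t')
G = t / (6 - 5 * t)  # generating function derived above

G1 = sp.diff(G, t).subs(t, 1)       # G'(1): the expected value
G2 = sp.diff(G, t, 2).subs(t, 1)    # G''(1)
variance = G2 + G1 - G1**2          # G''(1) + G'(1) - (G'(1))^2

print(G1)        # 6
print(variance)  # 30
```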
Solution 2:
A slightly simpler recursive derivation is this: we must roll the die at least once, and on that first roll we get a 6 with probability $\frac{1}{6}$; otherwise we are back where we started, having used one roll. Hence $E = 1 + \frac{5}{6}E$, which gives $E=6$.
Here is a more general answer:
Regard rolling the die as a Bernoulli process $X_1, X_2, \ldots$, where each $X_i$ is a Success with probability $p$ and a Failure with probability $1-p$, independently. The process stops at the first Success.
Let $N_s$ be the length of the sequence until the first Success. This is a random integer. Then we have $$ \Pr (N_s=k) = \Pr(\underbrace{FF\cdots F}_{k-1}\ S) = \underbrace{(1-p)(1-p)\cdots(1-p)}_{k-1}\ p=(1-p)^{k-1}p=pq^{k-1}, $$ where $q=1-p$ and $k\ge1$. This is called a Geometric Distribution, which is the discrete equivalent of the Exponential Distribution. Random variables with these distributions are called memoryless. (See Ross, Introduction to Probability Models, 9th Edition, page 284.)
The expected value and variance of $N_s \sim \text{Geom}(p)$ are $$ \text{E}[N_s]=\frac{1}{p} \quad\text{ and }\quad \text{Var}[N_s] = \frac{1-p}{p^2}. $$ A proof can be found in Ross, cited above. Note that $$\text{E}[N_s] = 1 + (1-p)\,\text{E}[N_s],$$ whose solution is $\frac{1}{p}$.
In your case $p = \frac{1}{6}$, and so $\text{E}(\text{No. rolls}) = 6$ and $\text{Var}(\text{No. rolls}) = 30$; the Geom$(\frac{1}{6})$ distribution has a long tail.
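If you want to double-check these values numerically, SciPy's geometric distribution (assuming SciPy is available; this is only an illustration) reports the same mean and variance:

```python
from scipy.stats import geom

p = 1 / 6
print(geom.mean(p))  # ~6.0  (= 1/p)
print(geom.var(p))   # ~30.0 (= (1-p)/p^2)
```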
Solution 3:
Elliott's answer is surely the nicest. To sum the series, we can use a trick similar to the one used for summing a geometric series.
Let $$S = 1 + 2 \cdot \left(\frac{5}{6}\right) + 3 \cdot \left(\frac{5}{6}\right)^2 + \cdots $$
Then
$$\frac{5}{6}S = \frac{5}{6} + 2 \cdot \left(\frac{5}{6}\right)^2 + 3 \cdot \left(\frac{5}{6}\right)^3 + \cdots $$
$$S - \frac{5}{6}S = 1+ \left(\frac{5}{6}\right) +\left(\frac{5}{6}\right)^2 + \cdots $$
This is just a geometric series, which sums to $6$; since the left-hand side is $\frac{1}{6}S$, we have that
$$S = 36$$
Now we have that $$\frac{1}{6} \sum_{n=0}^\infty \left(\frac{5}{6}\right)^n (n+1) = \frac{1}{6} \cdot S = 6,$$ as expected.
(Convergence of the series can be seen by the ratio test.)
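As a quick numerical sanity check (again, not part of the original argument), the partial sums can be computed directly:

```python
# Partial sums of S = sum_{n>=0} (n+1)*(5/6)^n, which should approach 36;
# (1/6)*S is then the expected number of rolls, which should approach 6.
S = sum((n + 1) * (5 / 6) ** n for n in range(500))
print(S)      # ~36.0
print(S / 6)  # ~6.0
```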