Average number of tries it takes for something to happen, given a chance

Given a chance between 0% and 100% of getting something to happen, how would you determine the average number of tries it will take for that something to happen?

I was thinking that $\int_0^\infty \! (1-p)^x \, \mathrm{d} x$ where $p$ is the chance would give the answer, but doesn't that also put in non-integer tries?


You are dealing with the geometric distribution. Let the probability of "success" on any one trial be $p\ne 0$. Suppose you repeat the experiment independently until the first success. The mean waiting time until the first success is $\frac{1}{p}$.

For a proof, let $X$ be the waiting time until the first success, and let $e=E(X)$. There are two possibilities. Either you get a success right away (probability $p$), in which case your waiting time is $1$, or you get a failure on the first trial (probability $1-p$), in which case your expected waiting time is $1+e$, since the first trial has been "wasted." Thus $$e=(1)(p)+(1+e)(1-p).$$ Solving for $e$, we get $e=\frac{1}{p}$.
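If you want an empirical sanity check as well, here is a minimal Python sketch (the particular value of $p$ and the sample size are arbitrary choices of mine) that simulates the waiting time repeatedly and compares the sample mean with $1/p$:

```python
import random

def waiting_time(p):
    """Number of trials up to and including the first success,
    where each trial succeeds independently with probability p > 0."""
    trials = 1
    while random.random() >= p:   # random.random() < p counts as a success
        trials += 1
    return trials

p = 0.2            # arbitrary per-trial chance of success
n = 100_000        # arbitrary number of simulated experiments
sample_mean = sum(waiting_time(p) for _ in range(n)) / n
print(sample_mean, 1 / p)   # the sample mean should be close to 1/p = 5
```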

Alternately, we can set up an infinite series expression for $e$, and sum the series. The probability that $X=1$ is $p$. The probability that $X=2$ is $(1-p)p$ (failure then success). The probability that $X=3$ is $(1-p)^2p$ (two failures, then success). And so on. So $$E(X)=p+2(1-p)p+3(1-p)^2p +4(1-p)^3p +\cdots.$$
The process of finding a simple expression for the above infinite series is sketched below.
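One standard approach (a sketch, using nothing beyond the geometric series) is to multiply by $1-p$ and subtract. Writing $S=E(X)$, we have $$(1-p)S=(1-p)p+2(1-p)^2p+3(1-p)^3p+\cdots,$$ and subtracting this from the series for $S$ term by term leaves $$S-(1-p)S=p+(1-p)p+(1-p)^2p+\cdots=\frac{p}{1-(1-p)}=1,$$ so $pS=1$ and therefore $S=\frac{1}{p}$.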

Remark: For smallish $p$, your integral turns out to be a reasonably good approximation to the mean. However, as you are aware, we have here a discrete distribution, and the right process involves summation, not integration.
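To quantify that remark, the integral can be evaluated in closed form: $$\int_0^\infty (1-p)^x \, \mathrm{d}x=\int_0^\infty e^{x\ln(1-p)} \, \mathrm{d}x=\frac{1}{-\ln(1-p)}.$$ Since $-\ln(1-p)=p+\frac{p^2}{2}+\frac{p^3}{3}+\cdots$, this is slightly smaller than $\frac{1}{p}$ but close to it when $p$ is small; for example, $p=0.01$ gives about $99.50$, versus the true mean of $100$.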

Another tool that we can use is the fact that $$E(X)=\Pr(X \ge 1)+\Pr(X\ge2)+\Pr(X \ge 3)+\cdots.$$ That is the discrete version of your integral. It gives us the geometric progression $1+(1-p)+(1-p)^2+(1-p)^3+\cdots$, which has sum $\frac{1}{p}$ if $p\ne 0$.
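For completeness, both ingredients are easy to check: $E(X)=\Pr(X\ge 1)+\Pr(X\ge 2)+\cdots$ holds because $X=\sum_{k\ge 1}\mathbf{1}[X\ge k]$ (each trial up to and including the first success contributes $1$ to the count), and $\Pr(X\ge k)=(1-p)^{k-1}$ because the event $X\ge k$ is exactly the event that the first $k-1$ trials were all failures.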


Let $X$ be the number of trials required for the first success (i.e., for something to happen); then $P(X=n)=(1-p)^{n-1}p$. The average number of trials required for the first success is the expectation of $X$: $$E(X)= \sum_{n=1}^{\infty}n(1-p)^{n-1}p=p\sum_{n=1}^{\infty}n(1-p)^{n-1}=p\cdot\frac{1}{p^2}=\frac{1}{p}.$$
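For reference, the sum $\sum_{n=1}^{\infty}n x^{n-1}=\frac{1}{(1-x)^2}$ used above (with $x=1-p$) comes from differentiating the geometric series $\sum_{n=0}^{\infty}x^{n}=\frac{1}{1-x}$ term by term, which is valid for $|x|<1$.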


Here's an easy way to see this, on the assumption that the average actually exists (otherwise the defining sum might diverge, for instance). Let $m$ be the average number of trials needed for the event to occur.

There is a $p$ chance that it occurs on the first try. On the other hand, there is a $1-p$ chance that it doesn't happen, in which case we have just spent one try, and on average it will take $m$ further tries.

Therefore $m = (p)(1) + (1-p)(1+m) = 1 + (1-p)m$, which is easily rearranged to obtain $m = 1/p$.
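As a concrete instance (my example, not from the question): waiting for a particular face of a fair die has $p=\frac{1}{6}$, so $m=1+\frac{5}{6}m$ gives $m=6$ tries on average.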