Connection between the Gamma function and gamma distribution

The Gamma function is a generalization of the factorial ($\Gamma(n)=(n-1)!$). It is also the normalizing constant of the Gamma distribution, which (when its rate parameter is $1$ and its shape parameter is a positive integer $n$) happens to be the distribution of the time of the $n$-th event in a standard Poisson process (with rate $1$). Is there any connection between these two facts? Perhaps relating to the fact that if we switched the order of the events in the Poisson process, the time until the $n$-th event wouldn't change?


Solution 1:

Sure. Call $T_n$ the time of the $n^{th}$ jump of a rate $1$ Poisson process, which we will call $N(t)$. Since the probability that the process jumps two or more times in an interval of length $\delta$ is $O(\delta^2)$ (by independence of the exponentials, or the Markov property), we have

$$\mathbb{P}(T_{n+1} \in (t,t+\delta)) = \mathbb{P}(N(t) = n, T_{n+1} \in (t,t+\delta)) + O(\delta^2).$$

By the memoryless property of the exponential (or the Markov property of the Poisson process),

$$\mathbb{P}(N(t) = n, T_{n+1} \in (t,t+\delta)) = \frac{t^n}{n!}e^{-t}(1-e^{-\delta}).$$
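As a quick numerical sanity check (a throwaway Python sketch, not part of the argument; the parameter values $n=2$, $t=1$, $\delta=0.5$ are arbitrary choices), one can estimate the left-hand side by simulating the exponential inter-arrival gaps directly:

```python
import math
import random

random.seed(0)

# Arbitrary illustrative parameters; any small delta works.
n, t, delta, trials = 2, 1.0, 0.5, 200_000

def joint_event(n, t, delta):
    """One sample path: is N(t) = n and T_{n+1} in (t, t + delta]?"""
    s = 0.0
    arrivals = []
    for _ in range(n + 1):            # only the first n+1 jump times matter
        s += random.expovariate(1.0)  # i.i.d. Exp(1) inter-arrival gaps
        arrivals.append(s)
    t_n, t_next = arrivals[n - 1], arrivals[n]  # T_n and T_{n+1}
    # N(t) = n exactly when T_n <= t < T_{n+1}
    return t_n <= t < t_next <= t + delta

estimate = sum(joint_event(n, t, delta) for _ in range(trials)) / trials
theory = t**n / math.factorial(n) * math.exp(-t) * (1 - math.exp(-delta))
print(estimate, theory)
```

The empirical frequency should land close to $\frac{t^n}{n!}e^{-t}(1-e^{-\delta}) \approx 0.0724$ for these values.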

Divide by $\delta$ and send $\delta \to 0$ to obtain that the density of $T_{n+1}$ is $\frac{t^n e^{-t}}{n!}1_{\{t>0\}}$. It then follows from the fact that this is a density function that $\int_0^\infty t^n e^{-t} dt = n!.$
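For what it's worth, the resulting identity $\int_0^\infty t^n e^{-t}\,dt = n!$ can be confirmed numerically (a minimal sketch using simple trapezoidal quadrature; the truncation point $50$ and the step count are arbitrary choices, adequate for small $n$):

```python
import math

def gamma_integral(n, upper=50.0, steps=200_000):
    """Trapezoidal approximation of the integral of t**n * e**(-t)
    over [0, upper]; the tail beyond upper = 50 is negligible for small n."""
    h = upper / steps
    total = 0.0
    for i in range(steps + 1):
        t = i * h
        weight = 0.5 if i in (0, steps) else 1.0
        total += weight * t**n * math.exp(-t)
    return total * h

for n in range(6):
    print(n, gamma_integral(n), math.factorial(n))
```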

Your question then boils down to why the mass function of a rate $1$ Poisson process is $\mathbb{P}(N(t)=n) = \frac{t^n}{n!}e^{-t}$.

One way to derive this is to subdivide $[0,t]$ into $M$ intervals of length $\frac{t}{M}$. Assign to each interval a Bernoulli random variable which equals $1$ if there is a jump of the Poisson process in that interval, and sum these Bernoullis. By independence of the exponentials, the Bernoullis are independent. For each fixed sample path there is a finite $M$ after which the sum has stabilized to equal $N(t)$ (namely, once no interval contains two jumps), so we have a.s. convergence, which implies convergence in distribution. Because an interval of length $\frac{t}{M}$ is very unlikely to contain more than one jump, the success probability of each Bernoulli is $\frac{t}{M} + O(\frac{1}{M^2})$, so the sum is approximately distributed as $\text{Binomial}(M, \frac{t}{M})$. An elementary computation, which you may know as the law of rare events, then shows that the pmf of $N(t)$ is as described. In this interpretation, the factorial can be seen to come from the permutation invariance of the locations of the jumps.
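The law-of-rare-events step can also be checked directly (another throwaway Python sketch; the values $t=1$, $n=3$ are arbitrary): as $M$ grows, the $\text{Binomial}(M, \frac{t}{M})$ pmf approaches the claimed Poisson pmf.

```python
import math

def binom_pmf(M, p, n):
    """P(Binomial(M, p) = n)."""
    return math.comb(M, n) * p**n * (1 - p)**(M - n)

def poisson_pmf(t, n):
    """Claimed mass function of N(t) for a rate-1 Poisson process."""
    return t**n / math.factorial(n) * math.exp(-t)

t, n = 1.0, 3  # arbitrary illustrative values
for M in (10, 100, 1_000, 10_000):
    print(M, binom_pmf(M, t / M, n), poisson_pmf(t, n))
```

The gap between the two pmfs shrinks like $O(\frac{1}{M})$, matching the error from the $\frac{t}{M} + O(\frac{1}{M^2})$ success probability.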