Question about the connection between the Poisson and Gamma distributions

Assuming $X\sim\mathcal{P}(\lambda)$ and $Y\sim\Gamma(w,1)$, prove that $P(X\ge w)=P(Y\le \lambda)$. How does this fact follow from the connection between the Poisson and exponential distributions?

I don't know where to start. The Poisson distribution is defined only for discrete situations, while the exponential is only for continuous ones. How can I prove the fact?
EDIT: For the gamma distribution I wrote $f_Y(y)=\frac{y^{w-1}e^{-y}}{\Gamma(w)}$, but I have trouble integrating it. As for the Poisson: its CDF is $\displaystyle \sum _{w_i=0}^w P(X=w_i)$, which I don't know how to reduce to a closed form. How can I continue?
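
As a quick numerical sanity check before proving anything, here is a hedged sketch using `scipy.stats`; the particular values of $\lambda$ and $w$ are arbitrary choices for illustration, not part of the problem:

```python
# Hedged sketch: numerically compare P(X >= w) for X ~ Poisson(lambda)
# with P(Y <= lambda) for Y ~ Gamma(w, 1). lam and w are arbitrary.
from scipy.stats import poisson, gamma

lam, w = 2.7, 4  # arbitrary example parameters

p_poisson = poisson.sf(w - 1, lam)   # P(X >= w) = 1 - P(X <= w - 1)
p_gamma = gamma.cdf(lam, a=w)        # P(Y <= lambda), shape w, scale 1

print(p_poisson, p_gamma)            # both should print the same number
```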


Solution 1:

Let $w$ be an integer. Then
$$ P(X\geqslant w)=1-P(X\leqslant w-1)=1-e^{-\lambda}\sum_{k=0}^{w-1}\frac{\lambda^k}{k!}. $$
Now we use that if $\Gamma(a,b)$ denotes the (upper) incomplete Gamma function, i.e.
$$ \Gamma(a,b)=\int_b^\infty t^{a-1}e^{-t}\,\mathrm dt, $$
then
$$ \Gamma(n,b)=(n-1)!\,e^{-b}\sum_{k=0}^{n-1}\frac{b^k}{k!} $$
provided that $n$ is an integer. Thus (recall that $\Gamma(n)=(n-1)!$ for integer $n$)
$$ P(X\geqslant w)=1-\frac{\Gamma(w,\lambda)}{\Gamma(w)}. $$
Rewriting this expression, we arrive at the desired result:
$$ \begin{align} P(X\geqslant w)&=\frac{\Gamma(w)-\Gamma(w,\lambda)}{\Gamma(w)}=\frac{1}{\Gamma(w)}\left(\int_0^\infty t^{w-1}e^{-t}\,\mathrm dt-\int_\lambda^\infty t^{w-1}e^{-t}\,\mathrm dt\right)\\ &=\frac{1}{\Gamma(w)}\int_0^\lambda t^{w-1}e^{-t}\,\mathrm dt=P(Y\leqslant\lambda). \end{align} $$
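
If you want to double-check the incomplete-Gamma identity used above, here is a hedged numerical sketch with `scipy.special` (the values of $n$ and $b$ are arbitrary); `gammaincc` is the regularized upper incomplete Gamma function, so it is multiplied by $\Gamma(n)=(n-1)!$ to recover $\Gamma(n,b)$:

```python
# Hedged check of Gamma(n, b) = (n-1)! * exp(-b) * sum_{k=0}^{n-1} b^k / k!
# for integer n. The values n and b below are arbitrary.
from math import exp, factorial

from scipy.special import gammaincc

n, b = 5, 1.3
upper_incomplete = gammaincc(n, b) * factorial(n - 1)   # Gamma(n, b), un-regularized
partial_sum = factorial(n - 1) * exp(-b) * sum(b**k / factorial(k) for k in range(n))

print(upper_incomplete, partial_sum)   # should agree up to floating-point error
```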

Solution 2:

Assuming $X\sim\mathcal{P}(\lambda)$ and $Y\sim\Gamma(w,1)$ prove that $P(X\ge w)=P(Y\le \lambda)$.

Suppose the number of occurrences during a time interval of length $\lambda$ is a Poisson-distributed random variable with expected value $\lambda$, and the numbers of occurrences in disjoint time intervals are independent.

Then the number of occurrences before time $\lambda$ is zero if and only if the waiting time until the first occurrence is more than $\lambda$.

The probability that a $\operatorname{Poisson}(\lambda)$-distributed random variable is $0$ is $e^{-\lambda}$.

Therefore the probability that the waiting time until the first occurrence is more than $\lambda$ is $e^{-\lambda}$.

In other words, the waiting time has an exponential distribution with expected value $1$.

If you know that the sum of independent exponentially distributed random variables is Gamma-distributed, then you've got it.

Punch line: The number of occurrences before time $\lambda$ is at least $w$ if and only if the waiting time until the $w$th occurrence is less than $\lambda$. (And since it's a continuous distribution, that's the same as the probability that it's less than or equal to $\lambda$.)
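
To make the punch line concrete, here is a hedged simulation sketch (the parameters `lam`, `w`, `n_trials`, and `n_gaps` are arbitrary choices): it draws unit-rate exponential inter-arrival times and checks that "at least $w$ arrivals by time $\lambda$" occurs exactly when "the $w$th arrival time is at most $\lambda$", so both events have the same empirical frequency.

```python
# Hedged simulation sketch: the number of arrivals in [0, lam] is >= w
# exactly when the w-th arrival time (a Gamma(w, 1) variable) is <= lam.
# lam, w, n_trials, and n_gaps are arbitrary illustration values.
import numpy as np

rng = np.random.default_rng(0)
lam, w, n_trials = 2.7, 4, 100_000
n_gaps = 50  # enough inter-arrival times that time lam is almost surely exceeded

# Unit-rate exponential inter-arrival times; cumulative sums are arrival times.
arrivals = np.cumsum(rng.exponential(1.0, size=(n_trials, n_gaps)), axis=1)

at_least_w_by_lam = (arrivals <= lam).sum(axis=1) >= w   # "X >= w" for the count in [0, lam]
wth_arrival_by_lam = arrivals[:, w - 1] <= lam           # "Y <= lam" for the Gamma(w, 1) waiting time

print(np.array_equal(at_least_w_by_lam, wth_arrival_by_lam))  # the two events coincide
print(at_least_w_by_lam.mean())                               # ~ P(X >= w) = P(Y <= lam)
```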

Solution 3:

Oh right... this problem justifies the intuition that connects Poisson with Exponential.

If you want the raw way of doing it, which at this early stage might be the only option anyway, look at the pdf of the Gamma distribution and compute directly the probability of being $\leq \lambda$. Repeated integration by parts reveals the Poisson distribution formula that you want.

I should also say that the interpretation of this formula is the beautiful part. Imagine you are a store owner and customers arrive at your store, with the time you wait between arrivals exponentially distributed with rate $1$. Then you can ask: what is the distribution of the number of customers that arrive within a time interval of length $\lambda$?


A sample calculation for $w=2$:

\begin{align} P(Y_2 \leq \lambda) &= \frac{1}{1!} \int_0^\lambda xe^{-x} \, dx\\ &= \left[-xe^{-x}\right]_0^\lambda + \int_0^\lambda e^{-x} \, dx \\ &= 1 - \lambda e^{-\lambda} - e^{-\lambda} \\ &= P(X \geq 2) \end{align}

I suppose at some point with this approach you'd have to use induction.
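
As a hedged sanity check of the $w=2$ computation (a sketch, not part of the original answer), `sympy` can do the integral symbolically and compare it with $P(X\ge 2)$:

```python
# Hedged symbolic check of the w = 2 case: integrate the Gamma(2, 1) density
# over [0, lam] and compare with 1 - e^{-lam} - lam*e^{-lam} = P(X >= 2).
import sympy as sp

x, lam = sp.symbols("x lam", positive=True)

lhs = sp.integrate(x * sp.exp(-x), (x, 0, lam))   # P(Y_2 <= lam)
rhs = 1 - sp.exp(-lam) - lam * sp.exp(-lam)       # P(X >= 2) for X ~ Poisson(lam)

print(sp.simplify(lhs - rhs))                     # 0 confirms the identity
```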

Solution 4:

I don't find the currently accepted answer particularly satisfying, since it seems almost circular: it assumes a formula that is essentially the answer.

Just do a single integration by parts:

$\frac{1}{(w-1)!} \int_0^\lambda x^{w-1} e^{-x}\, dx = -\frac{\lambda^{w-1}e^{-\lambda}}{(w-1)!} + \frac{1}{(w-2)!} \int_0^\lambda x^{w-2} e^{-x}\, dx$

This gives

$P(Y_w \le \lambda) = P(Y_{w-1} \le \lambda) - P(X = w-1)$

Use induction to finish.
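
Here is a hedged numerical sketch of that recursion with `scipy.stats` (the values of $\lambda$ and $w$ are arbitrary):

```python
# Hedged check of P(Y_w <= lam) = P(Y_{w-1} <= lam) - P(X = w-1),
# where Y_w ~ Gamma(w, 1) and X ~ Poisson(lam). lam and w are arbitrary.
from scipy.stats import gamma, poisson

lam, w = 2.7, 4

lhs = gamma.cdf(lam, a=w)
rhs = gamma.cdf(lam, a=w - 1) - poisson.pmf(w - 1, lam)

print(lhs, rhs)   # should agree up to floating-point error
```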