The question is the following:

$X_n$ are independent Poisson random variables with expectations $\lambda_n$, such that $\sum_n \lambda_n = \infty$. Setting $S_n = \sum_{i=1}^n X_i$, I have to show that $\frac{S_n}{ES_n} \to 1$ almost surely.

I started out on this problem by trying to apply the Borel–Cantelli lemma.

So fix an $\epsilon > 0$, and start with the quantity $\Pr \left[\frac{S_n}{ES_n} > 1 + \epsilon \ \textrm{i.o.}\right]$, where i.o. means infinitely often, which I want to show is $0$ (beyond this point, I will drop the i.o.). By the Borel–Cantelli lemma, it suffices to show that $\sum_{n=1}^\infty \Pr[X_n > (1+\epsilon) \lambda_n]$ converges. The inner term can be bounded by a quantity of the form $e^{-c(\epsilon) \lambda_n}$ (a Chernoff-type estimate), resulting in the sum $\sum_{n=1}^\infty e^{-c(\epsilon)\lambda_n}$. Now I am stuck, as I do not know how to show that this converges.


Lemma 1: Let $(Y_n)_n$ be independent random variables with $E(Y_n)=0$ and $E(Y_n^2)<\infty$ for all $n$. If there exists a sequence $(a_n)$ of positive real numbers increasing to $\infty$ such that $\sum_{n\geq 1}\frac{E(Y_n^2)}{a_n^2}$ converges, then $\frac{1}{a_n}\sum_{k=1}^n Y_k \to 0$ almost surely.

Proof: this is Theorem 2 (credited to Kolmogorov) in Chapter 4, Section 3 of Shiryaev's Probability. The proof relies on Kronecker's lemma.
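For reference, the deterministic ingredient is Kronecker's lemma, which can be stated as follows:

```latex
% Kronecker's lemma: for real numbers x_n and positive a_n increasing
% to infinity,
\[
  \sum_{n\geq 1}\frac{x_n}{a_n}\ \text{converges}
  \quad\Longrightarrow\quad
  \frac{1}{a_n}\sum_{k=1}^{n}x_k \;\longrightarrow\; 0 .
\]
% Kolmogorov's theorem gives almost-sure convergence of the random
% series \sum_n Y_n/a_n whenever \sum_n E(Y_n^2)/a_n^2 < \infty;
% applying Kronecker's lemma pathwise then yields Lemma 1.
```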


Define $Y_n=X_n-E(X_n)$. Note that $E(Y_n)=0$ and $E(Y_n^2)=V(X_n)=\lambda_n<\infty$. Let $a_n=\sum_{k=1}^n\lambda_k$, so that $a_n$ is positive and increases to $\infty$.

It suffices to prove the convergence of $\displaystyle \sum_{n\geq 1}\frac{\lambda_n}{\left(\sum_{k=1}^n\lambda_k\right)^2}$.

Lemma 2: Let $u_n$ be a sequence of positive numbers such that $\sum_{n=1}^\infty u_n = \infty$. Then $\displaystyle \sum_{n\geq 1}\frac{u_n}{\left(\sum_{k=1}^nu_k\right)^2}$ converges.

Proof: Let $S_n=\sum_{k=1}^n u_k$. Since $S_n$ is increasing, $\frac{u_n}{S_n^2}=\frac{S_n-S_{n-1}}{S_n^2}\leq \int_{S_{n-1}}^{S_n}\frac{dt}{t^2}$, hence $\sum_{k=2}^n \frac{u_k}{S_k^2}\leq \int_{S_1}^{S_n}\frac{dt}{t^2} \leq \frac{1}{S_1} <\infty$.
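Lemma 2 is easy to sanity-check numerically; here is a minimal sketch using the divergent series $u_n = 1/n$ (the choice of series and the cutoff $10^6$ are arbitrary):

```python
# Sanity check of Lemma 2 with u_n = 1/n, whose partial sums diverge.
# The integral comparison bounds sum_{k>=2} u_k / S_k^2 by 1/S_1, so
# with u_1 = 1 the full sum should stay below u_1/S_1^2 + 1/S_1 = 2.

N = 10**6
S = 0.0        # partial sum S_n = u_1 + ... + u_n
total = 0.0    # partial sum of u_k / S_k^2
for n in range(1, N + 1):
    u = 1.0 / n
    S += u
    total += u / S**2

print(S)       # grows like log(N): the series u_n diverges
print(total)   # stays bounded, consistent with Lemma 2
```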


Lemma 2 (applied with $u_n=\lambda_n$) and Lemma 1 yield $\frac{1}{a_n}\sum_{k=1}^n Y_k \to 0$ almost surely, which can be rewritten as $\frac{S_n-E(S_n)}{E(S_n)} \to 0$, hence $\frac{S_n}{E(S_n)} \to 1$ almost surely.
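To see the conclusion in action, here is a small simulation sketch; the choice $\lambda_n = 1/\sqrt{n}$ is an arbitrary example whose sum diverges (slowly diverging sums give correspondingly slow convergence):

```python
import numpy as np

# Simulate one trajectory of S_n / E[S_n] for independent
# X_n ~ Poisson(lambda_n), with lambda_n = 1/sqrt(n).
rng = np.random.default_rng(0)

n = 10**6
lam = 1.0 / np.sqrt(np.arange(1, n + 1))
X = rng.poisson(lam)                   # one sample of each X_k
ratio = np.cumsum(X) / np.cumsum(lam)  # S_k / E[S_k] for k = 1..n

print(ratio[-1])                       # drifts toward 1 as n grows
```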


You can prove this using the strong law of large numbers.

Just for convenience, we formulate the problem in terms of Poisson processes.

Indeed, let $S_1,S_2,...$ be i.i.d. exponential random variables of rate $1$. Let $N(t)$ be the associated Poisson process, i.e., $N(t)=\inf \{n \in \Bbb Z_{\geq 0}: S_1+...+S_n \geq t\}$. On the event $\{N(t) \neq 0\}$, we have $$\frac{1}{N(t)}\sum_{j=1}^{N(t)-1}S_j\leq \frac{t}{N(t)} \leq \frac{1}{N(t)}\sum_{j=1}^{N(t)}S_j$$

Using standard results, we know that $\frac{1}{n}(S_1+...+S_n) \stackrel{n \to \infty}{\longrightarrow} 1$ almost surely, and that $N(t) \stackrel{t \to \infty}{\longrightarrow} \infty$ almost surely. So we can conclude that $\frac{1}{N(t)}\sum_{j=1}^{N(t)}S_j \stackrel{t \to \infty}{\longrightarrow} 1$ almost surely. Similarly, we have that $\frac{1}{N(t)}\sum_{j=1}^{N(t)-1}S_j = \frac{N(t)-1}{N(t)}\bigg(\frac{1}{N(t)-1}\sum_{j=1}^{N(t)-1}S_j\bigg) \stackrel{t \to \infty}{\longrightarrow} 1$ almost surely. By the squeeze theorem, we conclude that $t/N(t) \to 1$ almost surely, and therefore $N(t)/t \to 1$ almost surely as well.

Since $\sum_n \lambda_n=+\infty$, we conclude that $$\frac{N(\lambda_1+...+\lambda_n)}{\lambda_1+...+\lambda_n}\stackrel{n \to \infty}{\longrightarrow} 1,\;\;\;\;\;\;\; a.s.$$ which is the same as your result: writing $t_n=\lambda_1+...+\lambda_n$, the process $N(t)$ has independent increments with $N(t)-N(s) \sim$ Poisson$(t-s)$, so the increments $N(t_n)-N(t_{n-1})$ are independent Poisson$(\lambda_n)$ variables, and $(N(t_n))_n$ has the same almost-sure limiting behaviour as your $(S_n)_n$.
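The renewal limit $N(t)/t \to 1$ underlying this argument can also be checked by simulation. A sketch, using the usual counting convention $N(t)=\#\{n\geq 1: S_1+\cdots+S_n\leq t\}$ (which differs from the definition above by at most $1$, so the limit is unaffected); the horizon $t=10^5$ is arbitrary:

```python
import numpy as np

# Build a rate-1 Poisson process from i.i.d. Exp(1) interarrival times
# and check the renewal limit N(t)/t -> 1.
rng = np.random.default_rng(1)

t = 1e5
gaps = rng.exponential(1.0, size=2 * 10**5)            # ample arrivals
arrivals = np.cumsum(gaps)                             # arrival times T_n
N_t = int(np.searchsorted(arrivals, t, side="right"))  # arrivals <= t

print(N_t / t)   # close to 1 for large t
```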