Moment bounds vs. Chernoff bounds

I have to prove that, when bounding tail probabilities of a nonnegative random variable, the moment method is always at least as good as the classical Chernoff method. In mathematical language, I have to prove that for every $t>0$, $$\inf_{n \in \mathbb{N}} \frac{\mathbf{E}[X^n]}{t^n} \leqslant \inf_{s>0} \frac{\mathbf{E}[e^{sX}]}{e^{st}}$$ This result seems to be classical and can apparently be found in a paper by Philips and Nelson (1995), available here for the modest sum of $14. So instead of buying it, I tried to work it out, but the proof I have seems rather short for a 5-page paper... Can somebody check my proof?
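As a quick numeric sanity check of the claimed inequality (my own addition, not from the paper), here is a small script for the concrete case $X \sim \mathrm{Exp}(1)$, where $\mathbf{E}[X^n] = n!$ and $\mathbf{E}[e^{sX}] = \frac{1}{1-s}$ for $0 < s < 1$:

```python
import math

# X ~ Exp(1): E[X^n] = n! and E[e^{sX}] = 1/(1-s) for 0 < s < 1
t = 10.0

# Moment bound: inf_n E[X^n] / t^n, searched over n = 0..99
moment_bound = min(math.factorial(n) / t**n for n in range(100))

# Chernoff bound: inf_{0<s<1} E[e^{sX}] / e^{st}, searched on a grid
chernoff_bound = min(math.exp(-s * t) / (1 - s)
                     for s in (k / 1000 for k in range(1, 1000)))

print(moment_bound, chernoff_bound)
# ~3.63e-04 vs ~1.23e-03: the moment bound is indeed the smaller one
```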

Let $s>0$. Since $X \geqslant 0$, every term of the power series is nonnegative, so the monotone convergence theorem lets us exchange expectation and sum: $\mathbf{E}[e^{sX}] = \mathbf{E}\left[\sum_{n \geqslant 0}\frac{s^n X^n }{n!} \right] = \sum_n \frac{s^n \mathbf{E}[X^n]}{n!} $. Therefore,

$$ \frac{\mathbf{E}[e^{sX}]}{e^{st}} = \frac{\sum_n \frac{s^n \mathbf{E}[X^n]}{n!}}{\sum_n \frac{s^n t^n}{n!}} = \lim_{n \to \infty} \frac{1 + s \mathbf{E}[X] + ... + \frac{s^n \mathbf{E}[X^n]}{n!}}{1 + st + ... + \frac{s^n t^n }{n!}}$$
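To see this series identity in a concrete case (again my own illustration with $X \sim \mathrm{Exp}(1)$, so $\mathbf{E}[X^n] = n!$), the truncated series approaches the MGF:

```python
import math

# Truncated series sum_{n<60} s^n E[X^n] / n! for X ~ Exp(1),
# which should approach the MGF E[e^{sX}] = 1/(1-s) for 0 < s < 1
s = 0.5
moments = [math.factorial(n) for n in range(60)]  # E[X^n] = n!
series = sum(s**n * m / math.factorial(n) for n, m in enumerate(moments))
print(series, 1 / (1 - s))  # both ≈ 2.0
```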

A classical inequality (often called the mediant inequality, and sometimes attributed to Cauchy) states that for all nonnegative numbers $a_1,...,a_n$ and positive numbers $b_1,...,b_n$ one has $$ \frac{a_1 + ... + a_n}{b_1 + ... + b_n} \geqslant \min_{k \leqslant n} \frac{a_k}{b_k}$$ (Indeed, writing $m$ for the minimum, $a_k \geqslant m b_k$ for every $k$, and summing over $k$ gives the claim.)
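Here is a quick randomized check of this inequality (illustration only, not part of the proof):

```python
import random

# Check (a_1+...+a_n)/(b_1+...+b_n) >= min_k a_k/b_k
# for nonnegative a_k and positive b_k
random.seed(0)
for _ in range(10_000):
    n = random.randint(1, 10)
    a = [random.random() for _ in range(n)]
    b = [random.random() + 0.1 for _ in range(n)]
    assert sum(a) / sum(b) >= min(x / y for x, y in zip(a, b)) - 1e-12
print("inequality held in all trials")
```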

Therefore, $$ \frac{1 + s \mathbf{E}[X] + ... + \frac{s^n \mathbf{E}[X^n]}{n!}}{1 + st + ... + \frac{s^n t^n }{n!}} \geqslant \min_{k \leqslant n} \frac{s^k \mathbf{E}[X^k] / k!}{s^k t^k / k!} = \min_{k \leqslant n}\frac{\mathbf{E}[X^k]}{t^k}$$

And finally, since each of these ratios is at least $\inf_{k \in \mathbb{N}} \frac{\mathbf{E}[X^k]}{t^k}$, taking the limit we have, for every $s>0$: $$ \frac{\mathbf{E}[e^{sX}]}{e^{st}} \geqslant \inf_{n \in \mathbb{N}}\frac{\mathbf{E}[X^n]}{t^n}$$ Taking the infimum over $s>0$ then yields the claimed inequality.
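For a concrete illustration of the gap (my own computation, not taken from the paper): for $X \sim \mathrm{Exp}(1)$ and $t > 1$, the Chernoff side is $$\inf_{0<s<1} \frac{e^{-st}}{1-s} = e \, t \, e^{-t}, \quad \text{attained at } s = 1 - \tfrac{1}{t},$$ while the moment bound with $n = \lfloor t \rfloor$ gives, by Stirling's formula, roughly $\sqrt{2\pi t}\, e^{-t}$, which is strictly smaller for large $t$ (at $t = 10$: about $3.6 \cdot 10^{-4}$ versus $1.2 \cdot 10^{-3}$, matching the numeric check above).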


As mentioned in the comments by Did, this method works and is the method of proof used by the authors.

An open link to the paper is here. The paper works through several common distributions (gamma, exponential, Poisson, etc.) as examples, so it is not devoted entirely to the proof.