Convergence of Average of Partial Sums
Consider a numerical series $\sum_{n=1}^\infty a_n$. Denote by $s_n = \sum_{k=1}^n a_k$ the partial sums, and let the average of the first $n$ partial sums be $\sigma_n = \frac{s_1 + s_2 + \cdots + s_n}{n}.$ If $\{\sigma_n\}$ converges to $\sigma_\infty$, and $a_k = o(\frac{1}{k})$ as $k\to\infty$, prove that $\{s_n\}$ also converges to $\sigma_\infty$. (Hint: express the difference $s_n - \sigma_n$ using $\{ka_k: k=1,\cdots, n\}$.)

Could somebody provide an example proof and also explain what little-$o$ means in the context of $k\to\infty$? I've seen it before when looking at, say, $x\to x_0$, but not in the context of the variable going to infinity. Also, can the same result $\{s_n\}\to\sigma_\infty$ still hold if we only have $a_k=O(\frac{1}{k})$?
First of all, $a_k = o(\frac{1}{k})$ as $k \to +\infty$ means that $\lim_{k\to+\infty}\frac{a_k}{1/k} = 0$; it is the same definition you have seen for $x \to x_0$, just with the limit taken at infinity. Equivalently,
$$\lim_{j\to+\infty} ja_j = 0.$$
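If it helps to see the definition in action, here is a quick numerical sketch (the particular sequence $a_k = \frac{1}{k\ln(k+1)}$ is just my own example, not part of the problem): it is $o(\frac1k)$ because $k a_k = \frac{1}{\ln(k+1)} \to 0$.

```python
import math

def a(k):
    # example sequence a_k = 1 / (k * ln(k+1)), chosen for illustration
    return 1.0 / (k * math.log(k + 1))

for k in [10, 100, 1000, 10_000, 100_000]:
    print(k, k * a(k))   # k * a_k = 1/ln(k+1) shrinks toward 0 as k grows
```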
The following proposition will be useful; I think it is called the "Dirichlet inversion formula" (it just amounts to swapping the order of summation over the triangle $1 \le j \le i \le n$):
$$\sum_{i=1}^n\sum_{j=1}^i a_{i,j} = \sum_{j=1}^n\sum_{i=j}^n a_{i,j},$$
which, in the case where $a_{i,j}$ depends only on $j$, becomes
$$\sum_{i=1}^n\sum_{j=1}^i a_{j} = \sum_{j=1}^n\sum_{i=j}^n a_{j}.$$
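If you want to convince yourself of this identity before using it, here is a throwaway numerical check (the random coefficients are purely my own illustration; the identity holds for any numbers):

```python
import random

n = 50
a = [random.uniform(-1, 1) for _ in range(n + 1)]  # a[1..n] used; a[0] is ignored

# left-hand side: sum over i of the partial sums up to i
lhs = sum(sum(a[j] for j in range(1, i + 1)) for i in range(1, n + 1))
# right-hand side after swapping the order of summation
rhs = sum(sum(a[j] for i in range(j, n + 1)) for j in range(1, n + 1))
# the inner sum on the right is just (n - j + 1) * a_j
rhs2 = sum((n - j + 1) * a[j] for j in range(1, n + 1))

print(abs(lhs - rhs), abs(lhs - rhs2))  # both differences are ~0 (floating point)
```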
Now we have all we need for the actual proof:
$$\begin{aligned}
\sigma_n &= \frac{\sum_{i=1}^n s_i}{n} = \frac{\sum_{i=1}^n \sum_{j=1}^i a_j}{n} = \frac{\sum_{j=1}^n \sum_{i=j}^{n} a_j}{n} = \frac{\sum_{j=1}^n (n-j+1)a_j}{n} \\
&= \frac{(n+1)\sum_{j=1}^n a_j - \sum_{j=1}^n ja_j}{n} = \frac{n+1}{n}\sum_{j=1}^n a_j - \frac{\sum_{j=1}^n ja_j}{n} = \frac{n+1}{n}s_n - \frac{\sum_{j=1}^n ja_j}{n}.
\end{aligned}$$
From the chain of equalities above we get $$\sigma_n = \frac{n+1}{n}s_n - \frac{\sum_{j=1}^n ja_j}{n}.$$
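Here is a quick sanity check of that identity with arbitrary coefficients (again just an illustration, not part of the proof):

```python
import random
from itertools import accumulate

n = 200
a = [random.uniform(-1, 1) for _ in range(n)]   # a_1, ..., a_n (0-indexed list)
s = list(accumulate(a))                          # partial sums s_1, ..., s_n
sigma_n = sum(s) / n                             # average of the first n partial sums
rhs = (n + 1) / n * s[-1] - sum((j + 1) * a[j] for j in range(n)) / n

print(abs(sigma_n - rhs))                        # ~0 up to floating-point error
```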
Subtracting $s_n$ from both sides, taking absolute values, and using the triangle inequality, we get $$|s_n - \sigma_n| \leq \frac{|s_n|}{n} + \left|\frac{\sum_{j=1}^n ja_j}{n}\right|.$$
Using the triangle inequality once again, in the form $|s_n - \sigma_\infty| \le |s_n - \sigma_n| + |\sigma_n - \sigma_\infty|$, we find that $$|s_n - \sigma_{\infty}| \leq \frac{|s_n|}{n} + \left|\frac{\sum_{j=1}^n ja_j}{n}\right| + |\sigma_n - \sigma_{\infty}|.$$
If we prove that every term on the right-hand side tends to $0$ as $n \to +\infty$, then we are done (you can use the squeeze theorem or the $\epsilon$-$N$ definition of the limit of a sequence to finish).
To do this, the so-called Cesàro mean theorem is useful: if a sequence converges, then the sequence of its arithmetic means converges to the same limit. Applying the theorem to the null sequences $\{ja_j\}$ and $\{a_k\}$ (the note below explains why $a_k \to 0$), we find that
$$\lim_{n\to+\infty}\frac{\sum_{j=1}^n ja_j}{n} = 0, \qquad \lim_{n\to+\infty}\frac{s_n}{n} = \lim_{n\to+\infty}\frac{\sum_{k=1}^n a_k}{n} = 0.$$
(Note that $a_n = (na_n)\cdot\frac{1}{n}$; since $na_n \to 0$ and $\frac{1}{n} \to 0$, it follows that $\lim_{n\to+\infty} a_n = 0$, so the Cesàro theorem does apply to $\{a_k\}$.)
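To see the Cesàro step concretely, here is a small sketch using the same example sequence $a_k = \frac{1}{k\ln(k+1)}$ as before (again my own choice): the averages of $b_j = ja_j$ drift toward $0$, only slowly.

```python
import math

def b(j):
    # b_j = j * a_j with a_j = 1/(j*ln(j+1)), so b_j = 1/ln(j+1) -> 0
    return 1.0 / math.log(j + 1)

for n in [10, 100, 1000, 10_000, 100_000]:
    avg = sum(b(j) for j in range(1, n + 1)) / n
    print(n, avg)       # the Cesàro averages of b_j also tend to 0
```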
So, by what we just proved and because $\sigma_n \to \sigma_{\infty}$ by assumption, all the terms on the right-hand side of the inequality tend to $0$ (the absolute values don't create any problem, because if something tends to $0$ then its absolute value also tends to $0$), and therefore $s_n \to \sigma_\infty$ and the proof is done.
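Finally, if you like to see the whole statement numerically, here is a sketch with $a_k = \frac{1}{k^2}$ (my own example: it satisfies $a_k = o(\frac1k)$ and $\sum a_k = \frac{\pi^2}{6}$); both $s_n$ and $\sigma_n$ approach $\frac{\pi^2}{6}$.

```python
import math
from itertools import accumulate

N = 100_000
a = [1.0 / k**2 for k in range(1, N + 1)]
s = list(accumulate(a))            # partial sums s_1, ..., s_N
t = list(accumulate(s))            # running sums of the s_n; sigma_n = t[n-1] / n

limit = math.pi**2 / 6
for n in [10, 100, 1000, 10_000, 100_000]:
    print(n, s[n - 1] - limit, t[n - 1] / n - limit)   # both differences shrink to 0
```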