On Cesàro convergence: If $ x_n \to x $ then $ z_n = \frac{x_1 + \dots +x_n}{n} \to x $

I have this problem I'm working on. Hints are much appreciated (I don't want complete proof):

In a normed vector space, if $ x_n \longrightarrow x $ then $ z_n = \frac{x_1 + \dots +x_n}{n} \longrightarrow x $

I've been trying to add and subtract inside the norm... but I don't seem to get anywhere.

Thanks!


Solution 1:

Given $\epsilon > 0$ there exists $n_0$ such that if $n \geq n_0$ then $\| x_n - x \| < \epsilon$,

so, for every $n \ge n_0$,

\begin{align*} 0 & \leq \left\lVert \frac{x_1 +\cdots +x_n}{n} -x \right\rVert = \left\lVert \frac{x_1 + \dots + x_n - nx }{n} \right\rVert \\ & \leq \frac{\lVert x_1 - x \rVert}{n} + \dots + \frac{\lVert x_{n_0 - 1} - x \rVert}{n} + \frac{\lVert x_{n_0} - x \rVert}{n} +\dots + \frac{\lVert x_{n} - x \rVert}{n} \\ &< \frac 1n\sum_{i=1}^{n_0-1} \| x_i -x\| + \frac{n-n_0+1}{n} \epsilon \end{align*}

The first $n_0 -1$ terms $\| x_i -x\|$ can each be bounded by some $M$, thus for $n\ge N_0 := \max\{n_0,\,(n_0-1)M/\epsilon\}$ we have $$\frac 1n\sum_{i=1}^{n_0-1} \| x_i -x\| \le \frac 1n (n_0-1)M \le \epsilon$$

Thus $$\left\| \frac{x_1 + \cdots + x_n}{n} - x\right\| <2\epsilon$$ when $n\ge N_0$.

Thanks a lot @Leonid Kovalev for the inspiration, though my main problem was that I didn't know what to do with the $nx$ (the silliest part :P)
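
For intuition only (not part of the proof), here is a small numerical sketch assuming the concrete sequence $x_n = x + 1/n$ in $\mathbb{R}$: the Cesàro means $z_n$ also approach $x$, just more slowly than $x_n$ itself.

```python
# Illustration only, assuming x_n = x + 1/n in R, which converges to x.
x = 3.0
partial_sum = 0.0
for n in range(1, 10**5 + 1):
    x_n = x + 1.0 / n              # a concrete sequence converging to x
    partial_sum += x_n
z_n = partial_sum / n              # Cesàro mean of the first n terms
print(abs(x_n - x), abs(z_n - x))  # roughly 1e-5 vs 1.2e-4: both small, z_n lags behind
```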

Solution 2:

There is a slightly more general claim:

PROP Let $\langle a_n\rangle$ be a sequence of real numbers, and define $\langle \sigma_n\rangle$ by $$\sigma_n=\frac 1 n\sum_{k=1}^n a_k$$

Then $$\liminf_{n\to\infty}a_n\leq \liminf_{n\to\infty}\sigma_n \;(\;\leq\;)\;\limsup_{n\to\infty}\sigma_n\leq \limsup_{n\to\infty}a_n$$

PROOF We prove the leftmost inequality. Let $\ell =\liminf_{n\to\infty}a_n$, and choose $\alpha <\ell$. By definition, there exists $N$ such that $$\alpha <a_{N+k}$$ for every $k=0,1,2,\ldots$ If $m>0$, then $$m\alpha <\sum_{k=1}^m a_{N+k}$$

which is $$m\alpha<\sum_{k=N+1}^{N+m}a_k$$

Adding $N\alpha+\sum_{k=1}^{N}a_k$ to both sides yields $$(m+N)\alpha+\sum_{k=1}^{N}a_k<\sum_{k=1}^{N+m}a_k+N\alpha$$

and dividing by $m+N$ gives

$$\alpha+\frac{1}{m+N}\sum_{k=1}^{N}a_k<\frac{1}{m+N}\sum_{k=1}^{N+m}a_k+\frac{N}{m+N}\alpha$$

Since $N$ is fixed, the terms $\frac{1}{m+N}\sum_{k=1}^{N}a_k$ and $\frac{N}{m+N}\alpha$ tend to $0$ as $m\to\infty$, while $\frac{1}{m+N}\sum_{k=1}^{N+m}a_k=\sigma_{N+m}$, so taking $\liminf\limits_{m\to\infty}$ gives $$\alpha \leq \liminf\limits_{m \to \infty } \frac{1}{m}\sum\limits_{k = 1}^m a_k $$ (note that $N+m$ is just a shift, which doesn't alter the value of the $\liminf^{(1)}$). Thus, for each $\alpha <\ell$, $$\alpha \leq \liminf\limits_{m \to \infty } \frac{1}{m}\sum\limits_{k = 1}^m a_k $$ which means that $$\liminf_{n\to\infty}a_n\leq \liminf_{n\to\infty}\sigma_n$$ The rightmost inequality is proven in a completely analogous manner. $\blacktriangle$

$(1)$: Note, however, that this is not true for general (non-shift) subsequences; for example, $$\limsup_{n\to\infty}(-1)^n=1$$ but $$\limsup_{n\to\infty}(-1)^{2n+1}=-1$$

COR If $\lim a_n$ exists and equals $\ell$, then $\lim \sigma_n$ also exists and equals $\ell$. The converse is not true: for $a_n=(-1)^n$ the limit $\lim a_n$ does not exist, yet $\sigma_n\to 0$.
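
A quick numerical illustration of the proposition and of the failing converse (an illustration, not a proof), assuming the hypothetical choice $a_n=(-1)^n$:

```python
import numpy as np

# a_n = (-1)^n has no limit, yet its Cesàro means sigma_n tend to 0,
# which sits between liminf a_n = -1 and limsup a_n = 1.
n = np.arange(1, 10**5 + 1)
a = (-1.0) ** n
sigma = np.cumsum(a) / n       # sigma_n = (a_1 + ... + a_n) / n

tail = sigma[n >= 10**4]       # look far out along the sequence
print(tail.min(), tail.max())  # both near 0, while a_n keeps oscillating between -1 and 1
```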

Solution 3:

WLOG, the $x_n$ converge to $0$ (otherwise consider the differences $x_n-x$), and stay confined in an $\epsilon$-neighborhood of $0$ after the first $N = N_\epsilon$ terms.

Then, for $m > N$, the norm of the average of the first $m$ terms is bounded by

$$\frac{N\,\overline{x_N}+(m-N)\epsilon}m,$$ where $\overline{x_N}$ denotes the average of $\|x_1\|,\dots,\|x_N\|$; this bound converges to $\epsilon$ as $m\to\infty$. So you can make the average as close to $0$ as you like.
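
To see the bound in action (a sketch under assumed data, not part of the argument), one can take the hypothetical sequence $x_k=(1/k,\,(-1)^k/k)$ in $\mathbb{R}^2$ with the Euclidean norm, $\epsilon=0.01$, and $N=200$, beyond which $\|x_k\|=\sqrt2/k<\epsilon$:

```python
import numpy as np

# Compare the norm of the average of the first m terms with the bound
# (N * mean of ||x_1||,...,||x_N|| + (m - N) * eps) / m from Solution 3.
eps, N, m = 0.01, 200, 10**4
xs = np.array([[1.0 / k, (-1.0) ** k / k] for k in range(1, m + 1)])  # x_k -> 0
norms = np.linalg.norm(xs, axis=1)          # ||x_k|| = sqrt(2)/k < eps for k > N

avg_norm = np.linalg.norm(xs.mean(axis=0))  # norm of the Cesàro mean of the first m terms
bound = (norms[:N].sum() + (m - N) * eps) / m
print(avg_norm, bound)                      # avg_norm <= bound, and bound -> eps as m grows
```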