Is it possible to characterize completeness of a normed vector space by convergence of Neumann series?
If $X$ is a normed vector space and if for each bounded operator $T \in B(X)$ with $\| T\| < 1$, the operator ${\rm id} - T$ is boundedly invertible, does it follow that $X$ is complete?
Context:
It is well known that if $X$ is a Banach space and if $T \in B(X) = B(X,X)$ is a bounded linear operator on $X$ with $\| T \| <1$, then the Neumann series $\sum_{n=0}^\infty T^n$ converges (in the operator norm) to $({\rm id} - T)^{-1}$. In particular, ${\rm id} - T$ is invertible.
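For reference, here is a sketch of the standard argument (nothing beyond what is in any functional analysis text): since $\|T^n\| \le \|T\|^n$, the partial sums $S_N = \sum_{n=0}^N T^n$ form a Cauchy sequence in $B(X)$, which converges because $B(X)$ is complete whenever $X$ is, and the telescoping identity
$$({\rm id} - T)\, S_N = S_N \,({\rm id} - T) = {\rm id} - T^{N+1}, \qquad \|T^{N+1}\| \le \|T\|^{N+1} \to 0,$$
shows that the limit is a two-sided bounded inverse of ${\rm id} - T$. Completeness enters exactly once, namely to guarantee that the Cauchy sequence $(S_N)_N$ actually has a limit in $B(X)$.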
There are counterexamples to this fact if we do not assume $X$ to be complete. For example, we can take $X = \ell_0 (\Bbb{N})$ (the finitely supported sequences) and $T = \frac{1}{2} S$, where $S$ is the right shift operator. In this case, it is easy to see that $\sum_{n=0}^\infty T^n$ does not converge to a well-defined operator from $X$ to $X$.
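To spell out why the example works (a quick computation; here $\|T\| = \frac{1}{2}$ for any of the usual norms on $\ell_0(\Bbb N)$, e.g. the supremum norm): with $e_1 = (1,0,0,\dots)$ we have $T^n e_1 = 2^{-n} e_{n+1}$, so
$$\sum_{n=0}^N T^n e_1 = \big(1, \tfrac12, \tfrac14, \dots, 2^{-N}, 0, 0, \dots\big),$$
and the only candidate for the limit is the sequence $(2^{-n})_{n \ge 0}$, which is not finitely supported. Equivalently, the equation $({\rm id}-T)y = e_1$ has no solution $y \in \ell_0(\Bbb N)$, so ${\rm id}-T$ is not even surjective, let alone boundedly invertible.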
After I came up with the above counterexample, I wondered if we can characterize completeness of the normed vector space $X$ by the above property, as in the question stated above.
Thoughts on the problem:
1. Equivalently, we could require that $\sum_{n=0}^\infty T^n$ converges to a well-defined operator from $X$ to $X$ as soon as $\|T\|<1$. Indeed, in the completion $\overline{X}$, the operator $T$ extends to a continuous linear operator $\overline{T} : \overline{X} \to \overline{X}$ with $\| \overline{T} \| = \| T\|<1$, so that $S := {\rm id}_{\overline{X}} - \overline{T}$ is invertible with $S^{-1} = \sum_{n=0}^\infty \overline{T}^n$; if ${\rm id}_X - T$ is boundedly invertible, then its inverse is the restriction of $S^{-1}$ to $X$, so that $({\rm id}_X - T)^{-1} = \sum_{n=0}^\infty T^n$ (spelled out in the first display below this list).

2. I know that $X$ is complete iff $B(X)$ is, so that it would suffice to show that $B(X)$ is complete.

3. To show that a normed vector space $Y$ is complete, it suffices to show that "absolute convergence" of a series implies convergence, or, even more restrictively, that if $\|x_n\|\leq 2^{-n}$ for all $n$, then the series $\sum_{n=1}^\infty x_n$ converges in $Y$ (a sketch of this standard fact is given in the second display below this list).
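To spell out the equivalence claimed in observation 1 (a sketch): if ${\rm id}_X - T$ is boundedly invertible and $x \in X$, then $y := ({\rm id}_X - T)^{-1}x \in X$ satisfies $Sy = x$, hence
$$({\rm id}_X - T)^{-1}x = S^{-1}x = \sum_{n=0}^\infty \overline{T}^{\,n} x = \sum_{n=0}^\infty T^n x ,$$
with convergence towards an element of $X$. Together with the uniform tail estimate $\big\| \sum_{n>N} T^n x \big\| \le \frac{\|T\|^{N+1}}{1-\|T\|}\, \|x\|$, this shows that the Neumann series converges to $({\rm id}_X - T)^{-1}$ in the operator norm of $B(X)$. Conversely, if $\sum_{n=0}^\infty T^n$ converges in $B(X)$, the usual telescoping argument shows that its limit is a bounded inverse of ${\rm id}_X - T$.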
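And here is a sketch of the standard argument behind observation 3, for completeness: if $(y_k)_k$ is Cauchy in $Y$, choose indices $k_1 < k_2 < \dots$ with $\| y_{k_{n+1}} - y_{k_n} \| \le 2^{-n}$ and set $x_n := y_{k_{n+1}} - y_{k_n}$. By assumption $\sum_{n=1}^\infty x_n$ converges, and since
$$\sum_{n=1}^N x_n = y_{k_{N+1}} - y_{k_1},$$
the subsequence $(y_{k_n})_n$ converges; a Cauchy sequence with a convergent subsequence converges, so $Y$ is complete.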
My problem with applying observation 3 to $Y = B(X)$ is that we only know the convergence required in observation 3 for the special choice $x_n = T^n$ with a suitable single operator $T$, which seems to be too restrictive.
In fact, I don't know how to construct any kind of nontrivial bounded operators on a general normed vector space $X$, apart from operators of the form $x \mapsto \varphi(x) \cdot x_0$ (and linear combinations of those), where $\varphi$ is a bounded linear functional on $X$ and $x_0 \in X$.
But for operators as above (i.e. with finite-dimensional range), the series $\sum_{n=0}^\infty T^n$ always converges, since from the first power onwards everything happens in the finite-dimensional range of $T$, which certainly is complete.
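To illustrate with the rank-one case (a direct computation, assuming $\|T\| = \|\varphi\|\,\|x_0\| < 1$, so that in particular $|\varphi(x_0)| < 1$): for $T x = \varphi(x)\, x_0$ one has $T^n x = \varphi(x)\,\varphi(x_0)^{n-1}\, x_0$ for $n \ge 1$, so the series is a scalar geometric series and
$$\sum_{n=0}^\infty T^n x = x + \frac{\varphi(x)}{1 - \varphi(x_0)}\, x_0 = ({\rm id} - T)^{-1} x ,$$
as one checks by applying ${\rm id} - T$ to the right-hand side. So operators of this form can never serve as counterexamples.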
Solution 1:
The answer is "No". Take your favorite infinite-dimensional Banach space $Y$ and choose (alas, this requires AC, so if you are not a believer, stop reading here) some discontinuous linear functional $\psi$ on $Y$. Let $X=Ker(\psi)$ with the norm inherited from $Y$. Assume now that $A:X\to X$ has norm less than $1$. Then it extends by continuity to $Y$ and enjoys the property $AX\subset X$, so $\psi\circ A$ vanishes on $X=Ker(\psi)$. We want to show that if $y-Ay=x\in X$, then necessarily $y\in X$. Assume not. Then $\psi(y)\ne 0$ and $(\psi\circ A)(y)=\psi(y)$. Hence the linear functionals $\psi\circ A$ and $\psi$ coincide on the entire space $Y$, so $\psi(z-Az)=0$ for all $z\in Y$, which is absurd because $z-Az$ is just an arbitrary element of $Y$ and $\psi$ is certainly not identically $0$.
The trick is, of course, that it is not so easy for a continuous linear operator to preserve the kernel of a discontinuous linear functional, so the operators acting from $X$ to $X$ are rather few in a certain sense, which makes the Neumann series convergence condition vacuous exactly when it starts getting interesting.