Prob. 8, Sec. 3.5 in Erwin Kreyszig's Introductory Functional Analysis With Applications
$\DeclareMathOperator{\span}{span}$Let $(e_k)$ be an orthonormal sequence in a Hilbert space $H$, and let $M = \span (e_k)$. Let $x \in H$.
If $$x = \sum_{k=1}^\infty \langle x, e_k \rangle e_k,$$ then $x \in \overline{\span(e_k)}$ because in this case the sequence $(s_n)$ in $\span(e_k)$, where $s_n = \sum_{k=1}^n \langle x, e_k \rangle e_k$, converges to $x$.
How to show the converse?
That is, how to show that if $x \in \overline{\span(e_k)}$, then the series $\sum_{k=1}^\infty \langle x, e_k \rangle e_k$ converges (in the norm induced by the inner product on $H$) and has sum $x$?
My effort:
Suppose $x \in \overline{\span(e_k)}$. Then there is a sequence $(x_n)$ in $\span(e_k)$ that converges to $x$. Let $x_n = \sum_{k=1}^{m_n} \alpha_{nk} e_k$ for each $n= 1, 2, 3, \ldots$, where $\alpha_{nk}$ are scalars and the $m_n$ are natural numbers.
Then, using the orthonormality of the $e_k$, we can conclude that $\alpha_{nk} = \langle x_n, e_k \rangle$ for each $n=1, 2, 3, \ldots$ and for each $k= 1, \ldots, m_n$. So $$x_n = \sum_{k=1}^{m_n} \langle x_n, e_k \rangle e_k. $$
What next?
Can we say the following?
For each fixed $k$, $$\langle x_n, e_k \rangle \to \langle x, e_k \rangle \ \mbox{ as } \ n \to \infty. $$
How to show that $$x = \sum_{k=1}^\infty \langle x, e_k \rangle e_k?$$
I also know that the series $\sum \langle x, e_k \rangle e_k$ does converge.
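As a numeric sanity check on the claimed convergence $\langle x_n, e_k \rangle \to \langle x, e_k \rangle$ (which follows from the Cauchy–Schwarz inequality), here is a hypothetical finite-dimensional sketch: $\mathbb{R}^3$ with the standard basis stands in for $H$, and $x_n = x + d/n$ for a fixed direction $d$ is a sequence converging to $x$.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

# Finite-dimensional stand-in for H: R^3, with e_1 the first standard basis vector.
x = [1.0, 2.0, 3.0]
e1 = [1.0, 0.0, 0.0]
d = [0.5, -1.0, 2.0]  # fixed direction, so x_n = x + d/n converges to x

for n in (1, 10, 100, 1000):
    xn = [xi + di / n for xi, di in zip(x, d)]
    gap = abs(dot(xn, e1) - dot(x, e1))
    # Cauchy-Schwarz: |<x_n - x, e_1>| <= ||x_n - x|| * ||e_1||
    assert gap <= norm([a - b for a, b in zip(xn, x)]) + 1e-12

print("gap at n=1000:", gap)
```

The gap $|\langle x_n, e_1\rangle - \langle x, e_1\rangle|$ is bounded by $\|x_n - x\|$, so it shrinks like $1/n$ here.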
Solution 1:
The orthogonal projection $P_{N}x$ of $x$ onto the subspace $M_{N}$ spanned by $\{ e_1,e_2,\ldots,e_N\}$ is given by $P_{N}x=\sum_{n=1}^{N}\langle x,e_n\rangle e_n$. The orthogonal projection onto $M_{N}$ is the same as the closest-point projection onto $M_{N}$ (just like in the good ol' days of your Calculus class). Therefore $$ \|x-P_{N}x\| \le \|x-(\alpha_1 e_1 + \cdots +\alpha_N e_N)\| $$ holds for every choice of scalars $\alpha_1,\ldots,\alpha_N$. Moreover, the orthogonal (equivalently, closest-point) projection onto a larger subspace is at least as close. Hence, $$ \|x-P_{N'}x\| \le \|x-P_{N}x\| \le \|x-(\alpha_1 e_1 + \cdots +\alpha_N e_N)\|,\;\;\; N' \ge N. $$ Therefore, if you can approximate $x$ to within a distance of $\epsilon$ by some $m \in M$ — and you can, for every $\epsilon > 0$, precisely because $x \in \overline{M}$ — then $\|x - P_N x\| \le \epsilon$ for all large enough $N$. That is, $P_N x \to x$, which is exactly the statement $x = \sum_{k=1}^\infty \langle x, e_k \rangle e_k$.
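The two inequalities above can be checked numerically in a finite-dimensional model. This is a hypothetical sketch: $\mathbb{R}^4$ stands in for $H$, with a hand-picked orthonormal set $\{e_1, e_2, e_3\}$ that is not the standard basis, so the projections are not mere coordinate truncations.

```python
import math
import random

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

def norm(u):
    return math.sqrt(dot(u, u))

# An orthonormal set in R^4 (not the standard basis).
s = 1 / math.sqrt(2)
e = [
    [s, s, 0.0, 0.0],
    [s, -s, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
]

def project(x, basis):
    """P_N x = sum_{k<=N} <x, e_k> e_k for orthonormal e_k."""
    out = [0.0] * len(x)
    for ek in basis:
        c = dot(x, ek)
        out = [o + c * v for o, v in zip(out, ek)]
    return out

x = [3.0, 1.0, -2.0, 5.0]

# ||x - P_N x|| is non-increasing in N: larger subspaces get at least as close.
residuals = []
for N in range(1, len(e) + 1):
    PN = project(x, e[:N])
    residuals.append(norm([a - b for a, b in zip(x, PN)]))
assert all(residuals[i + 1] <= residuals[i] + 1e-12
           for i in range(len(residuals) - 1))

# P_N x beats every other element of M_N (sampled over random coefficients).
random.seed(0)
N = 2
best = residuals[N - 1]
for _ in range(200):
    alphas = [random.uniform(-5.0, 5.0) for _ in range(N)]
    cand = [sum(a * ek[i] for a, ek in zip(alphas, e[:N])) for i in range(4)]
    assert best <= norm([a - b for a, b in zip(x, cand)]) + 1e-12

print("residual norms:", residuals)
```

The random-coefficient check is of course not a proof that $P_N x$ is the closest point, just evidence; the proof is the projection theorem quoted above.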
Solution 2:
Hint: note that, with the norm defined via the inner product, we have $$ \left\| \sum_{k=1}^N \langle x,e_k \rangle e_k \right\|^2 = \sum_{k=1}^N |\langle x,e_k \rangle|^2 $$ because the vectors $e_k$ are orthonormal. Also note that for all $N$ (this is Bessel's inequality), $$ \|x\|^2 \geq \left\| \sum_{k=1}^N \langle x, e_k \rangle e_k \right\|^2. $$ We now know that the sum $\sum_{k=1}^\infty |\langle x,e_k \rangle|^2$ converges, which means the partial sums of $\sum_k \langle x,e_k \rangle e_k$ form a Cauchy sequence: for $m > n$, $$ \left\| \sum_{k=n+1}^m \langle x,e_k \rangle e_k \right\|^2 = \sum_{k=n+1}^m |\langle x,e_k \rangle|^2 \to 0. $$ Since $H$ is complete, the series converges. However, we must still show that its limit is $x$. For this it suffices to show that $y = x - \sum_{k=1}^\infty \langle x,e_k \rangle e_k$ is orthogonal to each $e_k$: then $y \perp \span(e_k)$, and by continuity of the inner product $y \perp \overline{\span(e_k)}$; but $y$ itself lies in $\overline{\span(e_k)}$ (both terms do), so $\langle y, y \rangle = 0$, i.e. $y = 0$.
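The two facts in the hint — Bessel's inequality and the orthogonality of the residual $x - \sum_k \langle x, e_k\rangle e_k$ to each $e_k$ — can be verified in a small hypothetical model: $\mathbb{R}^4$ stands in for $H$, and the finite orthonormal set below plays the role of $(e_k)$.

```python
import math

def dot(u, v):
    return sum(a * b for a, b in zip(u, v))

# An orthonormal set in R^4.
s = 1 / math.sqrt(2)
e = [
    [s, s, 0.0, 0.0],
    [s, -s, 0.0, 0.0],
    [0.0, 0.0, 1.0, 0.0],
]

x = [2.0, -1.0, 4.0, 0.5]

# The (here finite) series sum: sx = sum_k <x, e_k> e_k.
sx = [0.0] * 4
for ek in e:
    c = dot(x, ek)
    sx = [a + c * b for a, b in zip(sx, ek)]

# Residual r = x - sx is orthogonal to every e_k ...
r = [a - b for a, b in zip(x, sx)]
for ek in e:
    assert abs(dot(r, ek)) < 1e-12

# ... and Bessel's inequality holds: sum |<x,e_k>|^2 <= ||x||^2.
assert sum(dot(x, ek) ** 2 for ek in e) <= dot(x, x) + 1e-12

print("residual:", r)
```

Here $x$ is *not* in the span of the $e_k$, so the residual is nonzero but still orthogonal to each $e_k$; when $x \in \overline{\span(e_k)}$, the argument above forces the residual to vanish.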