Convergence of spectra under strong convergence of operators

FWIW: The best result I know that comes close to what you seek is theorem 50.16 in

  • Kriegl, Michor: "The Convenient Setting of Global Analysis",

which is an extension of a theorem of Rellich that you can also find in

  • Kato: "Perturbation Theory for Linear Operators", chapter 7, theorem 3.9

It says: For a smooth curve $t \mapsto A(t)$ of unbounded self-adjoint operators in a Hilbert space, with common domain of definition and compact resolvent, the eigenvalues of $A(t)$ may be arranged increasingly ordered in such a way that they become $C^1$-functions of $t$. If the curve is real analytic, then the eigenvalues and eigenvectors can even be chosen smoothly in $t$.

Here, a smooth curve of unbounded operators means that $t \mapsto (A(t)u, v)$ is smooth for all vectors $v$ in the Hilbert space $H$ and all $u$ in the common domain of definition of the $A(t)$, of course.
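A finite-dimensional toy sketch of the rearrangement phenomenon (all helper names are mine, not from the references): for the real-analytic curve $A(t) = \operatorname{diag}(t, -t)$, the increasingly ordered eigenvalues are $-|t| \le |t|$, which have a kink at the crossing $t = 0$, while the rearranged branches $t$ and $-t$ pass through the crossing smoothly.

```python
import math

def eigs_2x2_sym(a, b, c):
    """Eigenvalues of the symmetric matrix [[a, b], [b, c]], closed form,
    returned in increasing order."""
    mean = (a + c) / 2.0
    rad = math.hypot((a - c) / 2.0, b)
    return (mean - rad, mean + rad)

# Real-analytic curve A(t) = diag(t, -t): the two eigenvalues cross at t = 0.
ts = [k / 10.0 for k in range(-20, 21)]

# Increasingly ordered eigenvalues: (-|t|, |t|) -- continuous, but the
# lower/upper branches are not differentiable at the crossing.
ordered = [eigs_2x2_sym(t, 0.0, -t) for t in ts]
assert all(lo <= hi for lo, hi in ordered)

# Rearranged (smooth, even analytic) parameterization: t and -t.
smooth = [(t, -t) for t in ts]

# Both choices give the same spectrum at every t; only the labeling differs.
for p, q in zip(ordered, smooth):
    assert all(abs(x - y) < 1e-12 for x, y in zip(sorted(p), sorted(q)))
```

The point being that the theorem's "arrangement" of the eigenvalues is exactly this kind of relabeling across crossings.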

On the other hand, there is a theorem that approaches the problem from a different angle in

  • Dunford, Schwartz: "Linear Operators, Part II"

chapter X.7 "Perturbation Theory", corollary 3: For $E_n, E$ being the resolutions of the identity of the normal operators $T_n, T$ with $T_n \to T$ in the strong operator topology, we have: If $E$ vanishes on the boundary of the Borel set $\sigma$, then $E_n(\sigma) \to E(\sigma)$ in the strong operator topology.
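Here is a toy diagonal model of that corollary in finite dimension (the helper `spectral_proj` and the concrete matrices are mine, purely for illustration; the actual statement concerns resolutions of the identity of normal operators on a Hilbert space). It shows both the convergence for a "good" Borel set and why the boundary hypothesis is needed.

```python
def spectral_proj(eigs, in_sigma):
    """Diagonal of E(sigma) for T = diag(eigs): entry 1 where the
    eigenvalue lies in the Borel set sigma (given as a predicate), else 0."""
    return [1.0 if in_sigma(lam) else 0.0 for lam in eigs]

T = [0.0, 1.0]                      # T = diag(0, 1)
Tn = lambda n: [1.0 / n, 1.0]       # T_n = diag(1/n, 1), T_n -> T strongly

# Good set: sigma = (-1/2, 1/2); E vanishes on its boundary {-1/2, 1/2}.
good = lambda lam: -0.5 < lam < 0.5
# For n > 2 we get E_n(sigma) = E(sigma) = projection onto the first coordinate.
assert spectral_proj(Tn(100), good) == spectral_proj(T, good) == [1.0, 0.0]

# Bad set: sigma = (-inf, 0]; the eigenvalue 0 sits on the boundary {0},
# so E({0}) != 0 and the corollary's hypothesis fails.
bad = lambda lam: lam <= 0.0
assert spectral_proj(T, bad) == [1.0, 0.0]
assert spectral_proj(Tn(100), bad) == [0.0, 0.0]   # E_n(sigma) does not converge to E(sigma)
```

So spectral mass sitting exactly on the boundary of $\sigma$ is what can "leak out" under strong convergence.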

I haven't thought about whether it is possible to use this result to get closer to an answer to your question, though :-)

HTH.


Here is another partial answer. From the book by Kato, "Perturbation theory for linear operators", this is Theorem 4.10 in Chapter 5 (p. 291); I'm paraphrasing a bit:

Let $T$ and $A$ be self-adjoint operators in a Hilbert space, with $A$ bounded. Then $$ \operatorname{dist}(\Sigma(T + A), \Sigma(T)) \le \| A \|. $$

Here $\Sigma(T)$ denotes the spectrum of $T$, and $\operatorname{dist}$ is the (symmetric) Hausdorff distance between the two spectra.

In other words, if you replace the strong operator topology by the norm topology, so that $\| T - T_n \| \le \epsilon$, then for each element $\sigma$ of the spectrum of $T$ there is at least one element $\mu$ of the spectrum of $T_n$ with $|\sigma - \mu| \le \epsilon$, and vice versa.
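One can check this bound numerically in the simplest nontrivial case, $2 \times 2$ real symmetric matrices, where the eigenvalues have a closed form (again, all names and the concrete matrices below are my own illustration, not from Kato):

```python
import math

def spectrum_2x2_sym(a, b, c):
    """Eigenvalues of the symmetric matrix [[a, b], [b, c]], closed form."""
    mean = (a + c) / 2.0
    rad = math.hypot((a - c) / 2.0, b)
    return (mean - rad, mean + rad)

def hausdorff(S1, S2):
    """Hausdorff distance between two finite subsets of the real line."""
    d = lambda X, Y: max(min(abs(x - y) for y in Y) for x in X)
    return max(d(S1, S2), d(S2, S1))

# T = [[1, 0], [0, 3]] and the bounded self-adjoint perturbation
# A = [[0.2, 0.1], [0.1, -0.2]], so T + A = [[1.2, 0.1], [0.1, 2.8]].
sig_T  = spectrum_2x2_sym(1.0, 0.0, 3.0)
sig_TA = spectrum_2x2_sym(1.2, 0.1, 2.8)

# For self-adjoint A, the operator norm is the largest |eigenvalue|.
norm_A = max(abs(lam) for lam in spectrum_2x2_sym(0.2, 0.1, -0.2))

# Kato's bound: dist(Sigma(T + A), Sigma(T)) <= ||A||.
assert hausdorff(sig_T, sig_TA) <= norm_A
```

Of course this only probes the finite-dimensional case; the theorem itself covers unbounded self-adjoint $T$.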