Eigenvalues are continuous?

The answer is yes, and it follows from the fact that the roots of a polynomial vary continuously with its coefficients.

We have the following theorem, taken from *A Brief Introduction to Numerical Analysis* by Tyrtyshnikov.

Theorem: Consider a parametrized family of polynomials $$p(x,t) = x^n + a_1(t)x^{n-1} + \cdots + a_n(t),$$ where each $a_i(t)$ is a continuous function on the interval $[\alpha,\beta]$. Then there exist continuous functions $$x_1(t),\ x_2(t),\ \ldots,\ x_n(t)$$ on $[\alpha, \beta]$ such that for each $x_i(t)$ we have $$p(x_i(t),t) = 0,\ \ \ t\in[\alpha,\beta].$$ $\square$
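As a numerical sanity check (not a proof), here is a minimal sketch using NumPy with a hypothetical family $p(x,t) = x^2 - 2t\,x + (t^2 - 1)$, whose coefficients are continuous in $t$ and whose roots are $t \pm 1$; sampling on a fine grid, consecutive root sets differ by roughly the grid spacing:

```python
import numpy as np

# Hypothetical example family: p(x, t) = x^2 + a1(t) x + a2(t) with
# continuous coefficients a1(t) = -2t, a2(t) = t^2 - 1, so the roots
# are exactly t - 1 and t + 1.
def roots_at(t):
    # np.roots takes the coefficient vector [1, a1(t), a2(t)];
    # sorting gives a consistent ordering of the two root paths.
    return np.sort_complex(np.roots([1.0, -2.0 * t, t**2 - 1.0]))

# Sample the roots on a fine grid of [0, 1]; if the roots vary
# continuously, consecutive samples should differ by about the
# grid spacing (0.01 here).
ts = np.linspace(0.0, 1.0, 101)
root_paths = np.array([roots_at(t) for t in ts])
max_jump = np.max(np.abs(np.diff(root_paths, axis=0)))
print(max_jump)  # → 0.01, matching the grid spacing
```

The sorting step is what selects one continuous branch per root; for roots that collide or cross in the complex plane, a nearest-neighbour matching between consecutive samples would be needed instead.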

With $C(t)=A+tB$ for fixed matrices $A$ and $B$, each entry of $C(t)$ is a linear polynomial in $t$, so the characteristic polynomial has the parametrized form above with $t\in[0,1]$. The theorem then directly implies that the roots of the characteristic polynomial, i.e. the eigenvalues of $C(t)$, can be chosen as continuous functions of $t$.
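The path $C(t)=A+tB$ can be illustrated numerically (again a sketch, not a proof, with hypothetical random matrices $A$ and $B$): compute the eigenvalues of $C(t)$ on a fine grid and greedily match each set to the previous one by nearest neighbour, so each matched path should move by only a small amount per step:

```python
import numpy as np

# Hypothetical fixed matrices A and B defining C(t) = A + t B.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
B = rng.standard_normal((4, 4))

def matched_eigs(ts):
    """Eigenvalues of A + t B along the grid ts, greedily matched
    to the previous step by nearest neighbour so that each of the
    n paths is (approximately) a continuous function of t."""
    paths = [np.linalg.eigvals(A + ts[0] * B)]
    for t in ts[1:]:
        new = list(np.linalg.eigvals(A + t * B))
        ordered = []
        for prev in paths[-1]:
            # pick the new eigenvalue closest to the previous one
            k = int(np.argmin([abs(prev - z) for z in new]))
            ordered.append(new.pop(k))
        paths.append(np.array(ordered))
    return np.array(paths)

ts = np.linspace(0.0, 1.0, 201)
paths = matched_eigs(ts)
max_jump = np.max(np.abs(np.diff(paths, axis=0)))
print(max_jump)  # small for a fine grid, consistent with continuity
```

Note that `np.linalg.eigvals` returns the eigenvalues in no particular order, which is exactly why the matching step is needed: the theorem guarantees that *some* labelling of the eigenvalues is continuous, not that any fixed ordering is.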


More generally: if $t\mapsto A(t)$ is a continuous function from an interval $I$ of the real line into $\mathbb{C}^{n\times n}$, then there exist $n$ continuous functions $\lambda_i:I\to \mathbb{C}$, $i=1,\ldots,n$, such that for each $t\in I$ the spectrum of $A(t)$ is equal to $\{\lambda_1(t),\ldots,\lambda_n(t) \}$. Note that for some $t_0\in I$ and $i\neq j$ we can have $\lambda_i(t_0)=\lambda_j(t_0)$, i.e. the eigenvalue paths may cross.

Proofs of this theorem can be found in:

[1] T. Kato: A Short Introduction to Perturbation Theory for Linear Operators, Springer-Verlag, 1982. Pages 126-127, Theorem 5.2.

[2] R. Bhatia: Matrix Analysis, Springer, 1997. Pages 154-155, Theorem VI.1.4 and Corollary VI.1.6.