Does an eigenspace of a matrix depend continuously on its components?

Let $M(x)$ be a diagonalisable $n \times n$ complex matrix whose components are continuous functions of $x$, and suppose that, for all $x$, $M(x)$ has eigenvalue $0$ with multiplicity $m < n$ (independent of $x$). Is it possible to choose a basis for the $0$-eigenspace of $M(x)$ whose components are continuous functions of $x$?


Solution 1:

This is true even without the assumption that $M(x)$ is diagonalizable; the only important point is that $M(x)$ has constant rank. Your claim is a simple instance of a standard result on vector bundles, but I'll describe a direct argument, assuming that $x\in\mathbb R$.

First observe that it suffices to solve the problem locally. Suppose you are given two continuous families of vectors $v_1(x),\dots,v_m(x)$ for $x\in (a,b)$ and $w_1(x),\dots,w_m(x)$ for $x\in (c,d)$ with $a<c<b<d$, each spanning the kernel of $M(x)$ wherever they are defined. Fix $x_0$ with $c<x_0<b$, so that both families are defined at $x_0$. Then there is an invertible $m\times m$ matrix $(a_{ij})$ such that $v_i(x_0)=\sum_ja_{ij}w_j(x_0)$. Define $u_i(x)$ to be $v_i(x)$ for $x\leq x_0$ and $\sum_ja_{ij}w_j(x)$ for $x\geq x_0$; the two prescriptions agree at $x_0$, so each $u_i(x)$ is well defined and continuous on $(a,d)$, and $u_1(x),\dots,u_m(x)$ form a basis for the kernel of $M(x)$ for each $x\in(a,d)$. Iterating this gluing step patches local solutions together into a global one.
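As a concrete illustration, here is a minimal NumPy sketch of this gluing step; the callables `v_fam` and `w_fam` and the overlap point `x0` are hypothetical stand-ins for the two local families:

```python
import numpy as np

def glue_families(v_fam, w_fam, x0):
    """Glue two continuous kernel-spanning families that overlap at x0.

    v_fam, w_fam : callables x -> (n, m) arrays whose columns span
                   ker M(x) on (a, b) and (c, d) respectively, c < x0 < b.
    Returns a callable defined on all of (a, d).
    """
    V0, W0 = v_fam(x0), w_fam(x0)
    # Both column sets span the same m-dimensional space at x0, so the
    # system W0 @ C = V0 has an exact (and invertible) solution C;
    # lstsq recovers it even though W0 is a tall n x m matrix.
    C = np.linalg.lstsq(W0, V0, rcond=None)[0]

    def u_fam(x):
        # v-family to the left of x0, re-combined w-family to the right;
        # the two branches agree at x = x0 by the choice of C.
        return v_fam(x) if x <= x0 else w_fam(x) @ C

    return u_fam
```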

To solve the problem locally, fix a point $x_0$, choose a basis $v_1,\dots,v_m$ for the kernel of $M(x_0)$, and extend it by $v_{m+1},\dots,v_n$ to a basis for $\mathbb C^n$. Defining $w_i:=M(x_0)v_i$ for $i=m+1,\dots,n$, the vectors $w_{m+1},\dots,w_n$ are linearly independent in $\mathbb C^n$, so we can extend them to a basis $\{w_1,\dots,w_n\}$ of $\mathbb C^n$. Now let $P$ be the matrix with columns $v_1,\dots,v_n$ and $Q$ be the matrix with columns $w_1,\dots,w_n$. Then $\tilde M(x):=Q^{-1}M(x)P$ expresses the family $M(x)$ of linear transformations with respect to the bases $\{v_i\}$ and $\{w_i\}$, and of course $\tilde M(x)$ still has entries depending continuously on $x$. If we have a continuous family $u_1(x),\dots, u_m(x)$ spanning the kernel of $\tilde M(x)$, then $Pu_1(x),\dots, Pu_m(x)$ solves the problem for $M(x)$. Hence we only have to deal with $\tilde M(x)$.

Write $\tilde M(x)$ as a block matrix $\begin{pmatrix} A(x) & B(x) \\ C(x) & D(x) \end{pmatrix}$ with blocks of size $m$ and $n-m$. By construction, $A(x_0)=B(x_0)=C(x_0)=0$, while $D(x_0)$ is the identity matrix of size $n-m$. By continuity, there is an $\epsilon>0$ such that $\det(D(x))\neq 0$ for $|x-x_0|<\epsilon$, and by Cramer's rule the entries of $D(x)^{-1}$ depend continuously on $x$. Denoting by $I$ the identity matrix, compute the product of $\tilde M(x)$ with the invertible block matrix $N(x):=\begin{pmatrix} I & 0 \\ -D(x)^{-1}C(x) & D(x)^{-1}\end{pmatrix}$, whose entries depend continuously on $x$. The result is $\begin{pmatrix} A(x)-B(x)D(x)^{-1}C(x) & B(x)D(x)^{-1} \\ 0 & I \end{pmatrix}$. By assumption, $\tilde M(x)$ has rank $n-m$ for all $x$, so the same must be true for the latter matrix. Since its last $n-m$ rows are evidently linearly independent, this is only possible if all rows are linear combinations of the last $n-m$ rows, which forces $A(x)-B(x)D(x)^{-1}C(x)=0$ for all $x$. But then the first $m$ columns of $\tilde M(x)N(x)$ vanish, so the first $m$ columns of $N(x)$, i.e. the columns of $\begin{pmatrix} I \\ -D(x)^{-1}C(x)\end{pmatrix}$, form the required basis for the kernel of $\tilde M(x)$.
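For concreteness, here is a NumPy sketch of the whole local construction under the constant-rank assumption; the function name `local_kernel_basis` and the use of an SVD to choose the bases at $x_0$ are my own choices for the sketch, not forced by the argument:

```python
import numpy as np

def local_kernel_basis(M, x0, m):
    """Return a callable x -> (n, m) array whose columns span ker M(x),
    continuous for x near x0.

    M : callable x -> (n, n) complex array, assumed to have constant
        rank n - m (so dim ker M(x) = m for every x).
    """
    M0 = M(x0)
    n = M0.shape[0]

    # SVD of M(x0): the last m right-singular vectors span ker M(x0).
    U0, s0, Vh0 = np.linalg.svd(M0)
    P = np.column_stack([Vh0[n - m:].conj().T,   # v_1..v_m : ker M(x0)
                         Vh0[:n - m].conj().T])  # v_{m+1}..v_n : a complement

    # w_i := M(x0) v_i for i = m+1..n, extended to a basis w_1..w_n by
    # an orthonormal complement of their span (via a full SVD).
    W_tail = M0 @ P[:, m:]                       # n x (n-m), full column rank
    Uw = np.linalg.svd(W_tail, full_matrices=True)[0]
    Q = np.column_stack([Uw[:, n - m:], W_tail]) # w_1..w_m, then w_{m+1}..w_n
    Qinv = np.linalg.inv(Q)

    def kernel_basis(x):
        Mt = Qinv @ M(x) @ P                     # M-tilde(x) = Q^{-1} M(x) P
        C, D = Mt[m:, :m], Mt[m:, m:]            # lower blocks of M-tilde(x)
        # Kernel basis: the columns of [[I], [-D^{-1} C]], mapped back to
        # the original coordinates by P.
        return P @ np.vstack([np.eye(m), -np.linalg.solve(D, C)])

    return kernel_basis
```

For instance, with $M(x)=\begin{pmatrix}\cos^2 x & \cos x\sin x\\ \cos x\sin x & \sin^2 x\end{pmatrix}$, a rank-one projection whose kernel rotates with $x$, `local_kernel_basis(M, 0.0, 1)` returns a single kernel vector varying continuously near $x_0=0$; farther from $x_0$ the block $D(x)$ may become singular, at which point one re-centres the construction and glues as described above.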