Let $\lambda_1,\dots,\lambda_n$ be the eigenvalues of the symmetric $n\times n$ real matrix $A$. Consider a rank-one perturbation $B=A+uu^T$, where $u$ is a real $n$-vector.

Is there an analytical expression for the eigenvalues of $B$ that exploits the known eigenvalues of $A$? I'm thinking something similar to the Sherman-Morrison-Woodbury formula for the update of the inverse could work, but I have not succeeded.

I found some related results: https://en.wikipedia.org/wiki/Bunch%E2%80%93Nielsen%E2%80%93Sorensen_formula, and also https://doi.org/10.1137%2FS089547989223924X. But both seem to require that $u$ be an eigenvector of $A$. Here I am considering a more general statement, where $u$ need not be an eigenvector of $A$.


Solution 1:

I doubt there is an analytic expression for all but the smallest matrix dimensions. However, there is an inexpensive way to compute the eigenvalues of the rank-one update. The following comes from subsection 5.3.3 of Demmel's Applied Numerical Linear Algebra.

For any symmetric matrix $\mathbf{A}$, we can use the eigendecomposition $\mathbf{A} = \mathbf{V}^\top \text{diag}(\mathbf{d})\mathbf{V}$ to reduce the problem: the eigenvalues of $\mathbf{B} = \mathbf{A} + \rho \mathbf{u}\mathbf{u}^\top$ are exactly the eigenvalues of the diagonal-plus-rank-one matrix $\mathbf{V}\mathbf{B}\mathbf{V}^\top = \text{diag}(\mathbf{d}) + \rho (\mathbf{V}\mathbf{u})(\mathbf{V}\mathbf{u})^\top$. We can then compute the eigenvalues as the roots of the characteristic polynomial $\det(\mathbf{B}- \lambda \mathbf{I})$; in particular, if $\lambda$ is not an eigenvalue of $\mathbf{A}$ we have $$ \det(\mathbf{B} - \lambda \mathbf{I}) = \det(\mathbf{A} + \rho \mathbf{u}\mathbf{u}^\top -\lambda \mathbf{I}) = \det(\mathbf{A} - \lambda \mathbf{I})\left[1+ \rho \sum_i \frac{(\mathbf{e}_i^\top \mathbf{V}\mathbf{u})^2 }{d_{i} -\lambda} \right], $$ where the $d_{i}$ are the original eigenvalues of $\mathbf{A}$ and $\mathbf{e}_i$ is the $i$th column of the identity matrix (cf. eq. (5.14)). Setting the bracketed term to zero gives the secular equation, whose roots are the eigenvalues of $\mathbf{B}$. Finding these roots can be challenging numerically, but a robust algorithm is implemented in LAPACK in xLAED4.

Solution 2:

This isn't a complete answer, but maybe can be a starting point.

A symmetric matrix has eigen-decomposition $A = U\Lambda U^T = \sum_i \lambda_i u_iu_i^T$.

Essentially we can view this as the stretching of a sphere along the directions $u_i$. To compute $Ax$ we can project $x$ onto each of the $u_i$, scale these projections by $\lambda_i$, and add them back together.

Now, when we compute $Bx = Ax + uu^Tx$ we additionally compute the projection of $x$ onto $u$ and add this in with the previous projections. If $u$ is in the direction of one of the $u_j$, that means that direction alone will be scaled by an additional amount. More precisely, if $u = c u_j$ for some $j$, then $uu^T = c^2 u_ju_j^T$, so $$B = A+uu^T = (\lambda_j + c^2)u_ju_j^T + \sum_{i\neq j} \lambda_i u_i u_i^T.$$ That is, the eigenvalue in the direction $u_j$ increases by $c^2$.
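A quick numerical check of this special case (the matrix, the scale $c = 2$, and the index $j$ below are arbitrary test values):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 4
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                  # arbitrary symmetric test matrix
lam, U = np.linalg.eigh(A)         # columns U[:, i] are the eigenvectors u_i

c, j = 2.0, 1
u = c * U[:, j]                    # u points along the eigenvector u_j
B = A + np.outer(u, u)

new = np.linalg.eigvalsh(B)
# only lam[j] should move, by exactly c^2
expected = np.sort(np.append(np.delete(lam, j), lam[j] + c**2))
print(np.max(np.abs(new - expected)))  # tiny
```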

Now let's consider what happens if $u$ is in the span of $u_i$ and $u_j$. First, note that $A$ and $B$ act identically on everything outside of this span, so none of the eigenvalues in the directions other than $u_i$ or $u_j$ will change. More generally, only the eigenvalues whose eigenvectors are not orthogonal to $u$ can change.

Consider $u = c_1u_1 + c_2u_2$. Then $uu^T = c_1^2 u_1u_1^T + c_2^2u_2u_2^T + c_1c_2 u_1u_2^T + c_1c_2u_2u_1^T$. So now we see that while the update does contribute to the directions $u_1$ and $u_2$, it also "skews" the original image.
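Concretely, in the eigenbasis $B$ acts on $\operatorname{span}\{u_1, u_2\}$ through the $2\times 2$ block $\begin{pmatrix} \lambda_1 + c_1^2 & c_1 c_2 \\ c_1 c_2 & \lambda_2 + c_2^2 \end{pmatrix}$, while the remaining eigenpairs are untouched. A sketch verifying this numerically (the matrix and the coefficients $c_1, c_2$ are arbitrary test values):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 4
A = rng.standard_normal((n, n))
A = (A + A.T) / 2                  # arbitrary symmetric test matrix
lam, U = np.linalg.eigh(A)         # columns U[:, i] are the eigenvectors u_i

c1, c2 = 0.7, -1.3
u = c1 * U[:, 0] + c2 * U[:, 1]    # u lies in span{u_1, u_2}
B = A + np.outer(u, u)

# the 2x2 block through which B acts on span{u_1, u_2}
block = np.array([[lam[0] + c1**2, c1 * c2],
                  [c1 * c2,        lam[1] + c2**2]])
new2 = np.linalg.eigvalsh(block)   # the two eigenvalues that change

expected = np.sort(np.concatenate([new2, lam[2:]]))
print(np.max(np.abs(np.linalg.eigvalsh(B) - expected)))  # tiny
```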

If we viewed the image of the unit sphere in this plane as an ellipse, we now get a new ellipse. I'm guessing there is a way to write down the new axes of this ellipse, which would answer the 2-d version of your question.

Unfortunately, I'm not exactly sure how to do this.