Determinant of rank-one perturbation of a diagonal matrix

Let $A$ be a rank-one perturbation of a diagonal matrix, i.e. $A = D + s^T s$, where $D = \operatorname{diag}\{\lambda_1,\ldots,\lambda_n\}$ and $s = [s_1,\ldots,s_n] \neq 0$ is a row vector. Is there a way to compute its determinant easily?

On the one hand, $s^Ts$ has rank one, so it has only one non-zero eigenvalue, which equals its trace $|s|^2 = s_1^2+\cdots+s_n^2$. On the other hand, if $D$ were a scalar operator (i.e. all the $\lambda_i$'s were equal to some $\lambda$), then all eigenvalues of $A$ would be shifts of the eigenvalues of $s^T s$ by $\lambda$: one eigenvalue would equal $\lambda+|s|^2$ and the others $\lambda$. Hence in this case we would obtain $\det A = \lambda^{n-1} (\lambda+|s|^2)$. But is it possible to generalize these considerations to the case of a diagonal, non-scalar $D$?
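The scalar case is easy to check numerically. Here is a minimal NumPy sketch (the dimension, the value of $\lambda$, and the random seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(0)
n, lam = 5, 2.5

s = rng.standard_normal(n)            # nonzero vector s
A = lam * np.eye(n) + np.outer(s, s)  # D = lam*I plus the rank-one term s^T s

# scalar case: det A = lam^(n-1) * (lam + |s|^2)
predicted = lam ** (n - 1) * (lam + s @ s)
print(np.isclose(np.linalg.det(A), predicted))  # True
```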


You do not need the diagonal entries of $D$ to be positive. Since $\det(I+AB)=\det(I+BA)$, we have $\det(I+xy^T)=1+y^Tx$ (which is also not hard to prove directly). If $D$ is invertible, then
$$ \det(D+ss^T) = \det\bigl(D(I+D^{-1}ss^T)\bigr) = \det(D) \det(I+D^{-1}ss^T) = (1+s^TD^{-1}s) \det(D). $$
This gives
$$ \det(D+ss^T) = \det(D) \left(1+\sum_r \frac{s_r^2}{d_r}\right), $$
where $d_r = D_{r,r}$.

If two diagonal entries of $D$ are zero, then $\det(D+ss^T)=0$. If just one is zero, $D_{n,n}$ say, then
$$ \det(D+ss^T) = s_n^2\prod_{r=1}^{n-1} D_{r,r}. $$
If you define $\delta_r=\det(D)/D_{r,r}$ (that is, the product of all diagonal entries except $D_{r,r}$), we have
$$ \det(D+ss^T) = \sum_r \delta_r s_r^2, $$
which is neater and holds in all cases.
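A quick NumPy sanity check of the last formula, including a zero diagonal entry (a sketch; the size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 6
d = rng.standard_normal(n)
d[2] = 0.0                      # include one zero diagonal entry
s = rng.standard_normal(n)

D = np.diag(d)
lhs = np.linalg.det(D + np.outer(s, s))

# delta_r = product of all diagonal entries except d_r
delta = np.array([np.prod(np.delete(d, r)) for r in range(n)])
rhs = delta @ s**2

print(np.isclose(lhs, rhs))  # True
```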


As developed in the comments, for positive diagonal entries:

$$\det(D + s^Ts) = \prod\limits_{i=1}^n \lambda_i + \sum_{i=1}^n s_i^2 \prod\limits_{j\neq i} \lambda_j $$

Its general validity can be deduced by extension from the positive cone of $\mathbb{R}^n$ by analytic continuation. Alternatively, we can give a slightly modified argument for all nonzero diagonal entries: the determinant is a polynomial in the $\lambda_i$'s, so proving the formula for nonzero $\lambda_i$'s lets us extend it to all $D$ by a brief continuity argument.

First assume all $\lambda_i \neq 0$, and define the vector $v$ by $v_i = s_i/\lambda_i$. In line with the OP's observations:

$$ \det(D+s^Ts) = \det(I+s^Tv)\det(D) = (1 + \sum\limits_{i=1}^n s_i^2/\lambda_i) \prod\limits_{i=1}^n \lambda_i $$

where $\det(I+s^Tv)$ is the product of $(1 + \mu_i)$ over all the eigenvalues $\mu_i$ of $s^Tv$. As the OP noted, at most one of these eigenvalues is nonzero, so the product equals $1$ plus the trace of $s^T v$ (the one potentially nonzero eigenvalue), and that trace is $\sum_i s_i^2/\lambda_i$.

Distributing the product of the $\lambda_i$'s over that sum gives the result at top. If some of the $\lambda_i$'s are zero, the formula is justified by taking a sequence of perturbed nonzero $\lambda_i$'s whose limit is the required $n$-tuple. By continuity of the polynomial, the formula holds for all diagonal $D$.
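The expanded formula, including the degenerate case covered by the continuity argument, can be checked numerically. A minimal NumPy sketch (dimension and seed arbitrary):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 5
lam = rng.standard_normal(n)
lam[0] = 0.0                   # a zero eigenvalue, covered by continuity
s = rng.standard_normal(n)

D = np.diag(lam)
lhs = np.linalg.det(D + np.outer(s, s))

# prod_i lam_i + sum_i s_i^2 * prod_{j != i} lam_j
rhs = np.prod(lam) + sum(s[i]**2 * np.prod(np.delete(lam, i))
                         for i in range(n))

print(np.isclose(lhs, rhs))  # True
```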