Determinant of rank-one perturbations of (invertible) matrices

I read something that suggests that if $I$ is the $n$-by-$n$ identity matrix, $v$ is an $n$-dimensional real column vector with $\|v\| = 1$ (standard Euclidean norm), and $t > 0$, then

$$\det(I + t v v^T) = 1 + t$$

Can anyone prove this or provide a reference?

More generally, is there also an (easy) formula for calculating $\det(A + wv^T)$ for $v,w \in \mathbb{K}^{d \times 1}$ and an (invertible) matrix $A \in \mathbb{K}^{d \times d}$?


This is a rank-one update; for a reference, see *Matrix Analysis and Applied Linear Algebra*, Carl D. Meyer, page 475:

If $A_{n \times n} $ is nonsingular, and if $\mathbf{c}$ and $\mathbf{d} $ are $n \times 1$ columns, then \begin{equation} \det(\mathbf{I} + \mathbf{c}\mathbf{d}^T) = 1 + \mathbf{d}^T\mathbf{c} \tag{6.2.2} \end{equation} \begin{equation} \det(A + \mathbf{c}\mathbf{d}^T) = \det(A)(1 + \mathbf{d}^T A^{-1}\mathbf{c}) \tag{6.2.3} \end{equation}

So in your case, take $A=\mathbf{I}$, $\mathbf{c} = t\mathbf{v}$, and $\mathbf{d} = \mathbf{v}$: the determinant is $1\,(1+ t\mathbf{v}^T\mathbf{v})=1+t$, since $\mathbf{v}^T\mathbf{v} = \|\mathbf{v}\|^2 = 1$.
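As a quick sanity check (my own sketch, not from Meyer's text), both identities can be verified on concrete $2 \times 2$ examples using exact rational arithmetic from the Python standard library:

```python
# Check (6.2.2) det(I + c d^T) = 1 + d^T c and
# (6.2.3) det(A + c d^T) = det(A)(1 + d^T A^{-1} c) on 2x2 examples.
from fractions import Fraction as F

def det2(m):
    """Determinant of a 2x2 matrix given as [[a, b], [c, d]]."""
    return m[0][0] * m[1][1] - m[0][1] * m[1][0]

def add_outer(a, c, d):
    """Return A + c d^T for a 2x2 matrix A and length-2 columns c, d."""
    return [[a[i][j] + c[i] * d[j] for j in range(2)] for i in range(2)]

I = [[F(1), F(0)], [F(0), F(1)]]

# (6.2.2) with c = t*v and d = v, where v is a unit vector and t = 2,
# so the determinant should be 1 + t = 3.
t = F(2)
v = [F(3, 5), F(4, 5)]                # v^T v = 9/25 + 16/25 = 1
lhs = det2(add_outer(I, [t * vi for vi in v], v))
print(lhs)                            # 3

# (6.2.3) with an arbitrary invertible A and columns c, d.
A = [[F(2), F(1)], [F(0), F(1)]]
c, d = [F(1), F(2)], [F(3), F(1)]
detA = det2(A)
# A^{-1} c via the explicit 2x2 inverse formula
Ainv_c = [(A[1][1] * c[0] - A[0][1] * c[1]) / detA,
          (-A[1][0] * c[0] + A[0][0] * c[1]) / detA]
rhs = detA * (1 + d[0] * Ainv_c[0] + d[1] * Ainv_c[1])
print(det2(add_outer(A, c, d)), rhs)  # 3 3
```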

EDIT. Further from the text:

Proof. The proof of (6.2.2) [the previous] follows by applying the product rules (p. 467) to \begin{equation} \pmatrix{\mathbf{I} & \mathbf{0} \\ \mathbf{d}^T & 1}\pmatrix{\mathbf{I} + \mathbf{c}\mathbf{d}^T& \mathbf{c} \\ \mathbf{0} & 1}\pmatrix{\mathbf{I} & \mathbf{0} \\ -\mathbf{d}^T & 1}=\pmatrix{\mathbf{I} & \mathbf{c} \\ \mathbf{0} & 1 + \mathbf{d}^T\mathbf{c}} \end{equation}
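Spelling out the step the text leaves implicit: taking determinants on both sides, each triangular factor on the left has determinant $1$, and the determinant of a block upper-triangular matrix is the product of the determinants of its diagonal blocks, so \begin{equation} \det(\mathbf{I} + \mathbf{c}\mathbf{d}^T) = \det\pmatrix{\mathbf{I} & \mathbf{c} \\ \mathbf{0} & 1 + \mathbf{d}^T\mathbf{c}} = 1 + \mathbf{d}^T\mathbf{c} \end{equation}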

To prove (6.2.3), write $A + \mathbf{c}\mathbf{d}^T = A ( \mathbf{I} + A^{-1}\mathbf{c}\mathbf{d}^T)$, and apply the product rule (6.1.15) along with (6.2.2).


I solved it. The determinant of $I+tvv^T$ is the product of its eigenvalues. Since $(I+tvv^T)v = v + tv(v^Tv) = (1+t)v$, the vector $v$ is an eigenvector with eigenvalue $1+t$. Because $I+tvv^T$ is real and symmetric, it has a basis of real, mutually orthogonal eigenvectors, one of which is $v$. If $w$ is any other vector in that basis, then $v^Tw = 0$, so $(I+tvv^T)w=w$; hence all the other eigenvalues are $1$, and the determinant is $(1+t)\cdot 1^{n-1} = 1+t$.
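This argument can be checked numerically; here is a small sketch of my own (not from a reference) for $n = 2$, again using exact rationals from the standard library:

```python
# With v a unit vector and t = 2, M = I + t v v^T should send
# v to (1+t)v and any w orthogonal to v to w itself.
from fractions import Fraction as F

t = F(2)
v = [F(3, 5), F(4, 5)]               # unit vector: v^T v = 1
w = [-v[1], v[0]]                    # orthogonal to v: v^T w = 0

M = [[(F(1) if i == j else F(0)) + t * v[i] * v[j] for j in range(2)]
     for i in range(2)]

def matvec(m, x):
    """Matrix-vector product for a 2x2 matrix and a length-2 vector."""
    return [sum(m[i][j] * x[j] for j in range(2)) for i in range(2)]

print(matvec(M, v) == [(1 + t) * vi for vi in v])   # True: eigenvalue 1 + t
print(matvec(M, w) == w)                            # True: eigenvalue 1
```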

I feel like I should have known this already. Can anyone provide a reference for this and similar facts?