Can the matrices $A$ and $I+A$ have the same determinant? Can they have the same eigenvalues?

The first thing that comes to mind: take $n$ even and $A=-I/2$; then $\det A=(-1/2)^n=(1/2)^n=\det(I+A)$.
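A quick numerical sketch of this example (the value $n=4$ and the use of NumPy are my own choices, not part of the original answer):

```python
# Check det(A) == det(I + A) for A = -I/2 with n even (here n = 4, chosen arbitrarily).
import numpy as np

n = 4
A = -np.eye(n) / 2                      # A = -I/2, so I + A = I/2
print(np.linalg.det(A))                 # (-1/2)^4 = 0.0625
print(np.linalg.det(np.eye(n) + A))     # ( 1/2)^4 = 0.0625
```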


Your first question already has many answers. So I'll address your second question.

Edit: in view of Marc van Leeuwen's recommended answer and examples, let me make precise how I interpret your question. I read it as: is it possible that $A$ and $I+A$ have the same complex eigenvalues, repeated according to their algebraic multiplicities? I also assume you mean $n\geq 1$. With these assumptions, the answer to your second question is no.

Proof 1: we have $$ \mbox{tr}(A+I)-\mbox{tr} (A) =\mbox{tr}(A)+n-\mbox{tr}(A)=n\geq 1. $$ So $A+I$ and $A$ have distinct traces. Since the trace is the sum of the eigenvalues counted with multiplicities, they can't have the same eigenvalues.
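A one-line check of this trace identity on a random matrix (NumPy and the size $n=3$ are my own choices):

```python
# tr(A + I) - tr(A) always equals n, regardless of A.
import numpy as np

n = 3
A = np.random.rand(n, n)
print(np.trace(A + np.eye(n)) - np.trace(A))   # 3.0 (up to rounding)
```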

Proof 2: let $p_A(X)=\det (XI-A)$ be the characteristic polynomial of $A$. The characteristic polynomial of $I+A$ satisfies $$ p_{I+A}(X)=\det(XI-I-A)=\det((X-1)I-A)=p_A(X-1). $$ If $A$ and $I+A$ have the same eigenvalues with multiplicities, they have the same characteristic polynomial, that is, $$ p_A(X)=p_A(X-1). $$ If $p_A(\lambda)=0$, then $p_A(\lambda-1)=0$ as well, and by induction $p_A(\lambda -k)=0$ for every integer $k\geq 0$. So $p_A$ would be a degree $n\geq 1$ polynomial with infinitely many roots, which is impossible.
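A symbolic sketch of the identity $p_{I+A}(X)=p_A(X-1)$, assuming SymPy; the generic $2\times 2$ matrix is my own choice of example:

```python
# Verify p_{I+A}(X) = p_A(X - 1) for a generic 2x2 matrix.
from sympy import Matrix, eye, symbols, simplify

x, a, b, c, d = symbols('x a b c d')
A = Matrix([[a, b], [c, d]])

p_A  = A.charpoly(x).as_expr()              # p_A(X)  = det(XI - A)
p_IA = (A + eye(2)).charpoly(x).as_expr()   # p_{I+A}(X)

print(simplify(p_IA - p_A.subs(x, x - 1)))  # 0
```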

Proof 3: much better than proofs 1 and 2, actually. It follows immediately from the definition of the spectrum that $\mbox{Spectrum}(A+I)=\mbox{Spectrum}(A)+1$. Taking the maximum of the real parts of these sets, we get

$$ \max\; \mbox{Re} \;\mbox{Spectrum}(A+I)=\max \;\mbox{Re} \;\mbox{Spectrum}(A)+1. $$ So $A$ and $A+I$ can't have the same spectra, even without taking multiplicities into account.

Note: the last argument also works for the real spectrum as soon as it is nonempty, which happens simultaneously for $A$ and $A+I$. It also works more generally in a Banach algebra. And as pointed out by Marc van Leeuwen, building on leonbloy's observation, it can be summarized simply as: there is no nonempty finite subset of $\mathbb{R}$ that is invariant under $\lambda\longmapsto \lambda +1$. You can now replace finite by compact, and $\mathbb{R}$ by $\mathbb{C}$, to get the general Banach algebra case.
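A numerical illustration of the max-real-part argument in Proof 3 (the random $4\times 4$ matrix and NumPy are my own choices):

```python
# max Re Spectrum(A + I) equals max Re Spectrum(A) + 1.
import numpy as np

A = np.random.rand(4, 4)
spec_A  = np.linalg.eigvals(A)
spec_AI = np.linalg.eigvals(A + np.eye(4))

print(max(spec_A.real) + 1)   # these two values agree up to rounding,
print(max(spec_AI.real))      # so the two spectra cannot coincide
```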


For the second question,

$${\bf A p} = \lambda {\bf p} \iff {\bf (A + I)\, p } = (\lambda +1) \, {\bf p}$$

This says that the eigenvalues of $A+I$ are the eigenvalues of $A$ incremented by $1$, hence the two sets cannot be the same. (It can happen, of course, that some eigenvalue of $A+I$ is also an eigenvalue of $A$.)

(Update) I assumed here that we are considering the "full" set of eigenvalues (in $\mathbb{C}$). If we restrict to real eigenvalues, the same argument applies; but now, as noted in other answers, the two sets can coincide iff both are empty.
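A quick check of this eigenvalue shift on a random matrix (NumPy; the $3\times 3$ example is my own choice):

```python
# The eigenvalues of A + I are exactly the eigenvalues of A shifted by 1.
import numpy as np

A = np.random.rand(3, 3)
eig_A  = np.sort_complex(np.linalg.eigvals(A))
eig_AI = np.sort_complex(np.linalg.eigvals(A + np.eye(3)))

print(np.allclose(eig_AI, eig_A + 1))   # True
```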


$A=\begin{bmatrix} -1 & 0 \\ 0 & 0 \end{bmatrix}$. $A+I = \begin{bmatrix} 0 & 0 \\ 0 & 1 \end{bmatrix}$. $\det A = 0$, $\det (I+A) = 0$.

The requirement is that if $A$ has eigenvalues $\lambda_k$, then $\det A = \prod_k \lambda_k = \prod_k (\lambda_k+1) = \det (A+I)$. This is easy to arrange if we choose a singular matrix with one eigenvalue at $-1$.

To get a non-singular example, suppose $\lambda_2=1$ to simplify; then the other eigenvalue must satisfy $1 \cdot \lambda_1 = (1+1)(\lambda_1 +1)$, which reduces to $\lambda_1 = -2$. Hence $A=\begin{bmatrix} 1 & x \\ 0 & -2 \end{bmatrix}$ works (for any $x$).
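Both examples can be checked numerically; a sketch with NumPy (the value $x=5$ is an arbitrary choice of mine):

```python
# Singular example: eigenvalues -1 and 0.
import numpy as np

A1 = np.array([[-1.0, 0.0],
               [ 0.0, 0.0]])
print(np.linalg.det(A1), np.linalg.det(A1 + np.eye(2)))   # both zero

# Non-singular example: eigenvalues 1 and -2, with x = 5.
A2 = np.array([[1.0, 5.0],
               [0.0, -2.0]])
print(np.linalg.det(A2), np.linalg.det(A2 + np.eye(2)))   # both -2.0
```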


Just to give the mathemapedantically correct answer to the second question: yes, the eigenvalues of $A$ can be the same as those of $A+I$, but only if there aren't any of them. Real square matrices correspond to operators on real vector spaces, and such operators may well have no eigenvalues at all. A typical example arises for $$ A=\begin{pmatrix}0&-1\\1&0\end{pmatrix}.$$ If you view $A$ as a matrix over $\Bbb C$, then the corresponding complex-linear operator does have eigenvalues, namely $\mathbf i$ and $-\mathbf i$, which differ from those, $1+\mathbf i$ and $1-\mathbf i$, of the linear operator associated to $A+I$. So with this interpretation $A$ is not an example.
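For what it's worth, a numerical check of these complex eigenvalues (NumPy is my own choice of tool):

```python
# Over C, the rotation matrix has eigenvalues ±i, while A + I has 1 ± i.
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])
print(np.linalg.eigvals(A))              # approximately [ i, -i]
print(np.linalg.eigvals(A + np.eye(2)))  # approximately [1+i, 1-i]
```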

However even over the complex numbers an operator might not have any eigenvalues. This happens if (and only if) the space it is defined on is of dimension $0$. So you get a $0\times0$-matrix $A$ in this case, for which even $A=A+I$ holds (here $I$ is also the $0\times0$-matrix; it is both an identity matrix and a zero matrix).

The reason that $A$ and $A+I$ can only have the same eigenvalues if there aren't any is of course (as leonbloy indicates) that, immediately from the definition, $\lambda$ is an eigenvalue of $A$ if and only if $\lambda+1$ is an eigenvalue of $A+I$, and the empty set is the only finite subset of $\Bbb R$ that is invariant under translation by $1$.