A problem on condition $\det(A+B)=\det(A)+\det(B)$

Let $A$ be an $n\times n$ matrix such that for every $n\times n$ matrix $B$ we have $\det(A+B)=\det(A)+\det(B)$. Does this imply that $A=0$, or at least that $\det(A)=0$?


No. In the 1-by-1 case a matrix is a scalar and $\det$ is the identity map, so $\det(A+B)\equiv A+B\equiv\det(A)+\det(B)$ holds for every $A$ and $B$.
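A trivial numerical check of the 1-by-1 case (numpy; the scalar values are arbitrary):

```python
import numpy as np

# For 1x1 matrices, det is the identity map on the single entry,
# so additivity of det is just additivity of scalars.
A = np.array([[3.0]])
B = np.array([[5.0]])
assert np.isclose(np.linalg.det(A + B), np.linalg.det(A) + np.linalg.det(B))
```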

However, $A=0$ does follow when $n>1$. Multiplying $A$ on both sides by invertible matrices preserves the hypothesis (since $\det(P(A+B)Q)=\det P\,\det(A+B)\,\det Q$ and $B\mapsto PBQ$ is a bijection), so we may assume that $A=I_k\oplus0_{(n-k)\times(n-k)}$, where $k$ is the rank of $A$. If $k=n$, i.e. $A=I$, consider $B=0\oplus -I_{n-1}$. Then $\det(A+B)=0$ while $\det(A)+\det(B)=1+0$, a contradiction. Thus $k<n$. Now let $B=0_{k\times k}\oplus I_{n-k}$. Then $A+B=I$, so $\det(A+B)=\det(A)+\det(B)$ reads $1=0+\det(B)$. But $\det(B)=0$ whenever $k\geq1$, so $k$ must equal $0$, i.e. $A=0$.
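Both choices of $B$ above can be sanity-checked numerically (a numpy sketch; the sizes $n=3$, $k=2$ are illustrative):

```python
import numpy as np

n, k = 3, 2
# Case A = I (k = n): B = 0 ⊕ -I_{n-1} gives det(A+B) = 0
# while det(A) + det(B) = 1 + 0.
A = np.eye(n)
B = np.diag([0.0] + [-1.0] * (n - 1))
assert np.isclose(np.linalg.det(A + B), 0.0)
assert np.isclose(np.linalg.det(A) + np.linalg.det(B), 1.0)

# Case 0 < k < n: A = I_k ⊕ 0, B = 0_k ⊕ I_{n-k} gives det(A+B) = 1
# while det(A) + det(B) = 0, so the identity forces k = 0.
A = np.diag([1.0] * k + [0.0] * (n - k))
B = np.diag([0.0] * k + [1.0] * (n - k))
assert np.isclose(np.linalg.det(A + B), 1.0)
assert np.isclose(np.linalg.det(A) + np.linalg.det(B), 0.0)
```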


When $n\geq 2$ and $A,B$ are $n\times n$ matrices over a principal ideal domain $R$ (a fortiori any field), this condition implies $A=0$. The argument of @user1551 works the same.

By the Smith normal form decomposition, there exist invertible matrices $P,Q$ and a diagonal matrix $D$ such that $$ PAQ=D. $$ Then $$ \det (D+PBQ)=\det P \det (A+B)\det Q=\det P(\det A+\det B)\det Q =\det D+\det(PBQ), $$ so $$ \det (D+C)=\det D+\det C \qquad\forall C\in M_n(R). $$ Assume, for contradiction, that $D$ is nonzero. Without loss of generality, we can assume that $D=\mbox{diag}(d_1,\ldots,d_k,0,\ldots,0)$ where $k\geq 1$ and all the $d_j\neq 0$.

If $k=n$, we get a contradiction with $C=\mbox{diag}(-d_1,0,\ldots,0)$: the identity reads $0=\det D+0$, while $\det D\neq 0$.

If $k<n$, we get a contradiction with $C=\mbox{diag}(0,\ldots,0,1,\ldots,1)$ where there are $k$ zeros: the identity reads $d_1\cdots d_k=0+0$, impossible since the $d_j$ are nonzero and $R$ is a domain.
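Both contradictions can be checked with exact integer matrices over $R=\mathbb{Z}$ (a numpy sketch; the values $d_j=2,3,5$ and the sizes are illustrative):

```python
import numpy as np

# Exact integer determinants (rounded from numpy's float computation).
def det_int(M):
    return round(np.linalg.det(M))

# Case k = n: C = diag(-d_1, 0, ..., 0) kills the first pivot.
D = np.diag([2, 3, 5])
C = np.diag([-2, 0, 0])
assert det_int(D + C) == 0            # left side of the identity
assert det_int(D) + det_int(C) == 30  # right side: det D != 0

# Case k < n (here k = 2): C = diag(0, ..., 0, 1, ..., 1) with k zeros.
D = np.diag([2, 3, 0])
C = np.diag([0, 0, 1])
assert det_int(D + C) == 6            # d_1 * d_2
assert det_int(D) + det_int(C) == 0   # the identity would force 6 = 0
```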

So $$ D=0\qquad \Rightarrow \quad A=P^{-1}DQ^{-1}=0. $$
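For concreteness over $R=\mathbb{Z}$, the Smith normal form used above can be computed with sympy's `smith_normal_form` (a sketch; the sample matrix is illustrative):

```python
from sympy import Matrix, ZZ
from sympy.matrices.normalforms import smith_normal_form

# A sample integer matrix; smith_normal_form returns the diagonal D
# with PAQ = D for some matrices P, Q invertible over Z.
A = Matrix([[2, 4], [6, 10]])
D = smith_normal_form(A, domain=ZZ)

assert D.is_diagonal()
# The invariant factors divide each other in succession...
assert D[1, 1] % D[0, 0] == 0
# ...and |det D| = |det A| since P, Q are unimodular.
assert abs(D.det()) == abs(A.det())
```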

Note: if $R$ is not a principal ideal domain, I have no idea.


As noted by @user1551, the condition can of course hold with $A \neq 0$ when $n = 1$.

If $n \ge 2$, consider the matrix $C$ that is zero everywhere except in its last row, which equals the first row of $A$ minus the last row of $A$. Then $A+C$ has its first and last rows equal, so $\det(A+C)=0$, and $\det(C)=0$ because $C$ has a zero row (as $n>1$). Hence $$ 0 = \det(A + C) = \det(A) + \det(C) = \det(A). $$
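The construction of $C$ can be checked for a random integer matrix $A$ (a numpy sketch; the size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4
A = rng.integers(-5, 6, size=(n, n)).astype(float)

# C: zero except its last row, which is (first row of A) - (last row of A).
C = np.zeros((n, n))
C[-1] = A[0] - A[-1]

# A + C has equal first and last rows, so det(A + C) = 0;
# C has a zero row (n > 1), so det(C) = 0 as well.
assert np.isclose(np.linalg.det(A + C), 0.0)
assert np.isclose(np.linalg.det(C), 0.0)
```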

Then, applying the hypothesis with $B = -\lambda I$, $$ \det(-A + \lambda I) = (-1)^{n}\det(A - \lambda I) = (-1)^{n}\bigl(\det(A) + \det(-\lambda I)\bigr) = (-1)^{n} \det(A) + \lambda^{n} = \lambda^{n}, $$ so all eigenvalues of $A$ are zero, and $A$ is nilpotent. Since conjugation preserves the hypothesis ($\det(PAP^{-1}+B)=\det(A+P^{-1}BP)$, and $B\mapsto P^{-1}BP$ is a bijection), we may assume $A$ is in Jordan form. Suppose first that $A \ne 0$ is a single Jordan block $$ A = \begin{bmatrix} 0&1&&&\\ &0&1&&\\ &&\ddots&\ddots&\\ &&&0&1\\ &&&&0 \end{bmatrix}, $$ where omitted entries are zero, and consider $$ B = \begin{bmatrix} 0&&&&\\ &0&&&\\ &&\ddots&&\\ &&&0&\\ 1&&&&0 \end{bmatrix}, $$ whose only nonzero entry is a $1$ in the bottom-left corner. Then $\det(A) = \det(B) = 0$, but $A+B$ is the cyclic shift matrix, so $\det(A+B) = (-1)^{n-1} \ne 0$, a contradiction. If $A \ne 0$ has several Jordan blocks, take instead the $B$ with a $1$ in position $(n,1)$ and in each superdiagonal position where $A$ has a zero: $A+B$ is again the cyclic shift, while $B$ has a zero row (one $1$ per block, and $A\ne0$ forces fewer than $n$ blocks), so $\det(B)=0$ and the same contradiction results.
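The single-block counterexample can be verified numerically (a numpy sketch; the size $n=4$ is illustrative):

```python
import numpy as np

n = 4
# A: single nilpotent Jordan block (ones on the superdiagonal).
A = np.diag(np.ones(n - 1), k=1)
# B: a single 1 in the bottom-left corner.
B = np.zeros((n, n))
B[-1, 0] = 1.0

# det(A) = det(B) = 0, but A + B is the cyclic shift matrix,
# whose determinant is the sign of an n-cycle: (-1)^(n-1) != 0.
assert np.isclose(np.linalg.det(A), 0.0)
assert np.isclose(np.linalg.det(B), 0.0)
assert np.isclose(np.linalg.det(A + B), (-1.0) ** (n - 1))
```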

It follows that $A = 0$.