Is there an operation on matrices such that the determinant yields a homomorphism to the additive group of the reals?

It is well known that, under standard matrix multiplication, $\det(AB) = \det(A)\det(B)$, or in other words, that $\det : \langle\mathbb{R}^{n \times n}, \cdot\rangle \rightarrow \langle\mathbb{R}, \cdot \rangle$ is a monoid homomorphism.

In a similar vein, given any two matrices $A, B \in \mathbb{R}^{n \times n}$, is there an operation $A \star B$ such that $\det(A \star B) = \det(A) + \det(B)$?


Solution 1:

Let $A \star B$ be the diagonal matrix whose diagonal entries are all equal to $1$ except the top-left one, which is equal to $\det A + \det B$.
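For concreteness, here is a quick numerical sanity check of this construction (a NumPy sketch; the helper name `star` is mine):

```python
import numpy as np

def star(A, B):
    """A ⋆ B: the identity matrix with its top-left entry replaced by det(A) + det(B)."""
    C = np.eye(A.shape[0])
    C[0, 0] = np.linalg.det(A) + np.linalg.det(B)
    return C

rng = np.random.default_rng(0)
A, B = rng.standard_normal((3, 3)), rng.standard_normal((3, 3))
# The determinant of a diagonal matrix is the product of its entries,
# which here is exactly det(A) + det(B).
print(np.isclose(np.linalg.det(star(A, B)),
                 np.linalg.det(A) + np.linalg.det(B)))  # True
```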

Solution 2:

If $A, B$ are invertible, decompose each as $A = \det(A)^{1/n} \bar A$, where $\bar A$ has determinant $1$ (assume the real $n$-th roots exist, say because $n$ is odd or the determinants are positive), and define $A \star B = (\det A + \det B)^{1/n}\, \bar A \bar B$. For $A$ singular and $B$ invertible, define $A \star B = B \star A = B$; when both are singular, define $A \star B = 0$.
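A rough NumPy sketch of this piecewise definition (the name `star` is mine, and it assumes the real $n$-th roots exist, e.g. all determinants involved are positive):

```python
import numpy as np

def star(A, B):
    """Piecewise 'star' built from the decomposition A = det(A)^(1/n) * A_bar.
    Assumes the real n-th roots below exist (e.g. positive determinants)."""
    n = A.shape[0]
    dA, dB = np.linalg.det(A), np.linalg.det(B)
    if np.isclose(dA, 0) and np.isclose(dB, 0):
        return np.zeros((n, n))        # both singular
    if np.isclose(dA, 0):
        return B                       # A singular, B invertible
    if np.isclose(dB, 0):
        return A                       # B singular, A invertible
    A_bar = A / dA ** (1 / n)          # det(A_bar) = 1
    B_bar = B / dB ** (1 / n)          # det(B_bar) = 1
    return (dA + dB) ** (1 / n) * (A_bar @ B_bar)

rng = np.random.default_rng(0)
# Small perturbations of the identity keep all determinants positive.
A = np.eye(3) + 0.1 * rng.standard_normal((3, 3))
B = np.eye(3) + 0.1 * rng.standard_normal((3, 3))
print(np.isclose(np.linalg.det(star(A, B)),
                 np.linalg.det(A) + np.linalg.det(B)))  # True
```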

It ain't pretty. It probably isn't even associative.

Actually, this is overcomplicated: just define $A \star B = {\rm diag}(x, 1, 1, \ldots, 1)$ where $x = \det A + \det B$. This is associative, but it trivializes the question: you can write this operation as $+ \circ d$, where $d(A) = {\rm diag}(\det A, 1, \ldots, 1)$. In essence, it takes the determinant first and identifies this one-dimensional family of matrices with the reals, so that the homomorphism we desire is really just the identity map on $\mathbb{R}$.
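A small NumPy sketch of this version, showing the $+ \circ d$ factorization and checking associativity numerically (the names `d` and `star` are mine):

```python
import numpy as np

def d(A):
    """d(A) = diag(det A, 1, ..., 1)."""
    D = np.eye(A.shape[0])
    D[0, 0] = np.linalg.det(A)
    return D

def star(A, B):
    """A ⋆ B: '+' applied to the distinguished (0, 0) slots of d(A) and d(B)."""
    D = d(A)
    D[0, 0] += d(B)[0, 0]
    return D

rng = np.random.default_rng(1)
A, B, C = (rng.standard_normal((3, 3)) for _ in range(3))
# Both groupings give diag(det A + det B + det C, 1, 1), so ⋆ is associative.
print(np.allclose(star(star(A, B), C), star(A, star(B, C))))  # True
```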

Is there a meaningful operation with this property? I doubt it.

Solution 3:

This is actually impossible for $n=0$, since there the determinant is always equal to $1$, and $1+1\neq1$. (Recall that the determinant of an identity matrix is always $1$; this is true even for the $0\times0$ matrix.) You might call this an edge case, but failure in such a simple case might be a warning that you are on the wrong track.