What are the necessary conditions for a family of matrices to commute?

Suppose we have a set of matrices in which every matrix commutes with every other matrix in the set.

What are the necessary conditions for this?


Over an algebraically closed field, if a family of matrices commutes pairwise, then the matrices are simultaneously triangularizable. As Robert Israel pointed out in the comments, the converse is not true.
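For a concrete illustration of why the converse fails (an example of my own, not from the linked comments): any two upper triangular matrices are trivially simultaneously triangularizable, yet
$$\begin{pmatrix}1&1\\0&1\end{pmatrix}\begin{pmatrix}1&0\\0&2\end{pmatrix}=\begin{pmatrix}1&2\\0&2\end{pmatrix}\neq\begin{pmatrix}1&1\\0&2\end{pmatrix}=\begin{pmatrix}1&0\\0&2\end{pmatrix}\begin{pmatrix}1&1\\0&1\end{pmatrix}.$$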

For diagonalizable matrices there is a sharper statement, and there the condition is an equivalence: a family of diagonalizable matrices commutes pairwise if and only if it is simultaneously diagonalizable.
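A quick numerical sanity check of the diagonalizable case (a minimal sketch of my own in NumPy; the two matrices here are illustrative, not from the answer):

```python
import numpy as np

# Two commuting symmetric, hence diagonalizable, matrices.
A = np.array([[2.0, 1.0], [1.0, 2.0]])
B = np.array([[1.0, 3.0], [3.0, 1.0]])
assert np.allclose(A @ B, B @ A)  # the pair commutes

# A has distinct eigenvalues, so its eigenvector basis is essentially
# unique; the theorem says the same basis must diagonalize B as well.
_, P = np.linalg.eig(A)
Pinv = np.linalg.inv(P)
print(np.round(Pinv @ A @ P, 10))  # diagonal (eigenvalues 3 and 1)
print(np.round(Pinv @ B @ P, 10))  # also diagonal (eigenvalues 4 and -2)
```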

These are the standard results on commuting matrices.


There is a quirky sufficient condition. If every matrix in the set is a polynomial (or, indeed, an analytic function) of some single matrix $A$ (which need not itself be in the set), then they all commute, since any two powers of $A$ commute: $A^iA^j=A^{i+j}=A^jA^i$. Over the reals this includes matrices such as $e^A.$
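To illustrate (my own sketch, assuming NumPy and SciPy are available; the random matrix and the polynomial are arbitrary choices):

```python
import numpy as np
from scipy.linalg import expm  # matrix exponential

# Any polynomial in A commutes with any other function of A built
# from powers of A, e.g. with e^A.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

pA = 2 * np.linalg.matrix_power(A, 3) - A + np.eye(4)  # p(A) = 2A^3 - A + I
eA = expm(A)                                           # e^A = sum_k A^k / k!
print(np.allclose(pA @ eA, eA @ pA))  # True, up to floating-point error
```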

I think it unlikely that one can find such an $A$ for every set in which each pair of matrices commutes. It's just a cute idea.


I'll try to rephrase the answer by Will Jagy a bit more explicitly, and add some more detail.

Characterising all sets of commuting matrices (other than by the condition that they commute) won't be easy, because given any set of commuting matrices (and there are certainly such sets that are infinite), any subset of it will also be a set of commuting matrices. So it is more fruitful to ask for maximal commuting sets of matrices: sets such that no matrix outside the set commutes with every matrix in the set (for otherwise one could add such a matrix to the set).

Now given any set of commuting matrices, we can always take scalar multiples of one of them, or sums of several of them, to get other matrices that commute with all of them; therefore a maximal set of commuting matrices must be a subspace of the vector space of all matrices (i.e., it is closed under linear combinations). Moreover, it must also be closed under products of matrices (the technical term is that it must be a subalgebra of the algebra of all square matrices). Given any one matrix $A$, the smallest subalgebra that contains $A$ is the set of polynomials in $A$, the linear combinations of powers $A^i$, including $A^0=I_n$.
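As a small illustration of that last point (my own sketch, in SymPy): by Cayley–Hamilton, $A^n$ is a linear combination of lower powers, so the polynomials in an $n\times n$ matrix $A$ span a space of dimension at most $n$.

```python
import sympy as sp

A = sp.Matrix([[0, 1, 0], [0, 0, 1], [1, 0, 0]])  # a sample 3 x 3 matrix
powers = [A**k for k in range(4)]                 # I, A, A^2, A^3
# Flatten each power into a row vector and compute the rank of the stack.
M = sp.Matrix([list(P) for P in powers])
print(M.rank())  # 3: A^3 is already a linear combination of I, A, A^2
```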

**Correction** I wrote here earlier that every maximal commutative subalgebra has dimension $n$ and is the set of polynomials in one matrix (I thought that this was implied by the answer by Will Jagy, but it clearly isn't, and I thought that I could give a somewhat complicated argument for it, but I can't). This is not true. Indeed, there is a $5$-dimensional commutative subalgebra of $M_4(K)$ of matrices of the form $$ \begin{pmatrix}x&0&a&b\\0&x&c&d\\0&0&x&0\\0&0&0&x\end{pmatrix} $$ and for dimension reasons it cannot be the set of polynomials in any single matrix: as noted above, the polynomials in a $4\times4$ matrix span a space of dimension at most $4$.
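One can check symbolically that this family does commute (a quick SymPy verification of my own; the matrices are exactly those displayed above):

```python
import sympy as sp

def elt(x, a, b, c, d):
    # Generic element of the 5-dimensional subalgebra above.
    return sp.Matrix([[x, 0, a, b],
                      [0, x, c, d],
                      [0, 0, x, 0],
                      [0, 0, 0, x]])

P = elt(*sp.symbols('x1 a1 b1 c1 d1'))
Q = elt(*sp.symbols('x2 a2 b2 c2 d2'))
print(sp.expand(P * Q - Q * P))  # the zero matrix: the family commutes
```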

A question to which I do not know the answer is: can two commuting matrices $X,Y$ always be expressed as polynomials in some third matrix $A$? I initially thought, inspired by the above, that $$ X=\begin{pmatrix}0&0&1&0\\0&0&0&1\\0&0&0&0\\0&0&0&0\end{pmatrix},\qquad Y=\begin{pmatrix}0&0&1&0\\0&0&0&-1\\0&0&0&0\\0&0&0&0\end{pmatrix}, $$ provided a negative answer; however, it turns out that for $$ A=\begin{pmatrix}1&0&1&0\\0&0&0&1\\0&0&1&0\\0&0&0&0\end{pmatrix} $$ (which is not in the commutative subalgebra above) one has $Y=A(A-I)=A^2-A$ and $X=A(A-I)(2A-I)=2A^3-3A^2+A$.
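These identities are easy to verify by hand, or with a short SymPy check (my own sketch; the matrices are exactly the $X$, $Y$, $A$ above):

```python
import sympy as sp

X = sp.Matrix([[0, 0, 1, 0], [0, 0, 0, 1], [0, 0, 0, 0], [0, 0, 0, 0]])
Y = sp.Matrix([[0, 0, 1, 0], [0, 0, 0, -1], [0, 0, 0, 0], [0, 0, 0, 0]])
A = sp.Matrix([[1, 0, 1, 0], [0, 0, 0, 1], [0, 0, 1, 0], [0, 0, 0, 0]])

print(X * Y == Y * X)                # True: X and Y commute
print(A**2 - A == Y)                 # True: Y = A(A - I)
print(2 * A**3 - 3 * A**2 + A == X)  # True: X = A(A - I)(2A - I)
```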