If a symmetric matrix commutes with all symmetric matrices, is it then a multiple of the identity?

Let $E_{ij}$ denote the matrix with a $1$ in the $(i,j)$-th place and $0$'s everywhere else. Then $E_{ii}$ is symmetric, and since $E_{ii}A$ retains only the $i$-th row of $A$ while $AE_{ii}$ retains only the $i$-th column, it follows from $$E_{ii}A=AE_{ii}$$ that the $i$-th row and column of $A$ are all zeros except in the $i$-th place. This shows that $A$ is diagonal.

Also $E_{ij}+E_{ji}$ (for $i\neq j$) is symmetric, and so from $$(E_{ij}+E_{ji})A=A(E_{ij}+E_{ji}),$$ it follows that the $i$-th and $j$-th diagonal entries of $A$ are the same. This shows that $A=cI$ for some constant $c$.
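As a numerical sanity check (a sketch using NumPy, not part of the proof), one can verify that the space of symmetric matrices commuting with every $E_{ii}$ and $E_{ij}+E_{ji}$ is one-dimensional, i.e. exactly the multiples of the identity:

```python
import numpy as np

n = 4
# Basis of symmetric n x n matrices: E_ii and E_ij + E_ji (i < j)
basis = []
for i in range(n):
    for j in range(i, n):
        M = np.zeros((n, n))
        M[i, j] = M[j, i] = 1.0
        basis.append(M)

# Stack the linear constraints [A, S] = 0 for each basis element S,
# with the symmetric unknown A expanded in the same basis.
cols = [np.concatenate([(B @ S - S @ B).ravel() for S in basis]) for B in basis]
L = np.stack(cols, axis=1)

# The null space is the set of symmetric A commuting with all symmetric S.
null_dim = L.shape[1] - np.linalg.matrix_rank(L)
print(null_dim)  # 1: only the multiples of I remain
```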

Note that none of this assumes that the matrices are real; this holds for matrices over any commutative ring.


Let $V_1, V_2$ be eigenvectors of $A$ with distinct eigenvalues $\lambda_1, \lambda_2$. Let $C$ be an arbitrary symmetric matrix. We have:

$$\langle V_1 |CA|V_2\rangle = \lambda_2 \langle V_1 |C|V_2\rangle, \qquad \langle V_1 |CA|V_2\rangle = \langle V_1 |AC|V_2\rangle = \lambda_1 \langle V_1 |C|V_2\rangle.$$ So we have $\lambda_2 \langle V_1 |C|V_2\rangle = \lambda_1 \langle V_1 |C|V_2\rangle$ for any symmetric matrix $C$. Setting $C = | V_1 + V_2\rangle \,\langle V_1 + V_2 |$ and using $\langle V_1 | V_2 \rangle = 0$ (eigenvectors for distinct eigenvalues of a symmetric matrix are orthogonal), we get $\langle V_1 |C|V_2\rangle = \|V_1\|^2\,\|V_2\|^2 \neq 0$, hence $\lambda_1 = \lambda_2$. This is a contradiction, so $A$ does not have distinct eigenvalues; since $A$ is symmetric and hence diagonalizable, $A = \lambda I$.

Edit: we can strengthen the result by dropping the symmetry assumption on $A$. Without symmetry, arguing with eigenvectors of $A$ becomes harder; instead, start with $V_1,V_2$ as eigenvectors of $C$ rather than of $A$. The same algebraic manipulations give:

$$\lambda_2 \langle V_1 |A|V_2\rangle = \lambda_1 \langle V_1 |A|V_2\rangle,$$ where $V_1, V_2$ are eigenvectors of $C$ with distinct eigenvalues. Since $C$ is a freely chosen symmetric operator, any pair of orthogonal vectors arises this way, so $\langle V_1| A |V_2 \rangle = 0$ for all $V_1,V_2$ with $\langle V_1 | V_2 \rangle = 0$. Thus $A V_2$ is orthogonal to the orthogonal complement of $V_2$, so $A V_2$ is proportional to $V_2$, for every vector $V_2$. A matrix that sends every vector to a multiple of itself is scalar, so $A= \lambda I$.
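The strengthened claim can also be checked numerically (a sketch assuming NumPy): among all $n\times n$ matrices, not just the symmetric ones, those commuting with every symmetric matrix form a one-dimensional space:

```python
import numpy as np

n = 4
def unit(i, j):
    M = np.zeros((n, n))
    M[i, j] = 1.0
    return M

# Symmetric constraint matrices S, and an unrestricted (not necessarily
# symmetric) unknown A expanded in the full matrix-unit basis E_ij.
sym_basis = [unit(i, j) + unit(j, i) for i in range(n) for j in range(i, n)]
full_basis = [unit(i, j) for i in range(n) for j in range(n)]

cols = [np.concatenate([(B @ S - S @ B).ravel() for S in sym_basis])
        for B in full_basis]
L = np.stack(cols, axis=1)

null_dim = L.shape[1] - np.linalg.matrix_rank(L)
print(null_dim)  # 1: even without symmetry, only multiples of I commute
```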


Here's a different proof that has more machinery but may be interesting in terms of technique. A repeatedly useful idea for these types of questions is to make use of the basics of representation theory of finite groups, in particular leaning on permutation groups and Schur's Lemma.

What follows below works for $\mathbb R$ and its extension $\mathbb C$. (In the complex case the proof runs essentially the same whether we consider $A$ to be hermitian or symmetric.)

radically streamlined proof
To prove this claim for an arbitrary symmetric $n\times n$ matrix $A$, first consider the standard matrix representation of $S_{n+1}$, and call this matrix group $M^{(S_{n+1})}$. As a representation, it is the direct sum of the trivial representation and an irreducible $n$-dimensional representation.

The generators of $M^{(S_{n+1})}$ are $(n+1)\times (n+1)$ elementary type-2 matrices $E_2^{(k)}$ (transposition matrices), each of which is symmetric. Now consider an orthogonal $U \in \mathbb R^{(n+1)\times (n+1)}$ whose first column satisfies $\mathbf u_1 \propto \mathbf 1$. Thus for $P \in M^{(S_{n+1})}$,

$P=\prod_k E_2^{(k)}$ and $P' = U^TPU= U^T\big(\prod_k E_2^{(k)}\big)U = \prod_k \big(U^TE_2^{(k)}U\big)$
Each $\big(U^TE_2^{(k)}U\big)$ is necessarily symmetric and of the form
$\begin{bmatrix}1 & \mathbf 0\\ \mathbf 0 & Y_{n}^{(k)} \end{bmatrix},$
where the $Y_n^{(k)}$ generate the aforementioned $n$-dimensional irreducible representation, and each $Y_n^{(k)}$ is thus symmetric.

Hence the $n$-dimensional irreducible matrix representation of $S_{n+1}$ is generated by symmetric matrices (the $Y_n$ blocks). And $A$ commutes with each of these symmetric generators, hence $A$ commutes with the entire $n$-dimensional irreducible representation. Temporarily working over $\mathbb C$, we apply Schur's Lemma $\implies A=\lambda I_n$, where $\lambda \in \mathbb R$ since $A$ is real.
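The claimed block structure of the conjugated generators is easy to verify numerically (a sketch with NumPy; the orthogonal $U$ is built here by a QR factorization with first column forced proportional to $\mathbf 1$):

```python
import numpy as np

n = 3          # size of A; the group S_{n+1} acts on R^{n+1}
m = n + 1

# Orthogonal U whose first column is proportional to the all-ones vector
X = np.random.default_rng(0).normal(size=(m, m))
X[:, 0] = 1.0
U, _ = np.linalg.qr(X)

for k in range(m - 1):
    E2 = np.eye(m)
    E2[[k, k + 1]] = E2[[k + 1, k]]   # adjacent transposition (k, k+1)
    P = U.T @ E2 @ U
    # Each conjugated generator is symmetric and block-diagonal:
    # a 1 in the corner (E2 fixes the ones vector) and a Y_n block below.
    assert np.allclose(P, P.T)
    assert np.allclose(P[0, 1:], 0.0) and np.allclose(P[1:, 0], 0.0)
    assert np.isclose(P[0, 0], 1.0)
print("all conjugated generators are symmetric and block-diagonal")
```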

original, longer proof
0.) $2\times 2$ Case
By direct calculation you can show the result (i.e. $A$ commutes with a diagonal matrix with distinct diagonal entries, so $A$ must be diagonal; $A$ then commutes with some non-diagonal symmetric matrix, so $A\propto I$).
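For concreteness, the $2\times 2$ calculation can be done symbolically (a sketch assuming SymPy; the particular matrices $\operatorname{diag}(1,2)$ and the swap matrix are one convenient choice, not the only one):

```python
import sympy as sp

a, b, c = sp.symbols("a b c")
A = sp.Matrix([[a, b], [b, c]])   # generic symmetric 2x2

D = sp.diag(1, 2)                 # diagonal with distinct entries
S = sp.Matrix([[0, 1], [1, 0]])   # symmetric, non-diagonal

# Commutation with D forces b = 0; commutation with S then forces c = a.
eqs = list(A * D - D * A) + list(A * S - S * A)
sol = sp.solve(eqs, [b, c], dict=True)[0]
print(A.subs(sol))   # Matrix([[a, 0], [0, a]]), i.e. a multiple of I
```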

An immediate corollary for the $n\times n$ case:
if $A = \begin{bmatrix}* & \mathbf 0\\ \mathbf 0 & \lambda I_{n-1} \end{bmatrix}$ and $A$ commutes with all symmetric matrices, then $A=\lambda I_n$.

1.) $n\times n$ Case
$A$ commutes with all symmetric matrices, hence it commutes with all elementary type-2 matrices $E_2^{(j)}$. These generate the standard matrix representation of the symmetric group $S_n$, which is a finite group. I'll call this matrix group $M^{(S_n)}$. Since $A$ commutes with the generators, $A$ commutes with all of $M^{(S_n)}$.

The representation $M^{(S_n)}$ is the direct sum of the trivial representation and an irreducible $(n-1)$-dimensional representation (this is, e.g., an exercise in the representation theory chapter of Artin's Algebra, either edition).

Finally, consider a real orthogonal $Q$ with $\mathbf q_1 = \frac{1}{\sqrt{n}}\mathbf 1$ (i.e. proportional to the ones vector), and for arbitrary $P \in M^{(S_n)}$ define
$P':= Q^T PQ=\begin{bmatrix}1 & \mathbf 0\\ \mathbf 0 & B_{n-1} \end{bmatrix}$
where $B_{n-1}$ is the irreducible $n-1$ dimensional representation mentioned above.

We observe $P' = Q^T\big(\prod_{j} E_2^{(j)}\big)Q =\prod_{j} \big(Q^T E_2^{(j)}Q\big)$, each factor of which is symmetric, so $A$ commutes with $P'$. There are various ways to finish.

E.g. using $P_c' = Q^T P_cQ$, where $P_c$ is the cyclic shift operator (i.e. the companion matrix of $x^n-1$), we have $\big(P_c'-I_n\big)A\mathbf e_1 = A\big(P_c'-I_n\big)\mathbf e_1 = \mathbf 0$, so $A\mathbf e_1\in \ker \big(P_c'-I_n\big)$. The eigenvalue $1$ of $P_c$ is simple, with eigenvector $\mathbf 1$, which $Q^T$ maps to $\sqrt n\,\mathbf e_1$; hence $\ker \big(P_c'-I_n\big) = \big\{\alpha\, \mathbf e_1\big\}$ and $A\mathbf e_1 \propto \mathbf e_1$. Thus

$A = \begin{bmatrix}* & *\\ \mathbf 0 & Z_{n-1} \end{bmatrix}=\begin{bmatrix}* & \mathbf 0\\ \mathbf 0 & Z_{n-1} \end{bmatrix}$
since $A$ is symmetric.
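The kernel claim for the shifted cyclic operator can be checked numerically (a sketch with NumPy; $Q$ is again built by a QR factorization with first column proportional to $\mathbf 1$):

```python
import numpy as np

n = 5
# Cyclic shift P_c (companion matrix of x^n - 1): e_k -> e_{k+1 mod n}
Pc = np.roll(np.eye(n), 1, axis=0)

# Orthogonal Q whose first column is proportional to the ones vector
X = np.random.default_rng(1).normal(size=(n, n))
X[:, 0] = 1.0
Q, _ = np.linalg.qr(X)

M = Q.T @ Pc @ Q - np.eye(n)

# ker(P_c' - I) is one-dimensional and spanned by e_1
assert np.linalg.matrix_rank(M) == n - 1   # eigenvalue 1 of P_c is simple
assert np.allclose(M[:, 0], 0.0)           # (P_c' - I) e_1 = 0
print("ker(P_c' - I) = span(e_1)")
```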

But $Z_{n-1}$ commutes with every $B_{n-1}$, and these form an irreducible representation; temporarily working over the extension field $\mathbb C$, Schur's Lemma tells us that $Z_{n-1} = \lambda I_{n-1}$, and application of the above corollary gives $A=\lambda I_n$.