A characterization of 'orthogonal' matrices

(All matrices in this post are assumed to be real square matrices.)

Recently I answered a question on this site where one was asked to show that if $A$ is a symmetric matrix and $U$ is orthogonal, then $U^{-1}AU$ is symmetric. I wondered whether the converse is true as well. Concretely:

Suppose that $U$ is an invertible matrix such that for any symmetric matrix $A$, we have that $U^{-1}AU$ is symmetric. Can we conclude that $U$ is orthogonal?

So I tried to prove this and quickly realized that it is almost true: one can conclude that $UU^t=\lambda\, Id$ for some (necessarily positive) real number $\lambda$. (This is what I would call an orthogonal matrix; an orthonormal matrix is one such that $UU^t=Id$.) I'll sketch my proof below:

Let $A$ be symmetric and assume $U^{-1}AU$ is symmetric. Taking transposes, $U^{-1}AU=(U^{-1}AU)^t=U^tA(U^t)^{-1}$, which rearranges to $(UU^t)A=A(UU^t)$. Since this equality holds for every symmetric matrix $A$, it lets us pin down conditions on $UU^t$. So I chose the easiest basis of the symmetric matrices I could think of and plugged it in. After some calculations you're able to conclude that $(UU^t)_{i,j}=0$ if $i\neq j$ and $(UU^t)_{i,i}=(UU^t)_{j,j}$ for all $i,j$. This is what we needed to show.
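
As a quick numerical sanity check (my own code, using numpy; not part of the original post), one can verify both directions on random matrices: a scalar multiple of an orthonormal matrix does preserve symmetry under conjugation, while a generic invertible matrix does not.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build U with U U^t = lambda * Id: a scalar multiple of an orthonormal matrix.
Q, _ = np.linalg.qr(rng.standard_normal((n, n)))   # Q satisfies Q Q^t = Id
U = 3.0 * Q                                        # U U^t = 9 * Id

A = rng.standard_normal((n, n))
A = A + A.T                                        # a random symmetric A

# Conjugating by U keeps A symmetric.
B = np.linalg.inv(U) @ A @ U
print(np.allclose(B, B.T))

# A generic invertible V (with V V^t not a multiple of Id) fails for this A.
V = np.eye(n) + np.triu(np.ones((n, n)), k=1)
C = np.linalg.inv(V) @ A @ V
print(np.allclose(C, C.T))
```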

There is nothing wrong with the above proof; it's just a brute-force method. I was wondering whether anyone has a fun, conceptual way of proving this. I'd very much like to see such a proof.

EDIT: In particular, I'm looking for an argument that a student could find at the beginning of a linear algebra course. (Diagonalization and the spectral theorem are a bit too advanced for now.)


Solution 1:

Every nonzero vector is an eigenvector of some symmetric matrix with distinct eigenvalues, and commuting diagonalisable matrices are simultaneously diagonalisable; so the fact that $UU^t$ commutes with every symmetric matrix $A$ already implies that every nonzero vector is an eigenvector of $UU^t$. Here is a more elementary way to see the same thing:

As you said, $UU^t$ commutes with every symmetric matrix $A$. In particular, for every nonzero vector $v$, applying this to the symmetric matrix $vv^t$ gives $(UU^tv)v^{\,t}=(UU^t)(vv^t)=(vv^t)(UU^t)=v(UU^tv)^t$, where the last equality uses $(UU^t)^t=UU^t$. Applying both sides to $v$ shows that either $UU^tv=0$ or $UU^tv$ is a nonzero scalar multiple of $v$. In either case, $v$ is an eigenvector of $UU^t$.
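
To illustrate the key step numerically (my own sketch, not part of the original answer): a symmetric matrix $M$ commutes with the rank-one matrix $vv^t$ exactly when $v$ is an eigenvector of $M$.

```python
import numpy as np

# A symmetric M with distinct eigenvalues and known eigenvectors.
M = np.diag([1.0, 2.0, 3.0])

e1 = np.array([1.0, 0.0, 0.0])   # an eigenvector of M
v = np.array([1.0, 1.0, 0.0])    # not an eigenvector of M

def commutes_with_outer(M, v):
    P = np.outer(v, v)           # the symmetric rank-one matrix v v^t
    return np.allclose(M @ P, P @ M)

print(commutes_with_outer(M, e1))   # e1 is an eigenvector, so they commute
print(commutes_with_outer(M, v))    # v is not, so they do not
```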

So, the problem statement boils down to a more fundamental and well-known one: if a matrix (not necessarily symmetric) has the property that every nonzero vector is an eigenvector of it, then it must be a scalar multiple of the identity matrix.
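
For completeness, here is a sketch of that standard fact (my own write-up, not part of the original answer). Write $Mv=\lambda_v v$ for each nonzero $v$; the point is that all the $\lambda_v$ coincide.

```latex
% For linearly independent v, w, compute M(v+w) in two ways:
M(v+w) = \lambda_{v+w}(v+w)
  \quad\text{and}\quad
M(v+w) = Mv + Mw = \lambda_v v + \lambda_w w .
% Comparing coefficients (v and w are independent) gives
\lambda_v = \lambda_{v+w} = \lambda_w .
% If instead w = c v with c \neq 0, then \lambda_w = \lambda_v directly.
% Hence \lambda_v = \lambda for a single constant \lambda, and M = \lambda\, Id.
```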

Solution 2:

A diagonalizable matrix is symmetric iff its eigenspaces are orthogonal. So conjugation by $U$ preserves symmetric matrices iff $U$ preserves orthogonality of arbitrary subspaces of $\mathbb{R}^n$. It is clear that this is equivalent to $U$ preserving orthogonality of pairs of vectors in $\mathbb{R}^n$.

So, all that remains to be verified is that if $U$ preserves orthogonality of vectors, then it is a multiple of an orthogonal matrix. Note that $$\langle v+w,v-w\rangle=\langle v,v\rangle-\langle w,w\rangle,$$ so two vectors $v$ and $w$ have the same norm iff $v+w$ and $v-w$ are orthogonal. Hence $U$ preserves the property of two vectors having the same norm; in particular, all unit vectors are mapped to vectors of one common norm $\lambda$, and by homogeneity $\|Uv\|=\lambda\|v\|$ for every $v$. Then $\lambda^{-1}U$ preserves norms and hence is orthogonal, so $U$ is a constant multiple of an orthogonal matrix.
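
A small numerical check of the two ingredients above (my own sketch, using numpy): the polarization-style identity, and the fact that a multiple of an orthogonal matrix scales all norms by the same factor.

```python
import numpy as np

rng = np.random.default_rng(1)
v, w = rng.standard_normal(3), rng.standard_normal(3)

# <v+w, v-w> = |v|^2 - |w|^2
lhs = np.dot(v + w, v - w)
rhs = np.dot(v, v) - np.dot(w, w)
print(np.isclose(lhs, rhs))

# U = 2.5 * Q with Q orthogonal multiplies every norm by 2.5.
Q, _ = np.linalg.qr(rng.standard_normal((3, 3)))
U = 2.5 * Q
print(np.isclose(np.linalg.norm(U @ v), 2.5 * np.linalg.norm(v)))
```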

[Note that, by a similar argument, a complex matrix whose conjugation action preserves Hermitian matrices is a multiple of a unitary matrix. The last part of the argument does not quite work for a complex inner product, since the inner product identity used above is no longer true in general. However, the identity does still hold when $v$ and $w$ are orthogonal, so $U$ preserves orthogonal pairs of vectors having the same norm. You can then show that $U$ multiplies all norms by a constant by breaking vectors into orthogonal pieces.]
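
The easy direction of the complex analogue can also be checked numerically (my own sketch, not part of the original answer): conjugating a Hermitian matrix by a scalar multiple of a unitary matrix yields a Hermitian matrix.

```python
import numpy as np

rng = np.random.default_rng(2)
n = 3

# A scalar multiple of a unitary matrix: U U^* = 4 * Id.
X = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
Q, _ = np.linalg.qr(X)               # Q is unitary
U = 2.0 * Q

# A random Hermitian matrix H.
H = rng.standard_normal((n, n)) + 1j * rng.standard_normal((n, n))
H = H + H.conj().T

# The conjugate U^{-1} H U is again Hermitian.
B = np.linalg.inv(U) @ H @ U
print(np.allclose(B, B.conj().T))
```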