Showing when a permutation matrix is diagonalizable over $\mathbb R$ and over $\mathbb C$

Solution 1:

This addresses the question of diagonalizability over $\Bbb R$. Over $\Bbb C$, there is no problem: $A_\sigma^m=I_n$, where $m$ is the order of $\sigma$ in the group $S_n$, so the minimal polynomial of $A_\sigma$ divides $X^m-1$, which has distinct roots in $\mathbb C$; thus $A_\sigma$ is diagonalizable over $\mathbb C$.
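A quick numerical sketch of this fact (not part of the proof): for a sample permutation, $A_\sigma^m=I_n$ with $m$ the order of $\sigma$, and all eigenvalues lie on the unit circle. The helper `perm_matrix` and the 0-based indexing are conventions introduced here, not from the answer.

```python
import numpy as np

def perm_matrix(sigma):
    """Permutation matrix A with A e_i = e_{sigma(i)} (0-based indices)."""
    n = len(sigma)
    A = np.zeros((n, n))
    for i, j in enumerate(sigma):
        A[j, i] = 1.0
    return A

# A 3-cycle times a disjoint transposition: order lcm(3, 2) = 6.
sigma = [1, 2, 0, 4, 3]
A = perm_matrix(sigma)

# A^6 = I, so the minimal polynomial divides X^6 - 1.
assert np.array_equal(np.linalg.matrix_power(A, 6), np.eye(5))

# Over C the eigenvalues are roots of unity, hence of modulus 1.
assert np.allclose(np.abs(np.linalg.eigvals(A)), 1.0)
```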


Use the cycle decomposition of $\sigma$: if $\sigma=c_1\cdots c_p$, where the $c_i$ are cycles with disjoint supports, then you can rearrange the standard basis $(e_1,\dots,e_n)$ so that the supports of the $c_i$ are consecutive and in order. This shows that $A_\sigma$ is conjugate to the block-diagonal matrix $$\begin{pmatrix}C_1\\&C_2\\ &&\ddots\\&&& C_p\end{pmatrix}$$ where each $C_i$ is the $l_i\times l_i$ cyclic permutation matrix $$C_i\;=\begin{pmatrix} 0&&&&1\\ 1&0&&&\\ &1&\ddots&&\\ &&\ddots&0&\\ &&&1&0\end{pmatrix}$$ and $l_i$ is the length of the cycle $c_i$. Each $C_i$ has minimal polynomial $X^{l_i}-1$, so the minimal polynomial of $A_\sigma$ is $$\mu_{A_\sigma}=\mathrm{lcm}\,(X^{l_1}-1,X^{l_2}-1,\dots,X^{l_p}-1).$$ A matrix is diagonalizable over $\Bbb R$ iff its minimal polynomial splits into distinct linear factors over $\Bbb R$; since $X^l-1$ has non-real roots for all $l\geq 3$, while $X-1$ and $X^2-1$ have distinct real roots, $A_\sigma$ is diagonalizable over $\Bbb R$ iff every $l_i\leq 2$, i.e. iff $\sigma$ is a (possibly empty) product of disjoint transpositions: $$A_\sigma \text{ is diagonalizable over } \Bbb R \Longleftrightarrow\sigma^2=\mathrm{id}$$
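The criterion can be checked numerically on small examples (a sketch, with matrices written out by hand): a transposition has only real eigenvalues, while a 3-cycle has non-real ones, matching the $\sigma^2=\mathrm{id}$ characterization.

```python
import numpy as np

# sigma = (1 2): a transposition, sigma^2 = id.
transposition = np.array([[0, 1, 0],
                          [1, 0, 0],
                          [0, 0, 1]], dtype=float)

# sigma = (1 2 3): a 3-cycle, sigma^2 != id.
three_cycle = np.array([[0, 0, 1],
                        [1, 0, 0],
                        [0, 1, 0]], dtype=float)

# Transposition: eigenvalues 1, 1, -1, all real -> diagonalizable over R.
assert np.allclose(np.linalg.eigvals(transposition).imag, 0)

# 3-cycle: eigenvalues include primitive cube roots of unity, not all real.
assert not np.allclose(np.linalg.eigvals(three_cycle).imag, 0)
```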

Solution 2:

It is not hard to check that $A_{\sigma}$ has finite order for each $\sigma \in S_n$, where $S_n$ denotes the group of permutations of the set $\{1,\cdots,n\}$: this is because $A_{\sigma}^m = A_{\sigma^m}$, where $\sigma^m$ denotes the $m$-fold composition of $\sigma$ with itself. I don't know how familiar you are with representation theory, but it is a general result of the representation theory of finite groups that given a (complex) representation $\rho : G \to \mathrm{GL}(V)$ of a finite group $G$, the operator $\rho(\sigma)$ is diagonalizable for each $\sigma \in G$. I will sketch the proof below, in case you don't know about this. (I will not use the language of representation theory, only complex linear algebra.)
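The identity $A_{\sigma}^m = A_{\sigma^m}$ can be sketched numerically (the helpers `perm_matrix` and `compose`, and the 0-based indexing, are conventions introduced here):

```python
import numpy as np

def perm_matrix(sigma):
    """Permutation matrix A with A e_i = e_{sigma(i)} (0-based indices)."""
    n = len(sigma)
    A = np.zeros((n, n))
    for i, j in enumerate(sigma):
        A[j, i] = 1.0
    return A

def compose(s, t):
    """Composition (s o t)(i) = s(t(i))."""
    return [s[t[i]] for i in range(len(s))]

sigma = [2, 0, 3, 1]
sigma3 = compose(sigma, compose(sigma, sigma))  # sigma^3 as a permutation

# The matrix of sigma^3 equals the cube of the matrix of sigma.
assert np.array_equal(np.linalg.matrix_power(perm_matrix(sigma), 3),
                      perm_matrix(sigma3))
```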

Let $\{e_1,\cdots,e_n\}$ be the $\mathbb C$-basis of $\mathbb C^n$ with respect to which you write $A_{\sigma}$ as usual, i.e. so that $A_{\sigma}(e_i) = e_{\sigma(i)}$. Let $\langle - ,-\rangle$ denote the standard Hermitian inner product on $\mathbb C^n$, and define a new Hermitian inner product $(-,-)$ on $\mathbb C^n$ via $$ (x,y) \overset{def}= \frac 1{n!} \sum_{\sigma \in S_n} \langle A_{\sigma} x, A_{\sigma}y \rangle. $$ It is an easy check that $(-,-)$ satisfies the axioms of a Hermitian inner product. Since it is a Hermitian inner product, there exists an orthonormal basis $\{v_1,\cdots,v_n\}$ with respect to $(-,-)$. Then for $\pi \in S_n$ we have $$ (A_{\pi}v_i, A_{\pi}v_j) = \frac 1{n!} \sum_{\sigma \in S_n} \langle A_{\sigma}(A_{\pi} v_i), A_{\sigma}(A_{\pi}v_j) \rangle = \frac 1{n!} \sum_{\sigma \in S_n} \langle A_{\sigma} v_i, A_{\sigma}v_j \rangle = (v_i, v_j) = \delta_{ij}, $$ because summing over all permutations of the form $\sigma \pi$ is the same as summing over all permutations $\sigma$. It follows that the vectors $\{A_{\pi}v_1,\cdots,A_{\pi}v_n\}$ are orthonormal with respect to $(-,-)$, so the matrix of $A_{\pi}$ in the basis $\{v_1,\cdots,v_n\}$ is unitary with respect to $(-,-)$, hence diagonalizable over $\mathbb C$ by the spectral theorem.
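The averaging trick can be sketched numerically for $n=3$, restricted to real vectors for simplicity (the helper `perm_matrix` is introduced here): build the Gram matrix of the averaged form and check that every $A_\pi$ preserves it. For permutation matrices the averaged form in fact coincides with the standard one, since each $A_\sigma$ is already orthogonal; the averaging is what makes the argument work for an arbitrary finite group.

```python
import itertools
import numpy as np

def perm_matrix(sigma):
    """Permutation matrix A with A e_i = e_{sigma(i)} (0-based indices)."""
    n = len(sigma)
    A = np.zeros((n, n))
    for i, j in enumerate(sigma):
        A[j, i] = 1.0
    return A

n = 3
perms = [perm_matrix(p) for p in itertools.permutations(range(n))]

# Gram matrix G of the averaged form, so that (x, y) = x^T G y.
G = sum(A.T @ A for A in perms) / len(perms)

# Here the averaging changes nothing: G is the identity.
assert np.allclose(G, np.eye(n))

# Invariance: (A_pi x, A_pi y) = (x, y) for every pi, i.e. A^T G A = G.
for A in perms:
    assert np.allclose(A.T @ G @ A, G)
```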

Remark: You can also check that since $A_{\pi}^{n!} = A_{\pi^{n!}} = I$, the identity matrix, all the eigenvalues $\lambda$ of $A_{\pi}$ satisfy $\lambda^{n!} = 1$, i.e. they are $n!$-th roots of unity. (The exponent $n!$ is just to be safe; for most eigenvalues the actual order is much smaller, and for $n \geq 3$ an eigenvalue of order exactly $n!$ never occurs, since $S_n$ then contains no element of order $n!$.) I don't know much about representation theory over the real numbers, but there is probably a well-understood theory of this; I would be surprised if someone into real representation theory had not characterized this already.
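The remark can be checked exhaustively for $n=3$ (a sketch; `perm_matrix` is a helper introduced here): every eigenvalue $\lambda$ of every $3\times 3$ permutation matrix satisfies $\lambda^{3!}=\lambda^6=1$.

```python
import itertools
import numpy as np

def perm_matrix(sigma):
    """Permutation matrix A with A e_i = e_{sigma(i)} (0-based indices)."""
    n = len(sigma)
    A = np.zeros((n, n))
    for i, j in enumerate(sigma):
        A[j, i] = 1.0
    return A

# Every eigenvalue of every 3x3 permutation matrix is a 6th root of unity.
for p in itertools.permutations(range(3)):
    for lam in np.linalg.eigvals(perm_matrix(p)):
        assert np.isclose(lam ** 6, 1.0)
```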

Hope that helps,

Solution 3:

Hint. Permuting the columns of a matrix $A$ amounts to multiplying $A$ on the right by a permutation matrix $P$, which is invertible. Do you see why? To which matrix would you apply $P$ to obtain $A_{\sigma}$?
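A small sketch of the hint (the matrix `A` and the permutation are arbitrary examples chosen here): right multiplication by $P$ sends column $\sigma(i)$ of $A$ to column $i$ of $AP$.

```python
import numpy as np

A = np.arange(9, dtype=float).reshape(3, 3)

sigma = [2, 0, 1]            # 0 -> 2, 1 -> 0, 2 -> 1 (0-based indices)
P = np.zeros((3, 3))
for i, j in enumerate(sigma):
    P[j, i] = 1.0            # P e_i = e_{sigma(i)}

# Column i of A P is A (P e_i) = A e_{sigma(i)}, i.e. column sigma(i) of A.
assert np.array_equal((A @ P)[:, 0], A[:, 2])
assert np.array_equal((A @ P)[:, 1], A[:, 0])
```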

Solution 4:

Terminology: A matrix $M$ is said to be diagonalizable if there exists an invertible matrix $P$ and a diagonal matrix $D$ such that $M=P^{-1}DP$.

Spectral theorem: A matrix $M$ is unitarily diagonalizable iff it is normal, i.e., $MM^*=M^*M$.

Not an answer, but an experiment; it's too big for the comments area, so I'm posting it here.

Case n = 2: Both are symmetric, so they are diagonalizable. $$\begin{bmatrix} 1 &0 \\ 0 &1 \end{bmatrix} \qquad \begin{bmatrix} 0 &1 \\ 1 &0 \end{bmatrix}$$

Case n = 3: The first four are symmetric, so they are diagonalizable. I'm not sure about the remaining two. $$\begin{bmatrix} 1 &0 &0 \\ 0 &1 &0\\ 0 &0 &1 \end{bmatrix} \qquad \begin{bmatrix} 1 &0 &0 \\ 0 &0 &1\\ 0 &1 &0 \end{bmatrix} \qquad \begin{bmatrix} 0 &1 &0 \\ 1 &0 &0\\ 0 &0 &1 \end{bmatrix} \qquad \begin{bmatrix} 0 &0 &1 \\ 0 &1 &0\\ 1 &0 &0 \end{bmatrix}$$ $$\begin{bmatrix} 0 &0 &1 \\ 1 &0 &0\\ 0 &1 &0 \end{bmatrix} \qquad \begin{bmatrix} 0 &1 &0 \\ 0 &0 &1\\ 1 &0 &0 \end{bmatrix}$$
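The experiment's open question can be settled numerically (a sketch; `perm_matrix` is a helper introduced here, consistent with Solution 1's criterion): the last two matrices are the two 3-cycles, whose eigenvalues include non-real cube roots of unity, so exactly four of the six $3\times 3$ permutation matrices have all-real eigenvalues.

```python
import itertools
import numpy as np

def perm_matrix(sigma):
    """Permutation matrix A with A e_i = e_{sigma(i)} (0-based indices)."""
    n = len(sigma)
    A = np.zeros((n, n))
    for i, j in enumerate(sigma):
        A[j, i] = 1.0
    return A

# Count how many 3x3 permutation matrices have only real eigenvalues.
real_count = 0
for p in itertools.permutations(range(3)):
    eigvals = np.linalg.eigvals(perm_matrix(p))
    if np.allclose(np.imag(eigvals), 0):
        real_count += 1

# Identity plus the three transpositions; the two 3-cycles are excluded.
assert real_count == 4
```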