Computing the dimension of the vector space of matrices that commute with a given matrix $B$.

Assume $B$ is in Jordan normal form (real; the complex case is left as an exercise). Assume the block $J_i = \lambda_i I + S$ is $n_i \times n_i$ for $1\le i \le p$, where $I$ is the identity and $S$ the shift matrix of suitable size. Then the space of matrices that commute with $B$ has dimension $$ \sum_{\substack{i,j=1 \\ \lambda_i = \lambda_j}}^p \min(n_i, n_j). $$

(I hope I found all cases.)
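The formula can be evaluated directly from the Jordan data. A minimal sketch for a hypothetical block structure $J_2(1)\oplus J_3(1)\oplus J_2(0)$ (my own example, not from the question):

```python
# Dimension of the commuting space from the Jordan data, per the formula above.
# blocks: list of (eigenvalue, size) pairs -- a made-up example.
blocks = [(1.0, 2), (1.0, 3), (0.0, 2)]

dim = sum(min(ni, nj)
          for (li, ni) in blocks
          for (lj, nj) in blocks
          if li == lj)
print(dim)  # -> 11
```

The two blocks for $\lambda=1$ contribute $2+2+2+3=9$ and the single block for $\lambda=0$ contributes $2$.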

Notes and proof (outline):

For $k\in\mathbb N$ let $I_k$ denote the $k\times k$ identity matrix and let $S_k$ denote the $k\times k$ shift matrix.

  1. Notice that the operator $C_B(A) = AB - BA$ can be written as a matrix–vector product using the Kronecker product and the vectorization operator $\operatorname{vec}$: $$ \operatorname{vec} C_B(A) = ((B^T \otimes I) - (I \otimes B)) \operatorname{vec}(A). $$ So, for an explicitly given $B$, you can compute the rank more easily.

  2. Also notice that for $B = QJQ^{-1}$, where $Q$ is non-singular, we have $$ \ker C_B = Q\ker C_J Q^{-1}. $$ So we may assume Jordan normal form w.l.o.g.

  3. Let $A\in\mathbb R^{m\times n}$. In case of $m \le n$, by induction, we obtain $$ S_m A = A S_n $$ if and only if $$ A = \begin{bmatrix} 0 & \bar A \end{bmatrix} $$ for some $$ \bar A \in \operatorname{span}\{ I, S_m, S_m^2, \dotsc, S_m^{m-1} \}. $$ Similarly, in case of $m \ge n$, by induction, we obtain $$ S_m A = A S_n $$ if and only if $$ A = \begin{bmatrix} \bar A \\ 0 \end{bmatrix} $$ for some $$ \bar A \in \operatorname{span}\{ I, S_n, S_n^2, \dotsc, S_n^{n-1} \}. $$ In either case the solution space has dimension $\min(m,n)$.

  4. For $B = \lambda I_n + S_n\in\mathbb R^{n\times n}$, where $\lambda\in\mathbb R$, we have $$ \ker C_B = \ker C_{S_n} = \operatorname{span}\{ I_n, S_n, S_n^2, \dotsc, S_n^{n-1} \}. $$

  5. Let $A = (A_{i,j})_{1\le i,j \le p}$ be a block matrix and $B = \operatorname{diag}(B_1,\dotsc, B_p)$ a block diagonal matrix w.r.t. the same partition, with $B_i = \lambda_i I_{n_i} + S_{n_i}$. Then we have $C_B(A) = 0$ if and only if $$ 0 = A_{i,j} B_j - B_i A_{i,j} = (\lambda_j - \lambda_i) A_{i,j} + (A_{i,j} S_{n_j} - S_{n_i} A_{i,j}) $$ for $1 \le i, j \le p$. That is, if $A_{i,j}\ne 0$, then $-(\lambda_j - \lambda_i)$ is an eigenvalue of the operator $X\mapsto C_{S_{n_j}, S_{n_i}}(X) := XS_{n_j} - S_{n_i} X$. However, that operator is nilpotent and hence has only the eigenvalue $0$. So $A_{i,j} = 0$ whenever $\lambda_i \ne \lambda_j$, and for $\lambda_i = \lambda_j$ note 3 shows that $A_{i,j}$ ranges over a space of dimension $\min(n_i, n_j)$; summing over all such pairs gives the claimed formula.
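Putting notes 1–5 together, the claimed dimension can be checked numerically: build $B$ from Jordan blocks, form the Kronecker-product matrix from note 1, and compare its nullity with the formula. A sketch with numpy (the block structure $J_2(1)\oplus J_3(1)\oplus J_2(0)$ is my own choice):

```python
import numpy as np

def jordan_block(lam, n):
    # lam * I_n + S_n, with S_n the shift matrix (ones on the superdiagonal)
    return lam * np.eye(n) + np.eye(n, k=1)

def block_diag(*blocks):
    # assemble a block diagonal matrix from square blocks
    n = sum(b.shape[0] for b in blocks)
    B = np.zeros((n, n))
    i = 0
    for b in blocks:
        k = b.shape[0]
        B[i:i+k, i:i+k] = b
        i += k
    return B

B = block_diag(jordan_block(1.0, 2), jordan_block(1.0, 3), jordan_block(0.0, 2))
n = B.shape[0]

# Matrix of A |-> AB - BA under vectorization (note 1)
M = np.kron(B.T, np.eye(n)) - np.kron(np.eye(n), B)
nullity = n * n - np.linalg.matrix_rank(M)
print(nullity)  # formula: (2+2+2+3) for lambda = 1, plus 2 for lambda = 0, = 11
```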


There is a theorem by Frobenius:

Let $A\in {\rm M}(n,F)$ with $F$ a field and let $d_1(\lambda)\mid d_2(\lambda)\mid\cdots\mid d_s(\lambda)$ be the invariant factors $\neq 1$ of $\lambda I-A$, and let $n_i=\deg d_i(\lambda)$. Then the dimension of the vector space of matrices that commute with $A$ is $$N=\sum_{j=1}^s (2s-2j+1)n_j.$$
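As a sanity check of the formula on a hand-picked example: for the nilpotent matrix $A = J_3(0)\oplus J_2(0)$ the invariant factors $\ne 1$ of $\lambda I - A$ are $\lambda^2 \mid \lambda^3$, so $s=2$, $n_1=2$, $n_2=3$, and the formula gives $N = 3\cdot 2 + 1\cdot 3 = 9$. A direct nullity computation agrees:

```python
import numpy as np

# A = J_3(0) (+) J_2(0): two nilpotent shift blocks of sizes 3 and 2.
# Frobenius: N = (2*2 - 2*1 + 1)*2 + (2*2 - 2*2 + 1)*3 = 6 + 3 = 9.
A = np.zeros((5, 5))
A[0, 1] = A[1, 2] = A[3, 4] = 1.0

n = A.shape[0]
M = np.kron(A.T, np.eye(n)) - np.kron(np.eye(n), A)
N = n * n - np.linalg.matrix_rank(M)
print(N)  # -> 9
```

The Jordan-block formula above gives the same count: $\min(3,3)+\min(3,2)+\min(2,3)+\min(2,2)=9$.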


Alternative approach. The space you are looking at is the kernel of the commutator map at $B$, $$\mathrm{Mat}_{5×5}(ℝ) → \mathrm{Mat}_{5×5}(ℝ),~A ↦ AB - BA,$$ which is $ℝ$-linear. You can compute the rank of this map by feeding it a basis of $\mathrm{Mat}_{5×5}(ℝ)$ and determining the dimension of the space spanned by the images. Then the rank–nullity theorem gives the dimension of the kernel. So it comes down to computing the dimension of the linear hull of a system of matrices.
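A short sketch of this basis computation; the $5\times 5$ matrix $B$ below is a made-up example, $J_2(2)\oplus J_1(3)\oplus J_1(3)\oplus J_1(3)$:

```python
import numpy as np

# Hypothetical example: B = J_2(2) (+) J_1(3) (+) J_1(3) (+) J_1(3).
B = np.diag([2.0, 2.0, 3.0, 3.0, 3.0])
B[0, 1] = 1.0

n = B.shape[0]
images = []
for i in range(n):
    for j in range(n):
        E = np.zeros((n, n))
        E[i, j] = 1.0                  # standard basis matrix E_ij
        images.append((E @ B - B @ E).ravel())

rank = np.linalg.matrix_rank(np.array(images))
print(n * n - rank)  # dimension of the commuting space, by rank-nullity
```

For this $B$, the Jordan formula predicts $\min(2,2) + 3\cdot 3\cdot\min(1,1) = 2 + 9 = 11$, so the rank should come out as $25 - 11 = 14$.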