Complex matrix that commutes with another complex matrix.

I am trying to learn some linear algebra, and currently I am having a difficult time grasping some of the concepts. I found this problem and have no idea how to start.

Assume that $\bf A$ is an $n\times n$ complex matrix which has a cyclic vector. Prove that if $\bf B$ is an $n\times n$ complex matrix that commutes with $\bf A$, then ${\bf B}=p({\bf A})$ for some polynomial $p$.

All I know at this point is that ${\bf AB}={\bf BA}$.


Let $v$ be a cyclic vector for $A$. Since the $n$ vectors $A^0v,A^1v,\dots,A^{n-1}v$ span ${\bf C}^n$, they are linearly independent and hence form a basis for ${\bf C}^n$. Thus, $$Bv=c_0A^0v+c_1A^1v+\cdots+c_{n-1}A^{n-1}v=p(A)v$$ for some constants $c_0,c_1,\dots,c_{n-1}$, where $$p(x)=c_0+c_1x+\cdots+c_{n-1}x^{n-1}.$$

Since $B$ commutes with $A$, it commutes with all powers of $A$, so $$B(A^rv)=A^rBv=A^rp(A)v=p(A)(A^rv)$$ for $r=0,1,\dots,n-1$ (here I've used $A^rp(A)=p(A)A^r$). But the vectors $A^rv$ form a basis, so $B=p(A)$: if $Bx=Cx$ for all $x$ in a basis, then $Bx=Cx$ for all $x$ in the vector space, which implies $B=C$.
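If it helps to see the argument in action, here is a minimal numerical sketch (not a proof). The setup is my own choice: $A$ is built with distinct eigenvalues so that a suitable $v$ is cyclic, and $B$ is built to commute with $A$ by sharing its eigenvectors; all names (`A`, `B`, `v`, `c`) are illustrative and not from the original post.

```python
import numpy as np

n = 5
rng = np.random.default_rng(0)

# A with distinct eigenvalues: A = V diag(lam) V^{-1}.
V = rng.standard_normal((n, n))
lam = np.arange(1, n + 1, dtype=float)          # distinct eigenvalues
A = V @ np.diag(lam) @ np.linalg.inv(V)

# B commutes with A because it has the same eigenvectors (different eigenvalues).
mu = rng.standard_normal(n)
B = V @ np.diag(mu) @ np.linalg.inv(V)
assert np.allclose(A @ B, B @ A)

# v = V @ ones has a nonzero component along every eigenvector, so it is cyclic for A.
v = V @ np.ones(n)

# Krylov basis v, Av, ..., A^{n-1} v as columns; invertible since v is cyclic.
K = np.column_stack([np.linalg.matrix_power(A, r) @ v for r in range(n)])

# Solve B v = c_0 v + c_1 A v + ... + c_{n-1} A^{n-1} v for the coefficients.
c = np.linalg.solve(K, B @ v)

# p(A) = c_0 I + c_1 A + ... + c_{n-1} A^{n-1} should reproduce B.
pA = sum(c[r] * np.linalg.matrix_power(A, r) for r in range(n))
print(np.allclose(pA, B))   # True (up to floating-point error)
```

The check mirrors the proof exactly: the coefficients are read off from the expansion of $Bv$ in the Krylov basis, and commutativity is what forces $p(A)$ to agree with $B$ on the rest of that basis.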


See my answers at "Given a matrix, is there always another matrix which commutes with it?" and at "cyclic vectors - cyclic subspaces".

From hardmath: a proof of the cyclic vector theorem.