Find the roots of a polynomial using its companion matrix

Since $p(x)$ is biquadratic (only even powers of $x$ appear), if $\alpha$ is a root, then $-\alpha$ is also a root, because $p(-\alpha)=p(\alpha)=0$.

Looking at $M$, you can notice that if you sum the entries of each column, you always get $1$. This implies that $1$ is an eigenvalue. (Do you know why?)
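One way to see it, in case the hint is too terse: if every column of $M$ sums to $1$, then the all-ones row vector is a left eigenvector,
$$\mathbf{1}^{\mathsf T}M=\mathbf{1}^{\mathsf T},$$
and since $M$ and $M^{\mathsf T}$ have the same characteristic polynomial, $1$ is an eigenvalue of $M$ as well.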

You have two roots now: $1$, and therefore also $-1$ by the first observation.

Continue with long division: divide $p(x)$ by $(x-1)(x+1)=x^2-1$ to find the remaining roots.
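For instance, assuming the polynomial consistent with the determinant computed below, $p(x)=x^4-10x^2+9$, the division gives
$$\frac{x^4-10x^2+9}{(x-1)(x+1)}=\frac{x^4-10x^2+9}{x^2-1}=x^2-9,$$
whose roots are the remaining two.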

If you want to use the matrix to find all eigenvalues, recall that $\det (M)$ is the product of all eigenvalues. You can easily compute $\det (M)$ through expansion along the fourth column to find $\det (M)=9$.
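The question's matrix isn't reproduced here, but the facts above pin it down if we assume the standard companion form (negated coefficients in the last column, ones on the subdiagonal): a biquadratic with $\det(M)=9$ and all column sums equal to $1$ forces $p(x)=x^4-10x^2+9$. A quick NumPy check of that assumption:

```python
import numpy as np

# Assumed reconstruction (not shown in the question): the standard
# companion matrix of p(x) = x^4 - 10x^2 + 9, with the negated
# coefficients in the last column and ones on the subdiagonal.
M = np.array([[0., 0., 0., -9.],
              [1., 0., 0.,  0.],
              [0., 1., 0., 10.],
              [0., 0., 1.,  0.]])

print(M.sum(axis=0))        # [1. 1. 1. 1.]  -- every column sums to 1
print(np.linalg.det(M))     # 9.0            -- matches det(M) = 9
```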

Use the first sentence in my answer again to find the other eigenvalues.
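Spelled out: by the first sentence, the remaining eigenvalues form a pair $\pm\beta$, so
$$\det(M)=1\cdot(-1)\cdot\beta\cdot(-\beta)=\beta^2=9\quad\Longrightarrow\quad\beta=3,$$
and the roots of $p$ are $\pm1$ and $\pm3$.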


It's no surprise that you got the characteristic equation from the companion matrix: the companion matrix is constructed precisely so that its characteristic polynomial is the given polynomial.

If you want to find the roots of a polynomial using a companion matrix, you would use one of the specialized numerical methods for computing the eigenvalues of a matrix. Such methods do not use the characteristic polynomial and are tailored to the structure (such as symmetry) of the matrix. See, for example, Numerical Recipes (the link goes to the Householder method, which is not applicable to your matrix because it is not symmetric, but it gives you the general idea of what's involved).

Typically, you would use a package to find eigenvalues numerically, because such routines are in general very complex and not worth trying to write on your own. LINPACK, Matlab, Mathematica, etc., all have such routines for general matrices.
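For instance, with NumPy (assuming the polynomial $x^4-10x^2+9$ reconstructed above; `np.roots` itself works by computing the eigenvalues of the companion matrix):

```python
import numpy as np

# np.roots forms the companion matrix of the coefficient list and
# returns its eigenvalues via LAPACK's general eigensolver.
# Coefficients assume p(x) = x^4 - 10x^2 + 9.
print(np.roots([1, 0, -10, 0, 9]))   # roots +-1 and +-3, up to ordering
```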


There exist numerical methods for finding the eigenvalues of a matrix that use only matrix-vector multiplication. Matrix-vector multiplication is very fast when the matrix is sparse, as it is in this case.

One popular method is the power method. Roughly speaking, you take a random vector $\bf r$ and calculate ${\bf M}^k{\bf r}$ and ${\bf M}^{k+1}{\bf r}$ for a large $k$. The components of $\bf r$ along the eigenvector with the largest-magnitude eigenvalue grow fastest (exponentially in $k$), so elementwise division of those two vectors gives an estimate of that eigenvalue. There are then ways to sequentially "remove" (deflate) the eigenpair that was found.

So a method doing this could look something like the following (a rough code sketch appears after the list):

  1. Generate $\bf r$ from a random distribution.
  2. Calculate ${\bf M}^k{\bf r}$ and ${\bf M}^{k+1}{\bf r}$ and divide them elementwise to get the largest-magnitude eigenvalue.
  3. Factor out (deflate) the eigenvalue-eigenvector pair by some method.
  4. Loop from 1 with a new random vector to get the second largest eigenvalue and so on until you have all you want.
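Here is a minimal NumPy sketch of steps 1-3; the function names and the choice of Wielandt deflation for step 3 are my own illustration, not a reference implementation:

```python
import numpy as np

def power_iteration(M, k=200, seed=0):
    """Estimate the largest-magnitude eigenpair of M by repeated
    matrix-vector products (steps 1 and 2 above)."""
    rng = np.random.default_rng(seed)
    r = rng.standard_normal(M.shape[0])     # step 1: random vector
    r /= np.linalg.norm(r)
    for _ in range(k):                      # step 2: form M^k r,
        r = M @ r                           # renormalizing along the
        r /= np.linalg.norm(r)              # way to avoid overflow
    w = M @ r                               # one more product: M^{k+1} r
    i = np.argmax(np.abs(r))                # elementwise division, taken
    return w[i] / r[i], r                   # at the largest component

def deflate(M, lam, v):
    """Step 3, via Wielandt deflation: sets the found eigenvalue to 0
    while leaving the other eigenvalues of M unchanged."""
    x = v / (v @ v)                         # any x with x @ v = 1 works
    return M - lam * np.outer(v, x)
```

One caveat for this particular problem: a biquadratic polynomial's companion matrix has its eigenvalues in $\pm$ pairs of equal magnitude, so the plain iteration does not converge; one workaround is to iterate with ${\bf M}^2$ (whose eigenvalues are the squares $\beta^2$) and take square roots at the end.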

One way to do the deflation step is described in "Matrix Analysis" by Horn and Johnson. I don't have the book with me at the moment, but I can add a page number later if you are interested.