Does every linear operator have a minimal polynomial?

I know that a linear operator $T$ defined on a finite-dimensional vector space has a minimal polynomial since, by the Cayley-Hamilton theorem, $g(T)=0$, where $g$ is the characteristic polynomial. Is there a linear operator defined on an infinite-dimensional vector space that has no minimal polynomial?
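For instance, here is a quick sympy check of the finite-dimensional situation (the $2\times 2$ matrix is an arbitrary choice for illustration): the characteristic polynomial annihilates the matrix, which is why an annihilating, and hence a minimal, polynomial always exists.

```python
import sympy as sp

# An arbitrary 2x2 example: its characteristic polynomial annihilates it
# (Cayley-Hamilton), so a minimal polynomial certainly exists.
A = sp.Matrix([[2, 1],
               [0, 2]])
lam = sp.symbols('lam')
charpoly = A.charpoly(lam)      # here (lam - 2)**2 = lam**2 - 4*lam + 4
coeffs = charpoly.all_coeffs()  # leading coefficient first

# Evaluate g(A) by Horner's scheme and confirm it is the zero matrix.
gA = sp.zeros(2, 2)
for c in coeffs:
    gA = gA * A + c * sp.eye(2)
print(gA)                       # Matrix([[0, 0], [0, 0]])
```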


Solution 1:

I would say most linear operators don't have a minimal polynomial. Take, for example, the space $\mathbb{R}^{\mathbb{N}}$ of all real sequences and define $$T((u_n)_{n\ge 0}) = (n u_n)_{n\ge 0}.$$ Every non-negative integer is an eigenvalue of $T$, and any annihilating polynomial would have to vanish at each of these eigenvalues, so there is no minimal polynomial.
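To spell this out, write $e_k$ for the sequence whose only nonzero term is a $1$ in position $k$. Then $$T e_k = k\,e_k \qquad\text{and hence}\qquad p(T)\,e_k = p(k)\,e_k$$ for every polynomial $p$, so $p(T)=0$ would force $p$ to have every $k\ge 0$ as a root, which is only possible for $p=0$.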

Another simple example is the differentiation operator $D(P)= P^\prime$ on the space $\mathbb{R}[X]$ of all real polynomials. If there were a minimal polynomial $m$, every polynomial would be a solution of the same linear differential equation with constant coefficients, $m(D)P=0$; but the solution space of such an equation is finite-dimensional, while $\mathbb{R}[X]$ is not.
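As a concrete check, here is a small sympy sketch with a hypothetical candidate annihilator $p(t)=t^2-3t+2$ (nothing depends on this particular choice; the helper `p_of_D` is just an ad hoc way of applying $p(D)$):

```python
import sympy as sp

x = sp.symbols('x')

def p_of_D(f, coeffs):
    # Apply p(D) = sum_k coeffs[k] * D^k to the polynomial f,
    # where D is differentiation with respect to x.
    return sum(c * sp.diff(f, x, k) for k, c in enumerate(coeffs))

# Hypothetical candidate annihilator p(t) = 2 - 3t + t^2
# (coefficients listed from degree 0 upward).
coeffs = [2, -3, 1]
print(p_of_D(x**3, coeffs))   # 2*x**3 - 9*x**2 + 6*x, not zero
```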

Solution 2:

Let $X$ be a complex Banach space and let $T:X \to X$ be a bounded linear operator. Let $p$ be a polynomial. The spectral mapping theorem says:

$$p(\sigma(T))= \sigma(p(T)),$$

where $\sigma(\cdot)$ denotes the spectrum.

If we suppose that $p(T)=0$ for some nonzero polynomial $p$, then $\sigma(p(T))=\sigma(0)=\{0\}$, so $\sigma(T)$ consists of zeros of $p$; hence $\sigma(T)$ is finite.

Consequence: if $\sigma(T)$ is infinite, then there is no nonzero polynomial $p$ such that $p(T)=0$; in particular, $T$ has no minimal polynomial.

An example: $X=\ell^2$ and $T$ the right shift, $T(x_1,x_2,\dots)=(0,x_1,x_2,\dots)$. We have $\sigma(T)=\{ z \in \mathbb C: |z| \le 1\}$, which is certainly infinite.
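Independently of the spectral argument, one can check by hand that no nonzero polynomial annihilates this shift: $p(T)$ applied to the first basis vector simply lists the coefficients of $p$. A small numerical sketch on a finite truncation (the truncation and the polynomial below are only illustrative; `T` keeps just the first `N` coordinates):

```python
import numpy as np

# Finite truncation of the right shift on l^2: only the first N coordinates.
N = 8
T = np.eye(N, k=-1)          # maps (x_1, ..., x_N) to (0, x_1, ..., x_{N-1})

# A hypothetical polynomial p(t) = 1 + 2t + 3t^2 (degree < N, so nothing is lost).
coeffs = [1.0, 2.0, 3.0]
e1 = np.zeros(N); e1[0] = 1.0

pT_e1 = sum(c * np.linalg.matrix_power(T, k) @ e1
            for k, c in enumerate(coeffs))
print(pT_e1)                  # [1. 2. 3. 0. 0. 0. 0. 0.] -- the coefficients of p
```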

Solution 3:

You don't need anything complicated to show that on an infinite-dimensional space minimal polynomials do not exist in general. Take the simplest infinite-dimensional space $K[X]$, and consider the linear operator $\phi$ of multiplication by any non-constant polynomial, say by $X$ to keep it as simple as possible. It is easy to see that $P[\phi]$ is multiplication by $P[X]=P$ for any $P\in K[X]$, so this is the zero operator only if $P=0$.
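A small sympy sketch of this (the helper `P_of_phi` and the particular $P$ are ad hoc choices): applying $P[\phi]$ to the constant polynomial $1$ returns $P$ itself, so $P[\phi]=0$ forces $P=0$.

```python
import sympy as sp

X = sp.symbols('X')

def P_of_phi(f, P):
    # phi is multiplication by X, so phi^k applied to f is X**k * f;
    # build P(phi) from these powers, mirroring the argument in the text.
    coeffs = sp.Poly(P, X).all_coeffs()[::-1]   # degree 0 upward
    return sp.expand(sum(c * X**k * f for k, c in enumerate(coeffs)))

P = X**2 - 5*X + 7             # an arbitrary nonzero polynomial
print(P_of_phi(1, P))          # X**2 - 5*X + 7  ==  P, hence P(phi) != 0
```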

(The more general case of a non-constant polynomial $Q$ in the place of $X$ requires you to think about what $P[Q]$ is, the result of substituting $Q$ for $X$ into $P$, and notably when it is $0$. But one easily sees that the coefficient of $X^{(\deg P)(\deg Q)}$ in $P[Q]$, its leading coefficient, is nonzero when $P\neq 0$, so there is no minimal polynomial here either.)
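A quick sympy check of the degree claim, with hypothetical choices of $P$ and $Q$:

```python
import sympy as sp

X = sp.symbols('X')
P = 2*X**3 + X - 4              # an arbitrary nonzero P
Q = X**2 + 1                    # a non-constant Q in place of X

PQ = sp.expand(P.subs(X, Q))    # P[Q]: substitute Q for X into P
print(sp.degree(PQ, X))         # 6 == (deg P)*(deg Q)
print(sp.Poly(PQ, X).LC())      # 2: the leading coefficient is nonzero
```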

Note that these operators have no eigenvalues at all either: multiplying a nonzero polynomial $S$ by a non-constant polynomial never gives a scalar multiple of $S$, by degree considerations. So here, contrary to the examples by Fred and Gribouillis, it is certainly not an overabundance of eigenvalues (eigenvalues have to be among the roots of any annihilating polynomial) that obstructs the existence of a minimal polynomial.

Solution 4:

You can consider the linear space $X$ of all complex continuous functions on $[0,1]$ and let $(Tf)(x)=\int_{0}^{x}f(t)\,dt$ be the integration operator, which is clearly linear on $X$. The operator $T$ has no minimal polynomial. To see why, first notice that $T-\lambda I$ has trivial kernel for $\lambda \ne 0$, because $Tf=\lambda f$ forces $g(x)=\int_{0}^{x}f(t)\,dt$ to satisfy $$ g'(x)=\frac{1}{\lambda}g(x),\qquad g(0)=0, $$ which has the unique solution $g\equiv 0$, and hence $f=g'\equiv 0$.
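One can let sympy confirm that this initial value problem has only the zero solution (a sketch; `lam` stands for the nonzero $\lambda$ above):

```python
import sympy as sp

x = sp.symbols('x')
lam = sp.symbols('lambda', nonzero=True)
g = sp.Function('g')

# The initial value problem g' = g/lambda, g(0) = 0 from the argument above.
sol = sp.dsolve(sp.Eq(g(x).diff(x), g(x) / lam), g(x), ics={g(0): 0})
print(sol)    # Eq(g(x), 0): the only solution is identically zero
```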

Therefore, if $T$ had a minimal polynomial, it could only be of the form $x^n$: factoring any annihilating polynomial over $\mathbb{C}$ and cancelling the injective factors $T-\lambda I$ with $\lambda\ne 0$ would leave $T^n=0$ for some $n$. But that would force $g=T^nf$ to be identically $0$ for every $f$, and differentiating $n$ times would give $f=0$ for all $f$, which is absurd.
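As a last sanity check, here is a small sympy computation illustrating that iterating $T$ and then differentiating the same number of times recovers $f$, so $T^n=0$ is impossible (the test function $\cos x$ is an arbitrary choice):

```python
import sympy as sp

x, t = sp.symbols('x t')

def T(f):
    # (Tf)(x) = integral of f from 0 to x
    return sp.integrate(f.subs(x, t), (t, 0, x))

f = sp.cos(x)                              # a nonzero continuous function on [0, 1]
g = T(T(f))                                # T^2 f
print(sp.simplify(g))                      # 1 - cos(x): not identically zero
print(sp.simplify(sp.diff(g, x, 2) - f))   # 0: differentiating twice recovers f
```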