Prove that $A^k = 0 $ iff $A^2 = 0$

Let $A$ be a $2 \times 2$ matrix and let $k \geq 2$ be a positive integer. Prove that $A^k = 0$ iff $A^2 = 0$.

I could do this exercise if I were allowed to use $\det(A^k) = (\det A)^k$, but this question appears before that result.

Thank you very much for your help!


The solution using the minimal polynomial and Cayley-Hamilton is a bit of overkill and somewhat of a magic solution. I prefer the following non-magic solution (for the non-trivial implication; there is no need to assume the matrix is $2\times 2$).

Think of $A$ as a linear operator $A: V \to V$ with $V$ an $n$-dimensional vector space, and suppose $A^t=0$. We'll show that $A^n=0$. Now, for each $m\ge 1$ consider the space $K_m$, the kernel of $A^m$. It is immediate that $K_{m}\subseteq K_{m+1}$. Since $A^t=0$ it follows that $K_t=V$. It is also immediate that if $K_m=K_{m+1}$, then $K_m=K_{r}$ for all $r>m$.

Thus, the sequence of kernels is an increasing sequence of subspaces that stabilizes as soon as one step equals the next, and it is eventually all of $V$. Now we use the fact that $V$ has dimension $n$. Passing to dimensions, the above says that the sequence $\dim K_1, \dim K_2, \ldots$ is strictly increasing until it stabilizes. Since $A$ is singular, $\dim K_1 \ge 1$, so the sequence starts at $1$ or more, and since it eventually reaches $\dim V = n$, it must stabilize after at most $n$ steps; that is, $K_m = V$ for some $m \le n$. But then $K_n = V$, and thus $A^n = 0$.
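A small numerical illustration of this kernel-chain argument (not part of the proof): the sketch below, assuming `sympy` is available and using a $4\times4$ nilpotent matrix of my own choosing, shows the dimensions of $K_1,\dots,K_n$ increasing strictly until they reach $n$.

```python
# Illustration only: dimensions of K_m = ker(A^m) for a nilpotent matrix.
# The 4x4 strictly upper-triangular example below is an arbitrary choice.
import sympy as sp

n = 4
A = sp.Matrix([[0, 1, 1, 0],
               [0, 0, 1, 1],
               [0, 0, 0, 1],
               [0, 0, 0, 0]])   # strictly upper triangular, so A^4 = 0

dims = [len((A**m).nullspace()) for m in range(1, n + 1)]
print(dims)                      # [1, 2, 3, 4]: strictly increasing up to n
print(A**n == sp.zeros(n, n))    # True: K_n = V, i.e. A^n = 0
```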


The standard and more general approach to this kind of problem is to consider the minimal polynomial of $A$ (see the other answers here). However, in this particular problem, as $A$ is only $2\times2$, one may solve it as follows. Since the implication $A^2=0\Rightarrow A^k=0$ is trivial, we only prove the converse.

Proof 1. If $A^k=0$ for some $k\ge2$, then $A$ is singular, so its rank is at most 1. Therefore, $A=uv^T$ for some vectors $u$ and $v$. Note that for any $n\ge1$, $$ A^n=\underbrace{(uv^T)(uv^T)\cdots(uv^T)}_{n \text{ times}} =u\underbrace{(v^Tu)}_{\text{scalar}}\cdots(v^Tu)v^T=(v^Tu)^{n-1}A. $$ In particular, $A^k=(v^Tu)^{k-1}A$ and $A^2=(v^Tu)A$. Now the result follows immediately: if $A\ne0$, then $A^k=0$ forces $(v^Tu)^{k-1}=0$, i.e. $v^Tu=0$, and hence $A^2=0$.
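As a quick sanity check of the identity $A^n=(v^Tu)^{n-1}A$ (an illustration, not part of the argument), here is a sympy computation with vectors I picked so that $v^Tu=0$:

```python
# Check A^2 = (v^T u) A for a rank-1 matrix A = u v^T.
# The vectors u and v below are arbitrary, chosen so that v^T u = 0.
import sympy as sp

u = sp.Matrix([2, -1])
v = sp.Matrix([3, 6])            # v^T u = 3*2 + 6*(-1) = 0
A = u * v.T                      # rank-1, 2x2
s = (v.T * u)[0, 0]              # the scalar v^T u

print(A**2 == s * A)             # True: A^2 = (v^T u) A
print(A**2 == sp.zeros(2, 2))    # True here, since v^T u = 0
```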

Proof 2. If $A^k=0$ for some $k\ge2$, then $A$ is singular. Pick $x\ne0$ with $Ax=0$ and extend $x$ to a basis $\{x,y\}$ of the underlying vector space. Write $Ay=px+qy$. By mathematical induction, $A^ny=pq^{n-1}x + q^ny$ for every $n\ge1$. Now use $A^k=0$ to argue that $q=0$. Hence $A^2x=A^2y=0$.
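An optional symbolic check of the power formula above (again an illustration, not the proof): taking $x,y$ to be the standard basis, the matrix of $A$ is $\begin{pmatrix}0&p\\0&q\end{pmatrix}$, and sympy can verify $A^ny=pq^{n-1}x+q^ny$ for small $n$.

```python
# With respect to the basis {x, y}, the matrix of A is [[0, p], [0, q]].
# Verify A^n = [[0, p*q^(n-1)], [0, q^n]] for a few n, i.e. the induction claim.
import sympy as sp

p, q = sp.symbols('p q')
A = sp.Matrix([[0, p],
               [0, q]])

for n in range(1, 6):
    expected = sp.Matrix([[0, p * q**(n - 1)],
                          [0, q**n]])
    assert sp.simplify(A**n - expected) == sp.zeros(2, 2)
print("A^n y = p*q^(n-1) x + q^n y holds for n = 1..5")
```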


One direction is trivial, so let's consider the other.

I will give you some hints towards a solution.

Suppose $A^k = 0$. This means that $A$ satisfies the polynomial $x^k$. What does this say about the minimal polynomial of the matrix? What in turn does this say about the characteristic polynomial? Finally, consider the Cayley-Hamilton theorem.
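If you want to see where these hints lead on a concrete example (an illustration, not the proof), sympy can display the characteristic polynomial of a nilpotent $2\times2$ matrix and confirm what Cayley-Hamilton predicts; the matrix below is my own choice.

```python
# For a nilpotent 2x2 matrix the characteristic polynomial is x^2,
# so Cayley-Hamilton gives A^2 = 0. Example matrix chosen arbitrarily.
import sympy as sp

x = sp.symbols('x')
A = sp.Matrix([[2, -1],
               [4, -2]])             # trace 0 and det 0, hence nilpotent

print(A.charpoly(x).as_expr())       # x**2
print(A**2 == sp.zeros(2, 2))        # True, as Cayley-Hamilton predicts
```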