Show that a matrix $A$ is singular if and only if $0$ is an eigenvalue.

$A$ singular $\iff\det(A)=0\iff\det(A-0\cdot I)=0\iff 0$ is an eigenvalue of $A$, since the eigenvalues of $A$ are exactly the roots of the characteristic polynomial $\det(A-\lambda I)$.
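
A concrete sanity check (my example, not part of the original argument): take

$$A=\begin{pmatrix}1&2\\2&4\end{pmatrix},\qquad \det(A)=1\cdot 4-2\cdot 2=0,\qquad \det(A-\lambda I)=\lambda^2-5\lambda=\lambda(\lambda-5),$$

so the eigenvalues are $0$ and $5$: the singular matrix does indeed have $0$ as an eigenvalue.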

Michael


Note that the determinant of an $n\times n$ matrix $A$ can be computed from its eigenvalues $\lambda_1,\dots,\lambda_n$ (counted with algebraic multiplicity) as

$$ |A|=\lambda_1\lambda_2\cdots\lambda_n ,$$

the product of the eigenvalues. In particular, $|A|=0$ if and only if at least one eigenvalue equals $0$.
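
To see why this product formula holds (a standard argument, sketched here): over $\mathbb{C}$ the characteristic polynomial factors as

$$\det(A-\lambda I)=(\lambda_1-\lambda)(\lambda_2-\lambda)\cdots(\lambda_n-\lambda),$$

and setting $\lambda=0$ gives $|A|=\lambda_1\lambda_2\cdots\lambda_n$.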


We know that $0 \in \lambda(A)$ iff there exists some nonzero solution to the eigenvector equation $Ax = \lambda x = 0\cdot x = 0$. Thus $0$ is an eigenvalue iff there exists $b \in \mathrm{Ker}(A)$ with $b \neq 0$, i.e. iff $\mathrm{Ker}(A) \neq \{ 0 \}$, which is exactly the condition for $A$ to be singular.
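
For illustration (my example), with the singular matrix

$$A=\begin{pmatrix}1&2\\2&4\end{pmatrix},\qquad b=\begin{pmatrix}2\\-1\end{pmatrix},\qquad Ab=\begin{pmatrix}1\cdot 2+2\cdot(-1)\\ 2\cdot 2+4\cdot(-1)\end{pmatrix}=\begin{pmatrix}0\\0\end{pmatrix},$$

the nonzero vector $b$ lies in $\mathrm{Ker}(A)$ and is an eigenvector for the eigenvalue $0$.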


Assuming that by “singular”, you mean a square matrix that is not invertible:

Lemma: If $A$ is invertible and $\lambda$ is an eigenvalue of $A$, then $\frac{1}{\lambda}$ is an eigenvalue of $A^{-1}$.

Let $x$ be an eigenvector of $A$ corresponding to the eigenvalue $\lambda$. By definition, $Ax = \lambda x$. Left-multiply by $A^{-1}$, giving $A^{-1}Ax = A^{-1} \lambda x$. The LHS is equal to $x$ ($A^{-1}A = I$ by definition) and the RHS is equal to $\lambda A^{-1} x$ (a scalar commutes with a matrix), so $x = \lambda A^{-1} x$. Note that $\lambda \neq 0$: otherwise this equation would force $x = 0$, but eigenvectors are nonzero by definition. Divide both sides by $\lambda$, giving $\frac{1}{\lambda} x = A^{-1} x$. By definition, $\frac{1}{\lambda}$ is thus an eigenvalue of $A^{-1}$.
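
A quick worked check of the lemma (my example): for

$$A=\begin{pmatrix}2&0\\0&3\end{pmatrix},\qquad A^{-1}=\begin{pmatrix}\tfrac{1}{2}&0\\0&\tfrac{1}{3}\end{pmatrix},$$

the eigenvalues of $A$ are $2$ and $3$, while those of $A^{-1}$ are $\tfrac{1}{2}$ and $\tfrac{1}{3}$, exactly as the lemma predicts.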


So, if $0$ were an eigenvalue of an invertible $A$, the lemma would make $\frac{1}{0}$ an eigenvalue of $A^{-1}$. But $\frac{1}{0}$ isn't a number, so $A^{-1}$ can't exist: a matrix with eigenvalue $0$ must be singular.
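
The same conclusion can be reached without any division (a rephrasing on my part, not part of the original answer): if $A$ were invertible and $Ax = 0\cdot x = 0$ for some $x \neq 0$, then

$$x = A^{-1}(Ax) = A^{-1}0 = 0,$$

contradicting $x \neq 0$. Hence an invertible matrix cannot have $0$ as an eigenvalue.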