Are most matrices invertible? [duplicate]

Solution 1:

There are at least three ways of saying that a matrix over the real numbers is generically invertible:

The topological one: the set of invertible matrices is a dense open set in the set of all matrices.

The probabilistic one: with the Lebesgue measure on the set of matrices, the non-invertible matrices are of measure zero.

The algebraic one: The set of invertible matrices is open (and non-empty) for the Zariski topology; explicitly, this means that there is a polynomial defined on the coefficients of the matrices, such that the set of invertible matrices is exactly the set where this polynomial is not zero. Of course, here the polynomial is the determinant function.

Note that your question makes sense for matrices with coefficients in an arbitrary infinite field (over a finite field, we are looking at finite sets...) and that we can still say that a matrix is generically invertible in the algebraic sense.
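To make the algebraic viewpoint concrete (a sketch using SymPy, not part of the original answer): for a $2 \times 2$ matrix, the polynomial in the coefficients is the familiar $ad - bc$, and the invertible matrices are exactly the points where it is nonzero.

```python
import sympy as sp

# Symbolic 2x2 matrix: the determinant is a polynomial in the entries,
# and the invertible matrices are exactly where this polynomial is nonzero.
a, b, c, d = sp.symbols('a b c d')
A = sp.Matrix([[a, b], [c, d]])
det = A.det()  # expands to a*d - b*c, a polynomial in the four coefficients
```

The same construction works in any dimension: `sp.Matrix` of symbols, then `.det()`, always yields a polynomial in the $n^2$ entries.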

Solution 2:

Yes. The non-invertible matrices form a set of measure zero in the space of all $n\times n$ matrices, when considered as a subset of $\mathbb R^{n^2}$.
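An empirical sanity check of this measure-zero claim (a NumPy sketch, not part of the original answer; the dimension and sample count are arbitrary): matrices with i.i.d. Gaussian entries are invertible with probability $1$, so a random draw essentially never lands on the singular set.

```python
import numpy as np

# Sample matrices with i.i.d. Gaussian entries. The singular matrices have
# Lebesgue measure zero in R^(n^2), so each draw is invertible almost surely.
rng = np.random.default_rng(0)
n, trials = 4, 10_000
dets = [np.linalg.det(rng.standard_normal((n, n))) for _ in range(trials)]
num_singular = sum(d == 0.0 for d in dets)  # expect 0 exactly-singular draws
```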

Solution 3:

To expand on Potato's answer, the space of all $n \times n$ matrices can be identified with $\mathbb{R}^{n^2}$. The determinant gives a polynomial function from this space to $\mathbb{R}$, and the zero set of a nonzero polynomial function has measure $0$.

(Earlier this asserted that the set of singular matrices is a manifold, which is not true.)

Solution 4:

Here's how I like to think about it. A matrix $A$ is invertible $\iff \det(A) \neq 0$. Since the determinant is a continuous (indeed polynomial) function of the entries, if we have $A$ such that $\det(A) = 0$, we can perturb the entries slightly so that the determinant becomes nonzero (a nonconstant polynomial cannot vanish on an open set). Likewise, if $\det(A) \neq 0$, then any sufficiently small perturbation of the entries leaves $\det(A)$ nonzero. So in this sense, most matrices are invertible.
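A small numerical illustration of this perturbation argument (a NumPy sketch; the particular matrix and the size of the perturbation are made up for the example):

```python
import numpy as np

# A singular matrix: its second row is twice the first, so det(A) = 0.
A = np.array([[1.0, 2.0],
              [2.0, 4.0]])

# Perturb a single entry by a tiny epsilon; the determinant becomes nonzero,
# so the perturbed matrix is invertible.
eps = 1e-9
B = A.copy()
B[1, 1] += eps

det_A = np.linalg.det(A)  # 0 (up to floating-point roundoff)
det_B = np.linalg.det(B)  # approximately eps, hence nonzero
```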

Solution 5:

I think this needs work, but there is something to it: take a randomly-selected matrix $A$, and associate to it its characteristic polynomial $\det(A - \lambda I)$; this polynomial is also random. It has $0$ as a root iff its constant term is $0$, and the constant term is $\pm\det(A)$. But most random polynomials have a nonzero constant term, so they do not have zero as a root. Alternatively, consider a characteristic polynomial with a zero root: a small change in an entry of the matrix will generically change the constant term from $0$ to nonzero. This last statement can be made more rigorous using topology: the set of matrices with nonzero determinant, $GL(n,\mathbb R)$, is an open subset of $\mathbb R^{n^2}$, since it is the complement of the closed set $\det^{-1}(0)$; each invertible matrix, seen as a point in Euclidean space, therefore has a ball around it on which the determinant stays nonzero.
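The link between the constant term and a zero eigenvalue can be checked numerically (a NumPy sketch; `np.poly` returns the coefficients of $\det(\lambda I - A)$, whose constant term is $(-1)^n \det(A)$, so for even $n$ it equals $\det(A)$ exactly):

```python
import numpy as np

# An invertible 2x2 matrix with eigenvalues 2 and 3.
A = np.array([[2.0, 1.0],
              [0.0, 3.0]])

# Coefficients of the characteristic polynomial det(lambda*I - A),
# highest degree first: here [1, -5, 6].
coeffs = np.poly(A)
constant_term = coeffs[-1]  # equals (-1)^n * det(A); n = 2, so det(A) = 6
```

Since the constant term is nonzero, $0$ is not an eigenvalue, confirming that $A$ is invertible.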

For low dimensions, you can see this geometrically: a $2 \times 2$ matrix has determinant $0$ iff "the entries are on the same line through the origin", meaning that the points $(a_{11}, a_{12})$ and $(a_{21}, a_{22})$ lie on the same line through the origin. There are $\mathfrak c$ (continuum-many) possible slopes, and only one choice puts the second point on the line determined by the first. For a $3 \times 3$ matrix there is a similar argument, seeing the determinant as the (signed) volume of the parallelepiped spanned by the $3$ rows as vectors in $\mathbb R^3$. The determinant is $0$ here exactly when the rows, seen as points, are "degenerate" (linearly dependent), and the degenerate configurations form a negligible, measure-zero subset compared with the non-degenerate ones.
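The $2 \times 2$ picture can be sketched numerically (a NumPy example with made-up matrices): rows on the same line through the origin give determinant $0$, while a generic pair of rows does not.

```python
import numpy as np

# Rows (1,2) and (3,6) lie on the same line through the origin
# (the second is 3 times the first), so the determinant is 0.
collinear = np.array([[1.0, 2.0],
                      [3.0, 6.0]])

# Rows (1,2) and (3,5) are on different lines through the origin,
# so the determinant is nonzero.
generic = np.array([[1.0, 2.0],
                    [3.0, 5.0]])

det_collinear = np.linalg.det(collinear)  # 0: rows linearly dependent
det_generic = np.linalg.det(generic)      # 1*5 - 2*3 = -1: invertible
```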