What is the most efficient way to determine if a matrix is invertible?
Solution 1:
Gauss-Jordan elimination can be used to determine whether a matrix is invertible, and it runs in polynomial (in fact, cubic) time. The same method (applying the same row operations to the identity matrix alongside) computes the inverse in polynomial time as well.
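For concreteness, here is a minimal sketch (not code from this answer) of the elimination idea in Python: forward elimination with partial pivoting is already enough to decide invertibility, since the matrix is singular exactly when some column yields no usable pivot; Gauss-Jordan would continue reducing above the pivots to produce the inverse. The tolerance `eps` is an assumption for floating-point input.

```python
# Sketch: Gaussian elimination with partial pivoting to decide invertibility
# of a square matrix in O(n^3). `eps` is an assumed floating-point tolerance.

def is_invertible(matrix, eps=1e-12):
    a = [row[:] for row in matrix]            # work on a copy
    n = len(a)
    for col in range(n):
        # choose the largest pivot in this column (partial pivoting)
        pivot_row = max(range(col, n), key=lambda r: abs(a[r][col]))
        if abs(a[pivot_row][col]) < eps:      # no usable pivot -> singular
            return False
        a[col], a[pivot_row] = a[pivot_row], a[col]
        # eliminate the entries below the pivot
        for r in range(col + 1, n):
            factor = a[r][col] / a[col][col]
            for c in range(col, n):
                a[r][c] -= factor * a[col][c]
    return True                               # n nonzero pivots were found

print(is_invertible([[2.0, 1.0], [4.0, 2.0]]))  # False: rows are dependent
print(is_invertible([[2.0, 1.0], [1.0, 3.0]]))  # True
```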
Solution 2:
A matrix is invertible iff its determinant is non-zero.
There are algorithms which find the determinant in slightly worse than $O(n^2)$ time.
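As a quick illustration (not part of the answer), NumPy's determinant routine can be used for this test, though with floating point the comparison against zero needs a tolerance; Solution 3 explains why this test can be unreliable numerically.

```python
import numpy as np

A = np.array([[1.0, 2.0], [2.0, 4.0]])   # second row is twice the first
print(np.linalg.det(A))                  # ~0.0, so A is singular
print(abs(np.linalg.det(A)) > 1e-12)     # False: not invertible (tolerance assumed)
```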
Solution 3:
Computing the determinant and Gaussian elimination are both fine if you are using exact computations, for instance if the entries of your matrix are rational numbers and you are using only rational numbers during the computations. The disadvantage is that the numerators and denominators can get very large indeed. So the number of operations may indeed be $O(n^{2.376})$ or $O(n^3)$, but the cost of every addition and multiplication gets bigger as $n$ grows because the numbers get bigger.
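A small sketch of that growth, assuming exact `Fraction` arithmetic in Python and using a Hilbert matrix purely as an example input: the pivots stay exact rationals, but their denominators quickly acquire many digits even though every input entry has a one- or two-digit denominator.

```python
from fractions import Fraction

# Hilbert matrix H[i][j] = 1/(i+j+1), used here only as an illustrative input
n = 8
a = [[Fraction(1, i + j + 1) for j in range(n)] for i in range(n)]

# exact Gaussian elimination (no pivoting needed: the Hilbert matrix is
# positive definite, so every pivot is nonzero)
for col in range(n):
    for r in range(col + 1, n):
        factor = a[r][col] / a[col][col]
        for c in range(col, n):
            a[r][c] -= factor * a[col][c]

# the pivots are exact, but their denominators grow quickly
for col in range(n):
    p = a[col][col]
    print(col, p, "denominator digits:", len(str(p.denominator)))
```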
This is not an issue if you are using floating point numbers, but then you have the problem that floating point computations are not exact. Some methods are more sensitive to this than others. In particular, checking invertibility by computing the determinant is a bad idea in this setting. Gaussian elimination (with pivoting) is better. Even better is to use the singular value decomposition, which will be treated towards the end of the MIT course.
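A minimal sketch of the SVD-based check (the threshold below is an assumption, not a universal constant): treat the matrix as numerically singular when its smallest singular value is tiny relative to its largest, i.e. when the condition number is huge.

```python
import numpy as np

def invertible_svd(A, rel_tol=1e-12):
    # singular values in descending order; no singular vectors needed
    s = np.linalg.svd(A, compute_uv=False)
    return s[-1] > rel_tol * s[0]            # smallest vs. largest singular value

A = np.array([[1.0, 2.0], [2.0, 4.0 + 1e-13]])  # nearly dependent rows
print(invertible_svd(A))       # False: effectively singular in floating point
print(np.linalg.cond(A))       # a very large condition number
```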