Can some proof that $\det(A) \ne 0$ be checked faster than matrix multiplication?

We can compute the determinant of an $n \times n$ matrix in $O(n^3)$ operations in several ways, for example by LU decomposition. It's also known (see, e.g., Wikipedia) that if we can multiply two $n \times n$ matrices in $M(n)$ steps, then we can compute the determinant in $O(M(n))$ steps as well.

However (and this is the motivating observation here), as in this question, if $\det(A) = 0$, then I can find a vector $\mathbf x$ such that $A \mathbf x = \mathbf 0$, and tell you: "$A$ is a singular matrix. Here is a vector $\mathbf x$ such that $A \mathbf x = \mathbf 0$". I might have done lots of work to find $\mathbf x$, but you can check my work in only $O(n^2)$ steps by computing $A \mathbf x$: faster than you could compute $\det(A)$ without help.
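To make the verification step concrete, here is a minimal sketch of the $O(n^2)$ certificate check in plain Python with exact integer arithmetic (the function names are my own, purely for illustration):

```python
def matvec(A, x):
    """Multiply an n x n matrix A (a list of rows) by a vector x.
    This is one pass over the entries of A: O(n^2) operations."""
    return [sum(a * xi for a, xi in zip(row, x)) for row in A]

def check_singular_certificate(A, x):
    """Accept the claim 'A is singular' iff x is a nonzero vector
    with A x = 0.  Total cost: one matrix-vector product, O(n^2)."""
    return any(xi != 0 for xi in x) and all(v == 0 for v in matvec(A, x))
```

For example, $A = \begin{pmatrix}1&2\\2&4\end{pmatrix}$ is singular, and the vector $(2,-1)^\top$ certifies this.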

Is it possible, in a similar way, for me to take a matrix $A$ with $\det(A) \ne 0$, and write a proof of this fact which you can also check faster than computing $\det(A)$? (A perfect solution would check the proof in $O(n^2)$ steps; this is best possible, since we need that many steps to even read $A$.)


Observations:

  • A probabilistic argument exists based on Freivalds's algorithm: I give you $A^{-1}$, and leave you to check that $AA^{-1} = I$. As far as we know, this still needs $O(M(n))$ time to do deterministically, but a probabilistic algorithm can take $O(n^2)$ steps to achieve a one-sided error rate of $\frac12$: if $A^{-1}$ is correct, it will always say "yes", and if $A^{-1}$ is wrong, it will say "no" with probability at least $\frac12$. As a result, you can take $O(n^2\log n)$ steps to achieve a one-sided error rate of $n^{-k}$ for any fixed $k$.

  • More generally, we could ask for a proof that $\det(A) = x$ for any specific nonzero value of $x$. This was the original question, but there's no hope of solving that for general $x$ if we can't even solve the $\det(A) \ne 0$ case. (After all, a proof that $\det(A)$ has a specific nonzero value $x$ is in particular a proof that $\det(A)$ has some nonzero value.)
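For concreteness, the Freivalds check from the first observation can be sketched as follows (exact arithmetic is assumed, so this only applies directly when $A^{-1}$ has integer or rational entries; the helper is hypothetical, not from any of the sources above):

```python
import random

def freivalds_inverse_check(A, B, trials=20):
    """Probabilistically check that B = A^{-1}, i.e. that A B = I,
    via Freivalds's trick: for a random 0/1 vector r, compare
    A (B r) against I r = r.  Each trial is two matrix-vector
    products, O(n^2); a wrong B passes a single trial with
    probability at most 1/2."""
    n = len(A)

    def matvec(M, v):
        return [sum(m * vi for m, vi in zip(row, v)) for row in M]

    for _ in range(trials):
        r = [random.randint(0, 1) for _ in range(n)]
        if matvec(A, matvec(B, r)) != r:
            return False   # certainly not the inverse
    return True            # correct with probability >= 1 - 2**(-trials)
```

With `trials = k * log2(n)` repetitions this gives the $O(n^2 \log n)$ bound with one-sided error $n^{-k}$.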


Just to share another possible alternative: a randomised algorithm extending Wiedemann's method provides a way to sample a solution from the (affine) solution subspace of $Ax = b$, where $A$ is a square matrix and $b$ is in the column space of $A$. In our case one can choose $b = 0$, and over a finite field the algorithm samples uniformly from the nullspace of $A$. The idea is to run the algorithm a few times on the same input $A$ with $b = 0$. It will return some $x$, and if $x$ is nonzero in any of those trials, then we know that $A$ is not invertible.

The situation over infinite fields is not so transparent, but choosing a large enough sampling set should yield comparable results.

The algorithm is claimed to take $O(n^2 \log n \log \log n)$ operations in the base field, and it requires the base field to be sufficiently large (otherwise, pass to a field extension). Its success probability is quite high, so with only finitely many repetitions I think it is possible to determine whether the matrix is invertible faster than $O(n^3)$.
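Wiedemann's actual black-box sampler is more involved; purely to illustrate the "sample from the nullspace, then test for a nonzero vector" logic, here is a sketch over a prime field $\mathrm{GF}(p)$ that substitutes dense Gaussian elimination for the sampler (so it costs $O(n^3)$ and does *not* achieve the quoted complexity):

```python
import random

def nullspace_basis_mod_p(A, p):
    """Basis of the nullspace of A over GF(p), p prime, via dense
    row reduction -- a stand-in for Wiedemann's black-box sampler."""
    n = len(A)
    M = [row[:] for row in A]
    pivots = {}                      # pivot column -> pivot row
    r = 0
    for c in range(n):
        pr = next((i for i in range(r, n) if M[i][c] % p != 0), None)
        if pr is None:
            continue                 # free column
        M[r], M[pr] = M[pr], M[r]
        inv = pow(M[r][c], p - 2, p) # inverse in GF(p) by Fermat
        M[r] = [x * inv % p for x in M[r]]
        for i in range(n):
            if i != r and M[i][c] % p:
                f = M[i][c]
                M[i] = [(x - f * y) % p for x, y in zip(M[i], M[r])]
        pivots[c] = r
        r += 1
    basis = []
    for fc in (c for c in range(n) if c not in pivots):
        v = [0] * n
        v[fc] = 1                    # free variable set to 1
        for c, row in pivots.items():
            v[c] = (-M[row][fc]) % p
        basis.append(v)
    return basis

def sample_nullspace_vector(A, p):
    """Uniform random element of the nullspace of A over GF(p)."""
    v = [0] * len(A)
    for bvec in nullspace_basis_mod_p(A, p):
        c = random.randrange(p)
        v = [(vi + c * bi) % p for vi, bi in zip(v, bvec)]
    return v

def looks_singular(A, p, trials=5):
    """Declare A singular iff some sampled nullspace vector is nonzero.
    A singular matrix escapes detection with probability <= p**(-trials)."""
    return any(any(sample_nullspace_vector(A, p)) for _ in range(trials))
```

The point of the sketch is only the top-level logic of `looks_singular`: repeat the sampler, and a single nonzero output proves singularity.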

Edit: Another alternative is to consider the minimal polynomial of the linearly recurrent sequence $\{u^\top A^i b\}$. (A sequence $(s_i)$ in an $F$-vector space is linearly recurrent if there exist $m \in \mathbb{N}$ and $a_0,\dots,a_{m-1} \in F$ such that $s_{j+m} = \sum_{i=0}^{m-1} a_i s_{j+i}$ for all $j \in \mathbb{N}$. An example of a linearly recurrent sequence is $\{A^i\}$ where $A$ is an $n\times n$ matrix (by the Cayley–Hamilton theorem), and it is not difficult to see that $\{u^\top A^i b\}$ is then also linearly recurrent. The minimal polynomial of a linearly recurrent sequence is the monic polynomial $x^m - \sum_{i=0}^{m-1} a_i x^i$ of least degree for which such a recurrence holds.)

Now, another result in the paper by Kaltofen and Saunders (the paper above) is that one can 'guess' the minimal polynomial of $A$ well enough using the minimal polynomial of the sequence $\{u^\top A^i b\}$, by choosing $u$ and $b$ randomly, provided that the base field is sufficiently large. (In fact, one always gets a factor of the minimal polynomial of $A$.)

One may ask why the sequence $\{u^\top A^i b\}$ is even relevant. This is because it is easy to find the minimal polynomial of such a sequence of base-field elements via an iterative algorithm, the Berlekamp–Massey algorithm: one randomly samples $u$ and $b$ and computes the minimal polynomial of the sequence $\{u^\top A^i b\}$.
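A sketch of this step, assuming a prime field $\mathrm{GF}(p)$ (a textbook Berlekamp–Massey, not code from the paper), illustrated on a companion matrix whose Krylov scalars are the Fibonacci numbers:

```python
def berlekamp_massey(seq, p):
    """Berlekamp-Massey over GF(p), p prime.  Returns (C, L) where L is
    the length of the shortest recurrence generating seq and
    C = [1, c1, ..., cL] encodes  s_n + c1*s_{n-1} + ... + cL*s_{n-L} = 0
    (mod p); the minimal polynomial is x^L + c1*x^{L-1} + ... + cL."""
    C, B = [1], [1]
    L, m, b = 0, 1, 1
    for n, s_n in enumerate(seq):
        # discrepancy: how far the current recurrence is from predicting s_n
        d = s_n % p
        for i in range(1, L + 1):
            d = (d + C[i] * seq[n - i]) % p
        if d == 0:
            m += 1
            continue
        coef = d * pow(b, p - 2, p) % p      # d / b in GF(p)
        T = C[:]
        if len(C) < len(B) + m:
            C += [0] * (len(B) + m - len(C))
        for i, Bi in enumerate(B):
            C[i + m] = (C[i + m] - coef * Bi) % p
        if 2 * L <= n:                       # recurrence length must grow
            L, B, b, m = n + 1 - L, T, d, 1
        else:
            m += 1
    return C, L

def krylov_sequence(A, u, b, length, p):
    """The scalars u^T A^i b for i = 0, ..., length-1, over GF(p),
    computed with matrix-vector products only."""
    out, v = [], b[:]
    for _ in range(length):
        out.append(sum(ui * vi for ui, vi in zip(u, v)) % p)
        v = [sum(a * vi for a, vi in zip(row, v)) % p for row in A]
    return out
```

For the companion matrix $A = \begin{pmatrix}0&1\\1&1\end{pmatrix}$ of $x^2 - x - 1$ with $u = (1,0)^\top$, $b = (0,1)^\top$, the sequence is $0,1,1,2,3,5,\dots$ and Berlekamp–Massey recovers $x^2 - x - 1$.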

But now the issue is that one requires a sequence of length $2d$, where $d$ is the degree of the minimal polynomial of $A$; $d$ is unknown, and a reasonable guess is $d = n$. I am unsure whether this method is even sub-$O(n^3)$, since it relies on Berlekamp–Massey.