Matrix inversion via Levi-Civita symbols

There is indeed such a formula. You would think it would be widely known, but in fact the only place I've seen it written down is in Penrose's "The Road to Reality", where it appears in his graphical notation.

The formula for the determinant (in $n$ dimensions) is: $$\det(M)=\tfrac 1{n!}\varepsilon^{i_1i_2\dots i_n}\varepsilon_{j_1j_2\dots j_n}M^{j_1}_{\quad i_1}M^{j_2}_{\quad i_2}\dots M^{j_n}_{\quad i_n}$$ For the adjugate: $$\mathrm{adj}(M)^i_{\; j}=\tfrac 1{(n-1)!}\;\varepsilon^{ii_2\dots i_n}\varepsilon_{jj_2\dots j_n}M^{j_2}_{\quad i_2}\dots M^{j_n}_{\quad i_n}$$ and the inverse is the quotient of these two: $$(M^{-1})^i_{\; j}=\frac{\mathrm{adj}(M)^i_{\; j}}{\det(M)}$$ (If you don't know how "up and down" indices work, just imagine that they are all down.)
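Here is a quick numerical sanity check of these formulae, just a sketch in NumPy (the `levi_civita` helper and the integer index bookkeeping are mine, not library functions): we build the two $\varepsilon$ symbols explicitly, contract them against copies of $M$ with `einsum`, and compare against `np.linalg.det` and `np.linalg.inv`.

```python
import itertools
import math

import numpy as np


def levi_civita(n):
    """The n-index Levi-Civita symbol as an n-dimensional array of 0s and +/-1s."""
    eps = np.zeros((n,) * n)
    for perm in itertools.permutations(range(n)):
        # Sign of the permutation, found by counting inversions.
        inversions = sum(1 for a in range(n) for b in range(a + 1, n) if perm[a] > perm[b])
        eps[perm] = -1.0 if inversions % 2 else 1.0
    return eps


n = 3
M = np.random.rand(n, n)   # M[a, b] plays the role of M^a_b (row index up, column index down)
eps = levi_civita(n)

# det(M) = (1/n!) eps^{i_1...i_n} eps_{j_1...j_n} M^{j_1}_{i_1} ... M^{j_n}_{i_n}.
# Integer index labels: i_k -> k-1 and j_k -> n+k-1; every label appears twice,
# so einsum contracts everything down to a scalar.
det = np.einsum(eps, list(range(n)),
                eps, list(range(n, 2 * n)),
                *[x for k in range(n) for x in (M, [n + k, k])]) / math.factorial(n)

# adj(M)^i_j = (1/(n-1)!) eps^{i i_2...i_n} eps_{j j_2...j_n} M^{j_2}_{i_2} ... M^{j_n}_{i_n}.
# Free labels: i -> 0, j -> 1; contracted labels: i_k -> k, j_k -> n+k-1 for k = 2..n.
adj = np.einsum(eps, [0] + list(range(2, n + 1)),
                eps, [1] + list(range(n + 1, 2 * n)),
                *[x for k in range(2, n + 1) for x in (M, [n + k - 1, k])],
                [0, 1]) / math.factorial(n - 1)

print(np.isclose(det, np.linalg.det(M)))         # True
print(np.allclose(adj / det, np.linalg.inv(M)))  # True
```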

If you think about these for a little bit you should be able to see why they replicate Cramer's formula.
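For example, with $n=2$ and $M=\begin{pmatrix}a&b\\c&d\end{pmatrix}$, so that $M^1_{\quad 1}=a$, $M^1_{\quad 2}=b$, $M^2_{\quad 1}=c$, $M^2_{\quad 2}=d$, the formulae give $$\det(M)=\tfrac 12\varepsilon^{i_1i_2}\varepsilon_{j_1j_2}M^{j_1}_{\quad i_1}M^{j_2}_{\quad i_2}=\tfrac 12\left(ad-cb-bc+da\right)=ad-bc$$ and $$\mathrm{adj}(M)^i_{\; j}=\varepsilon^{ii_2}\varepsilon_{jj_2}M^{j_2}_{\quad i_2}=\begin{pmatrix}d&-b\\-c&a\end{pmatrix},$$ so their quotient is the familiar $$M^{-1}=\frac 1{ad-bc}\begin{pmatrix}d&-b\\-c&a\end{pmatrix}.$$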


What to do when both indices of $M$ are down

Note that $\varepsilon^{i_1\dots i_n}$ isn't actually a tensor. If it were, then for every change-of-basis matrix $A$ we would have $$A^{i_1}_{\quad j_1}\dots A^{i_n}_{\quad j_n} \varepsilon^{j_1\dots j_n}=\varepsilon^{i_1\dots i_n}.$$ But in fact we have $$A^{i_1}_{\quad j_1}\dots A^{i_n}_{\quad j_n} \varepsilon^{j_1\dots j_n}=\det(A)\varepsilon^{i_1\dots i_n},$$ so $\varepsilon^{i_1\dots i_n}$ doesn't have the right transformation properties to be a tensor*. Luckily $\varepsilon_{i_1\dots i_n}$ fails to be a tensor in exactly the opposite way: $$(A^{-1})^{j_1}_{\quad i_1}\dots (A^{-1})^{j_n}_{\quad i_n} \varepsilon_{j_1\dots j_n}=\det(A^{-1})\varepsilon_{i_1\dots i_n},$$ and so the product $\varepsilon^{i_1\dots i_n}\varepsilon_{j_1\dots j_n}$ really is a tensor. (Another way to see this is to note that $$\varepsilon^{i_1\dots i_n}\varepsilon_{j_1\dots j_n}=n!\,\delta^{[i_1}_{\quad j_1}\dots\delta^{i_n]}_{\quad j_n},$$ where the RHS is clearly a tensor. We could use this expression in the formulae above for $\det$ and $\mathrm{adj}$ if we preferred not to use non-tensors.)
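Both of these facts are easy to check numerically too, continuing the NumPy sketch above (it reuses `n` and `eps` from the earlier snippet; the index bookkeeping is again mine):

```python
# Continuing the sketch above (reuses n and eps from the earlier snippet).
A = np.random.rand(n, n)

# Contract n copies of A into eps: A^{i_1}_{j_1} ... A^{i_n}_{j_n} eps^{j_1...j_n}.
transformed = np.einsum(*[x for k in range(n) for x in (A, [k, n + k])],
                        eps, list(range(n, 2 * n)),
                        list(range(n)))
print(np.allclose(transformed, np.linalg.det(A) * eps))  # True: eps only picks up det(A)

# eps^{i_1...i_n} eps_{j_1...j_n} as a 2n-index array (an outer product) ...
eps_eps = np.einsum(eps, list(range(n)), eps, list(range(n, 2 * n)), list(range(2 * n)))

# ... equals n! delta^{[i_1}_{j_1} ... delta^{i_n]}_{j_n}, i.e. the signed sum over
# permutations sigma of delta^{i_sigma(1)}_{j_1} ... delta^{i_sigma(n)}_{j_n}.
ident = np.eye(n)
delta_antisym = np.zeros((n,) * (2 * n))
for perm in itertools.permutations(range(n)):
    sign = eps[perm]  # eps doubles as a lookup table for permutation signs
    delta_antisym += sign * np.einsum(*[x for k in range(n) for x in (ident, [perm[k], n + k])],
                                      list(range(2 * n)))
print(np.allclose(eps_eps, delta_antisym))  # True
```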

When we wish to invert a matrix with two indices down we begin to worry that we have a problem: $$\det(M)=\tfrac 1{n!}\varepsilon^{i_1\dots i_n}\varepsilon^{j_1\dots j_n}M_{j_1 i_1}\dots M_{j_n i_n}$$ ain't a scalar and $$\mathrm{adj}(M)^{ij}=\tfrac 1{(n-1)!}\;\varepsilon^{ii_2\dots i_n}\varepsilon^{jj_2\dots j_n}M_{j_2 i_2}\dots M_{j_n i_n}$$ ain't a tensor. Under the change of basis above (so that $M_{ij}\mapsto (A^{-1})^{k}_{\quad i}(A^{-1})^{l}_{\quad j}M_{kl}$) they transform as $$\det(M)\mapsto\det(A^{-1})^2\det(M)$$ and $$\mathrm{adj}(M)^{ij}\mapsto\det(A^{-1})^2\,A^{i}_{\quad k}A^{j}_{\quad l}\,\mathrm{adj}(M)^{kl}.$$ But of course it all works out alright in the end: taking their quotient cancels the factors of $\det(A^{-1})^2$, and so the inverse $(M^{-1})^{ij}=\mathrm{adj}(M)^{ij}/\det(M)$ really is a tensor (with both indices up).
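This too can be checked numerically, continuing the NumPy sketch above. In the snippet below $B$ is just some invertible matrix standing in for the $A^{-1}$ of the transformation law, so the new components are $M'_{ij}=B^{k}_{\quad i}B^{l}_{\quad j}M_{kl}$, i.e. $M'=B^{\mathsf T}MB$ as arrays; `naive_det` and `naive_adj` are my names for the non-tensorial quantities above.

```python
# Continuing the sketch above: treat M[a, b] as M_{ab} (both indices down) and change
# the components by M'_{ij} = B^k_i B^l_j M_{kl}, i.e. M' = B^T M B, where B plays the
# role of the A^{-1} in the transformation law of the lower indices.

def naive_det(M):
    """(1/n!) eps^{i_1...} eps^{j_1...} M_{j_1 i_1} ... M_{j_n i_n} -- not a scalar."""
    return np.einsum(eps, list(range(n)), eps, list(range(n, 2 * n)),
                     *[x for k in range(n) for x in (M, [n + k, k])]) / math.factorial(n)

def naive_adj(M):
    """(1/(n-1)!) eps^{i i_2...} eps^{j j_2...} M_{j_2 i_2} ... M_{j_n i_n} -- not a tensor."""
    return np.einsum(eps, [0] + list(range(2, n + 1)),
                     eps, [1] + list(range(n + 1, 2 * n)),
                     *[x for k in range(2, n + 1) for x in (M, [n + k - 1, k])],
                     [0, 1]) / math.factorial(n - 1)

B = np.random.rand(n, n)
M_new = B.T @ M @ B   # the same twice-lower-index object, written in the new basis
Binv = np.linalg.inv(B)

# Both quantities pick up the same factor of det(B)^2 ...
print(np.isclose(naive_det(M_new), np.linalg.det(B) ** 2 * naive_det(M)))       # True
print(np.allclose(naive_adj(M_new),
                  np.linalg.det(B) ** 2 * Binv @ naive_adj(M) @ Binv.T))         # True

# ... so their quotient transforms as an honest twice-upper-index tensor:
# (M^{-1})'^{ij} = (B^{-1})^i_k (B^{-1})^j_l (M^{-1})^{kl}.
inv_old = naive_adj(M) / naive_det(M)
inv_new = naive_adj(M_new) / naive_det(M_new)
print(np.allclose(inv_new, Binv @ inv_old @ Binv.T))                             # True
```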

*Although note that sometimes our space is equipped with a canonical volume form, and then we can demand that we only use bases for which that volume form agrees with the volume form determined by the basis itself. Our change-of-basis matrices are then restricted to $SL$, and all is good.