Has there been a rigorous analysis of Strassen's algorithm?

I suggest that you have a look at the easily readable original article here (you will probably need a university subscription); the MR review is here. Since Strassen's algorithm is quite famous, I would be surprised if it were difficult to find a detailed analysis, in various dialects, by googling a little.

The "roughly $2.8$" is in fact precisely $\log_{2}{7} \approx 2.807$: Strassen multiplies two matrices split into $2 \times 2$ blocks using $7$ block multiplications instead of $8$, so recursing on the half-size blocks gives $T(n) = 7\,T(n/2) + O(n^{2})$, whose solution is $O(n^{\log_{2} 7})$.

In order to multiply two $n \times n$ matrices in the usual way you need $n^{3}$ multiplications and $n^{2}(n-1)$ additions, so in total you need $n^2 (2n-1)$ arithmetical operations. Strassen's algorithm (in his own analysis) gets away with fewer than $4.7\, n^{\log_{2} 7}$ arithmetical operations (provided $n \geq 16$ if I'm not completely mistaken), although note that, comparing the two expressions directly, the bound $4.7\, n^{\log_{2} 7}$ only drops below $n^2(2n-1)$ once $n$ is on the order of $100$.
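To make the recursion concrete, here is a minimal sketch in Python with NumPy (my own illustrative code, not Strassen's; it assumes $n$ is a power of two and recurses all the way down to scalars):

```python
import numpy as np

def strassen(A, B):
    """Multiply two n x n matrices (n a power of two) with
    Strassen's seven-product recursion. Illustrative sketch only."""
    n = A.shape[0]
    if n == 1:                      # base case: a single scalar product
        return A * B
    h = n // 2                      # split each matrix into four h x h blocks
    A11, A12, A21, A22 = A[:h, :h], A[:h, h:], A[h:, :h], A[h:, h:]
    B11, B12, B21, B22 = B[:h, :h], B[:h, h:], B[h:, :h], B[h:, h:]
    # the seven recursive block products (instead of the usual eight)
    M1 = strassen(A11 + A22, B11 + B22)
    M2 = strassen(A21 + A22, B11)
    M3 = strassen(A11, B12 - B22)
    M4 = strassen(A22, B21 - B11)
    M5 = strassen(A11 + A12, B22)
    M6 = strassen(A21 - A11, B11 + B12)
    M7 = strassen(A12 - A22, B21 + B22)
    # reassemble the four blocks of the product C = A B
    C = np.empty_like(A)
    C[:h, :h] = M1 + M4 - M5 + M7
    C[:h, h:] = M3 + M5
    C[h:, :h] = M2 + M4
    C[h:, h:] = M1 - M2 + M3 + M6
    return C
```

The saving of one multiplication per level is exactly what turns the exponent $3$ into $\log_2 7$.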

I don't see anything "costly" in an implementation other than additions and multiplications. The amount of memory needed should also be fairly easy to figure out from Strassen's description; it should be of order $4 n^2$.
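Those two costs can in fact be tallied exactly with a small recurrence: each level of the recursion performs $7$ half-size products plus $18$ additions/subtractions of $(n/2) \times (n/2)$ blocks ($10$ to form the inputs of the seven products, $8$ to combine them). A sketch, under the assumption that $n$ is a power of two and the recursion goes all the way to $1 \times 1$ blocks:

```python
def strassen_ops(n):
    """(scalar multiplications, scalar additions) for Strassen's
    recursion taken all the way down to 1x1 blocks."""
    if n == 1:
        return 1, 0
    m, a = strassen_ops(n // 2)
    # 7 recursive products, plus 18 block additions of (n/2)^2 entries each
    return 7 * m, 7 * a + 18 * (n // 2) ** 2

def classical_ops(n):
    """(multiplications, additions) for the usual inner-product method."""
    return n ** 3, n ** 2 * (n - 1)
```

For $n = 16$ this pure recursion gives $2401$ multiplications but $12870$ additions, a total of $15271$ operations versus the classical $7936$, which is why practical implementations stop the recursion at a larger base case instead of going down to scalars.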


From a google search for [crossover strassen matrix multiplication], I've found experiments placing the crossover point anywhere from $n = 8$ to $n = 20$.
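That empirical range is at least consistent with a simple operation count for a hybrid scheme: let the recursion switch to classical multiplication once the blocks reach a cutoff size, and compare totals. A sketch (function names are mine; each Strassen step costs $7$ half-size products plus $18$ block additions of $(n/2)^2$ entries, as in Strassen's construction):

```python
def hybrid_ops(n, cutoff):
    """Total scalar operations for a Strassen recursion that switches to
    classical multiplication once blocks are cutoff x cutoff or smaller.
    Assumes n is a power of two."""
    if n <= cutoff:
        return n ** 3 + n ** 2 * (n - 1)   # classical operation count
    # 7 half-size products + 18 block additions of (n/2)^2 entries
    return 7 * hybrid_ops(n // 2, cutoff) + 18 * (n // 2) ** 2

def classical_total(n):
    """Total scalar operations for the usual method."""
    return n ** 3 + n ** 2 * (n - 1)
```

With a cutoff of $8$, a single Strassen step already saves operations at $n = 16$ ($7872$ versus $7936$), in line with the small crossover points reported in those experiments; real timings of course also depend on memory traffic and the other engineering factors below.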

However, at some point in the analysis of algorithms you jump from manipulating mathematical concepts to simply running things experimentally and recording values and times. At that point it is not really analysis of algorithms any more, but plain old engineering, where you manipulate things other than the basic algorithm (aggressively optimizing compilers, caching strategies, the speed of the memory components, hardware parallelization, the cooling system, etc.). The variables to consider are then too many and too underspecified for an analytic (on-paper) calculation, and you just have to compare using runtime data.