Are there open problems in Linear Algebra?

I'm reading about algebraic K-theory, which can be regarded as a "generalization" of linear algebra, since it applies tools analogous to those of linear algebra to module theory.

There are many open problems and conjectures in K-theory, some of which are inspired by linear algebra.

So I just want to know:

What are open problems in "pure" linear algebra? (Pure means not numerical!)

Thanks


Solution 1:

One of the biggest open questions is also one of the simplest to state: what is the optimal exponent in the operation count of matrix-matrix multiplication? In other words,

Given two $n\times n$ matrices, what is the smallest possible exponent in the computational complexity of computing their product?

The conjecture can be stated more boldly:

Does there exist an algorithm that can compute the product of two $n \times n$ matrices with complexity $O(n^2)$?

Currently, the best known upper bound on the exponent is about $2.373$, obtained from optimizations of the Coppersmith–Winograd algorithm. These algorithms aren't actually used, because they only beat simpler methods for matrices so large that they're (currently) not encountered in practice.

Some folks (citation needed) suspect that the true exponent is $2$, i.e., that for every $\epsilon > 0$ there is an algorithm that computes the product in $O(n^{2+\epsilon})$ operations.
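For concreteness, here is a sketch of the first algorithm to beat the naive $O(n^3)$ bound: Strassen's method, which runs in roughly $O(n^{2.807})$ by replacing the eight block products of the naive recursion with seven. This is a minimal illustration restricted to square matrices whose size is a power of two:

```python
# Strassen's algorithm, sketched for n-by-n list-of-lists matrices
# with n a power of two. Seven recursive products instead of eight
# give complexity O(n^(log2 7)) ~ O(n^2.807).

def add(A, B):
    return [[a + b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def sub(A, B):
    return [[a - b for a, b in zip(ra, rb)] for ra, rb in zip(A, B)]

def strassen(A, B):
    n = len(A)
    if n == 1:
        return [[A[0][0] * B[0][0]]]
    h = n // 2

    # Split a matrix into four h-by-h blocks.
    def blocks(M):
        return ([r[:h] for r in M[:h]], [r[h:] for r in M[:h]],
                [r[:h] for r in M[h:]], [r[h:] for r in M[h:]])

    A11, A12, A21, A22 = blocks(A)
    B11, B12, B21, B22 = blocks(B)

    # Strassen's seven products.
    M1 = strassen(add(A11, A22), add(B11, B22))
    M2 = strassen(add(A21, A22), B11)
    M3 = strassen(A11, sub(B12, B22))
    M4 = strassen(A22, sub(B21, B11))
    M5 = strassen(add(A11, A12), B22)
    M6 = strassen(sub(A21, A11), add(B11, B12))
    M7 = strassen(sub(A12, A22), add(B21, B22))

    C11 = add(sub(add(M1, M4), M5), M7)
    C12 = add(M3, M5)
    C21 = add(M2, M4)
    C22 = add(sub(add(M1, M3), M2), M6)

    # Reassemble the four result blocks.
    return ([r1 + r2 for r1, r2 in zip(C11, C12)] +
            [r1 + r2 for r1, r2 in zip(C21, C22)])
```

The Coppersmith–Winograd family pushes the same idea much further with far more elaborate bilinear identities, which is why those algorithms only pay off at impractically large $n$.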

Solution 2:

This problem arises in control theory, but it is really a linear algebra problem.

Static Output Feedback Stabilization Problem: Given matrices $A \in \mathbb{R}^{n \times n}$, $B \in \mathbb{R}^{n \times m}$ and $C \in \mathbb{R}^{p \times n}$, does there exist a matrix $K \in \mathbb{R}^{m \times p}$ such that every eigenvalue of $A+BKC$ has negative real part?
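Verifying a *given* $K$ is easy; the open question is deciding whether *any* such $K$ exists. As a toy illustration (the matrices below are my own, not from the problem statement), here is a check for the $2 \times 2$ case, where the Routh–Hurwitz criterion says all eigenvalues have negative real part iff the trace is negative and the determinant is positive:

```python
# Toy check of the static output feedback condition for a 2x2
# closed-loop matrix. Matrices A, B, C, K below are illustrative.

def is_hurwitz_2x2(M):
    # A real 2x2 matrix has all eigenvalues with negative real part
    # iff trace(M) < 0 and det(M) > 0 (Routh-Hurwitz criterion).
    tr = M[0][0] + M[1][1]
    det = M[0][0] * M[1][1] - M[0][1] * M[1][0]
    return tr < 0 and det > 0

def matmul(X, Y):
    return [[sum(X[i][k] * Y[k][j] for k in range(len(Y)))
             for j in range(len(Y[0]))] for i in range(len(X))]

def closed_loop(A, B, K, C):
    # Form A + B K C for conformable list-of-lists matrices.
    BKC = matmul(matmul(B, K), C)
    return [[A[i][j] + BKC[i][j] for j in range(len(A[0]))]
            for i in range(len(A))]

# Open-loop system with an unstable eigenvalue (+1):
A = [[-1, 2], [0, 1]]
B = [[0], [1]]
C = [[0, 1]]
K = [[-2]]  # a scalar output gain that happens to stabilize it

M = closed_loop(A, B, K, C)
print(is_hurwitz_2x2(A))  # False: open loop is unstable
print(is_hurwitz_2x2(M))  # True: A + BKC is Hurwitz
```

The difficulty of the general problem is that the set of stabilizing $K$ can be non-convex (or empty), and no polynomial-time decision procedure is known.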