Can every semidefinite program be solved in polynomial time?

Solution 1:

Sort of, if you mean complexity in terms of the number of arithmetic operations. In the stronger notion of complexity where you take the bit-size representation of the involved numbers into account, I think the complexity is still unknown, if memory serves me right (Ramana 1995, An Exact Duality Theory for Semidefinite Programming and its Complexity Implications).

A related example:

\begin{equation} \min_y \; y_n \quad \text{subject to} \quad y_1 = 1, \quad \begin{bmatrix} y_i & y_{i-1}\\ y_{i-1} & \tfrac{1}{2} \end{bmatrix} \succeq 0, \quad i = 2,\ldots,n \end{equation}

A nice, simple-looking SDP. However, the unique optimal solution is $y_i = 2^{2^{i-1}-1}$: positive semidefiniteness of each $2\times 2$ block requires $\tfrac{1}{2}y_i - y_{i-1}^2 \ge 0$, i.e. $y_i \ge 2y_{i-1}^2$, and minimizing $y_n$ makes every one of these inequalities tight. Hence, if you represent numbers in your solution algorithm with standard integers (of course not possible in general), the number of bits required to represent the solution is exponential in $n$. The same problem arises with standard floating-point representation, since the exponent $2^{n-1}-1$ itself grows exponentially in $n$ and quickly overflows any fixed-width exponent field.
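
As a quick illustration (a minimal sketch, not part of the original answer): using exact integer arithmetic, the tight recursion $y_i = 2y_{i-1}^2$ shows the bit length of the optimal solution doubling at every step.

```python
# Sketch: bit growth of the optimal solution y_i = 2 * y_{i-1}^2, y_1 = 1.
# Python's arbitrary-precision ints make the doubly exponential growth visible.

def optimal_y(n):
    """Return the exact optimal values y_1, ..., y_n of the SDP above."""
    ys = [1]                        # y_1 = 1
    for _ in range(n - 1):
        ys.append(2 * ys[-1] ** 2)  # tight PSD constraint: y_i = 2 * y_{i-1}^2
    return ys

for i, y in enumerate(optimal_y(8), start=1):
    print(f"y_{i} = 2^{{{2**(i-1) - 1}}} needs {y.bit_length()} bits")
# The bit length is 2^(i-1): exponential in the problem size n.
```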

Solution 2:

Primal-dual interior-point methods can be used to solve SDPs in a polynomial number of iterations. If you count arithmetic operations (and ignore the bit-level complexity of operations on arbitrary-precision numbers), then each iteration can be done in polynomial time. This model of computation most accurately reflects practical computation with floating-point arithmetic.
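
For concreteness, here is a minimal sketch (assuming the cvxpy modeling package with its default SDP solver is installed; not part of the original answer) of handing the chained SDP from Solution 1 to such a solver. In floating point it succeeds only for small $n$, consistent with the bit-size caveat above.

```python
# Sketch (assumes cvxpy is installed): solve the chained SDP from Solution 1.
# Floating-point interior-point solvers break down as n grows, since the
# optimal y_n = 2^(2^(n-1)-1) overflows double precision around n = 12.
import cvxpy as cp

n = 5
y = cp.Variable(n)
constraints = [y[0] == 1]
for i in range(1, n):
    # PSD constraint [[y_i, y_{i-1}], [y_{i-1}, 1/2]] >> 0
    constraints.append(cp.bmat([[y[i], y[i - 1]],
                                [y[i - 1], 0.5]]) >> 0)

prob = cp.Problem(cp.Minimize(y[n - 1]), constraints)
prob.solve()
print(y.value)  # approximately [1, 2, 8, 128, 32768] for n = 5
```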

Another minor technical caveat is that the SDP needs to satisfy a constraint qualification such as Slater's condition.
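
For instance (an illustrative example, not from the original answer), Slater's condition fails for

\begin{equation} \min_{x} \; x_1 \quad \text{subject to} \quad \begin{bmatrix} 0 & x_1 \\ x_1 & x_2 \end{bmatrix} \succeq 0, \end{equation}

since positive semidefiniteness forces $x_1 = 0$ (the top-left entry is zero), so no feasible point makes the matrix strictly positive definite.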

See

Yurii Nesterov and Arkadii Nemirovskii, Interior-Point Polynomial Algorithms in Convex Programming, SIAM, 1994.