How to check if a symmetric $4\times4$ matrix is positive semi-definite?

  1. How does one check whether a symmetric $4\times4$ matrix is positive semi-definite?

  2. What if this matrix is also rank-deficient, say of rank 3?


Another method is to check that there are no negative pivots in row reduction (after taking into account the possibility of 0's on the diagonal). The procedure can be written recursively as follows; a code sketch is given after the list:

1) If $A$ is $1 \times 1$, then it is positive semidefinite iff $A_{11} \ge 0$.

Otherwise:

2) If $A_{11} < 0$, then $A$ is not positive semidefinite.

3) If $A_{11} = 0$, then $A$ is positive semidefinite iff the first row of $A$ is all 0 and the submatrix obtained by deleting the first row and column is positive semidefinite.

4) If $A_{11} > 0$, for each $j > 1$ subtract $A_{j1}/A_{11}$ times row 1 from row $j$, and then delete the first row and column. Then $A$ is positive semidefinite iff the resulting matrix is positive semidefinite.
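Here is a minimal NumPy sketch of the recursion above (the function name `is_psd_pivots` and the tolerance are my own choices; step 4 is carried out via the equivalent Schur-complement update rather than explicit row operations):

```python
import numpy as np

def is_psd_pivots(A, tol=1e-12):
    """Recursive pivot test for positive semidefiniteness (steps 1-4 above)."""
    A = np.asarray(A, dtype=float)
    n = A.shape[0]
    # 1) 1x1 base case.
    if n == 1:
        return A[0, 0] >= -tol
    # 2) A negative (1,1) entry rules out positive semidefiniteness.
    if A[0, 0] < -tol:
        return False
    # 3) Zero pivot: the whole first row must vanish, then recurse on the rest.
    if abs(A[0, 0]) <= tol:
        if np.any(np.abs(A[0, 1:]) > tol):
            return False
        return is_psd_pivots(A[1:, 1:], tol)
    # 4) Positive pivot: eliminate and recurse on the Schur complement.
    a = A[1:, 0]
    return is_psd_pivots(A[1:, 1:] - np.outer(a, a) / A[0, 0], tol)

# Example: a rank-3 positive semidefinite 4x4 matrix built as G^T G.
G = np.array([[1., 0., 0., 1.],
              [0., 1., 0., 1.],
              [0., 0., 1., 1.]])
print(is_psd_pivots(G.T @ G))   # True
```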


Since the matrix is symmetric, the eigenvalues will be real. Calculate the eigenvalues and check whether they are all $\geq 0$. If so, the matrix is positive semidefinite.
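Assuming NumPy is available, `numpy.linalg.eigvalsh` (intended for symmetric matrices, so it returns real eigenvalues) makes this a one-liner; a small tolerance guards against round-off:

```python
import numpy as np

A = np.array([[2., -1., 0., 0.],
              [-1., 2., -1., 0.],
              [0., -1., 2., -1.],
              [0., 0., -1., 2.]])

eigenvalues = np.linalg.eigvalsh(A)       # real eigenvalues, in ascending order
print(np.all(eigenvalues >= -1e-12))      # True: A is positive semidefinite
```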


As stated above, Sylvester's criterion doesn't work in this case, so you can't simply check the four leading principal minors. However, it does suffice to check that all 15 of the principal minors are nonnegative. See here for a reference.
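A brute-force sketch of this check (the helper name and tolerance are illustrative choices, not canonical) runs over all $2^4 - 1 = 15$ nonempty index subsets:

```python
import numpy as np
from itertools import combinations

def all_principal_minors_nonneg(A, tol=1e-12):
    """Check that every principal minor of the symmetric matrix A is nonnegative."""
    n = A.shape[0]
    for k in range(1, n + 1):
        for idx in combinations(range(n), k):
            sub = A[np.ix_(idx, idx)]          # principal submatrix on rows/cols idx
            if np.linalg.det(sub) < -tol:
                return False
    return True

A = np.ones((4, 4))                            # rank-1 PSD matrix
print(all_principal_minors_nonneg(A))          # True
```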

Another basic approach involves symmetric row reduction. This involves operations of the following type:

  1. Perform a row operation, and then
  2. Immediately perform the corresponding column operation.

For example, you can multiply any row by a constant, as long as you immediately multiply the corresponding column by the same constant. Note that this multiplies the diagonal entry by the square of the constant.

Using these operations, you can use a variant of Gaussian elimination to reduce any symmetric matrix to a diagonal matrix with 1's, 0's, and -1's along the diagonal. A matrix is positive semidefinite if and only if the resulting diagonal entries are all 0's and 1's.
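As a small illustration (the $2\times2$ matrix is just an example), one elimination step is the congruence $A \mapsto EAE^{T}$, and rescaling a row together with its column multiplies the diagonal entry by the square of the scale factor:

```python
import numpy as np

A = np.array([[4., 2.],
              [2., 2.]])

# Row operation R2 -= 0.5*R1, followed by the column operation C2 -= 0.5*C1.
E = np.array([[1., 0.],
              [-0.5, 1.]])
D = E @ A @ E.T                 # diag(4, 1): symmetry is preserved

# Scale row 1 and column 1 by 1/2: the diagonal entry 4 becomes 4 * (1/2)^2 = 1.
S = np.diag([0.5, 1.0])
print(S @ D @ S.T)              # diag(1, 1): only 1's, so A is positive semidefinite
```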


Let's say your matrix is $A$.

You can check the eigenvalues. If all eigenvalues are $\geq 0$, the matrix is positive semi-definite (if all eigenvalues are $> 0$, it is positive definite).

It might be possible to use the Gershgorin circle theorem instead of calculating the eigenvalues explicitly: if every diagonal element is nonnegative and at least as large as the sum of the absolute values of the other elements in its row (or column), then every Gershgorin disc lies in $[0, \infty)$ and the matrix is positive semi-definite. Note that this condition is sufficient but not necessary.
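A sketch of this sufficient test (the helper name is my own; the test can report `False` for matrices that are nevertheless positive semi-definite):

```python
import numpy as np

def gershgorin_psd_sufficient(A):
    """True if every Gershgorin disc of A lies in [0, inf); then A is PSD."""
    d = np.diag(A)
    radii = np.sum(np.abs(A), axis=1) - np.abs(d)   # row sums of off-diagonal |entries|
    return bool(np.all(d >= radii))

A = np.array([[3., 1., 1., 0.],
              [1., 4., -1., 1.],
              [1., -1., 3., 0.],
              [0., 1., 0., 2.]])
print(gershgorin_psd_sufficient(A))   # True, so A is positive semi-definite
```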

You can try to find a positive semi-definite matrix $B$ such that $B^2 = A$ (such a $B$, if it exists, is unique). This is in general done using the diagonalization of the matrix, so it will probably be easier to just calculate the eigenvalues.
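If you do want to construct $B$, a sketch via the eigendecomposition might look as follows; note that it already requires the eigenvalues, which is why checking them directly is simpler:

```python
import numpy as np

A = np.array([[2., 1.],
              [1., 2.]])

w, V = np.linalg.eigh(A)                  # A = V diag(w) V^T
if np.all(w >= -1e-12):
    B = V @ np.diag(np.sqrt(np.clip(w, 0, None))) @ V.T   # the unique PSD square root
    print(np.allclose(B @ B, A))          # True
```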

You cannot use the naive modification of Sylvester's criterion ("all leading principal minors are non-negative") to determine positive semidefiniteness. For example, $\begin{pmatrix} 0 & 0 \\ 0 & -1 \end{pmatrix}$ has both leading principal minors equal to $0$, yet it is not positive semidefinite.

If $A$ is $4 \times 4$ and has rank 3, then $0$ is an eigenvalue (there is a nonzero vector $v$ with $Av = 0$). This does not prevent $A$ from being positive semi-definite, but it cannot be positive definite.