How to tell if a matrix is a covariance matrix?

How can we tell whether these matrices are valid covariance matrices? $$ C_1= \begin{pmatrix} 1 & -1 & 2 \\ -1 & 2 & -1 \\ 2 & -1 & 1 \\ \end{pmatrix}, \qquad C_2= \begin{pmatrix} 4 & -1 & 1 \\ -1 & 4 & -1 \\ 1 & -1 & 4 \\ \end{pmatrix} $$

I know that

  1. $C_{xy}=C_{yx}$ (the matrix is symmetric)
  2. $C_{xx}= \mathrm{Var}[x] \ge 0$ (the diagonal entries are all non-negative)

But are there any other properties that a covariance matrix must satisfy?


A square matrix is a covariance matrix of some random vector if and only if it is symmetric and positive semi-definite. Positive semi-definite means that $$ x^{T}Cx\ge0 $$ for every real vector $x$, where $x^T$ is the transpose of the vector $x$. (One direction is easy to see: if $C$ is the covariance matrix of a random vector $X$, then $x^TCx=\mathrm{Var}(x^TX)\ge0$; conversely, every symmetric positive semi-definite matrix is the covariance matrix of some multivariate normal vector.)
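If you would rather check this numerically than by hand, an equivalent criterion is that a symmetric matrix is positive semi-definite exactly when all of its eigenvalues are non-negative. Here is a minimal sketch in Python/NumPy (the helper name `is_covariance_matrix` and the tolerance are my own choices, not anything standard):

```python
import numpy as np

def is_covariance_matrix(C, tol=1e-10):
    """Return True if C is symmetric and positive semi-definite."""
    C = np.asarray(C, dtype=float)
    if not np.allclose(C, C.T):
        return False                        # not symmetric
    eigenvalues = np.linalg.eigvalsh(C)     # eigvalsh assumes a symmetric matrix
    return bool(np.all(eigenvalues >= -tol))

C1 = np.array([[ 1, -1,  2],
               [-1,  2, -1],
               [ 2, -1,  1]])
C2 = np.array([[ 4, -1,  1],
               [-1,  4, -1],
               [ 1, -1,  4]])

print(is_covariance_matrix(C1))  # False
print(is_covariance_matrix(C2))  # True
```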

We have that \begin{align*} x^TC_1x &=\left(\begin{array}{ccc}x_1 & x_2 & x_3\end{array}\right)\left(\begin{array}{ccc}1 & -1 & 2 \\-1 & 2 & -1 \\2 & -1 & 1\end{array}\right)\left(\begin{array}{c}x_1 \\x_2 \\x_3\end{array}\right)\\ &=\left(\begin{array}{ccc}x_1-x_2+2x_3 & -x_1+2x_2-x_3 & 2x_1-x_2+x_3\end{array}\right)\left(\begin{array}{c}x_1 \\x_2 \\x_3\end{array}\right)\\ &=x_1^2+2x_2^2+x_3^2-2x_1x_2+4x_1x_3-2x_2x_3\\ &=(x_1-x_2+x_3)^2+x_2^2+2x_1x_3. \end{align*} If we take $x^T=\left(\begin{array}{ccc}1 & 0 & -1\end{array}\right)$, then $x^TC_1x=-2$. Hence, $C_1$ is not a covariance matrix.
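As a quick sanity check of this counterexample, you can evaluate the quadratic form and the eigenvalues directly (a small NumPy snippet reusing the same matrix and vector):

```python
import numpy as np

C1 = np.array([[ 1, -1,  2],
               [-1,  2, -1],
               [ 2, -1,  1]])
x = np.array([1, 0, -1])

print(x @ C1 @ x)              # -2, so C1 is not positive semi-definite
print(np.linalg.eigvalsh(C1))  # [-1.  1.  4.]: one eigenvalue is negative
```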

We have that \begin{align*} x^TC_2x &=\left(\begin{array}{ccc}x_1 & x_2 & x_3\end{array}\right)\left(\begin{array}{ccc}4 & -1 & 1 \\-1 & 4 & -1 \\1 & -1 & 4\end{array}\right)\left(\begin{array}{c}x_1 \\x_2 \\x_3\end{array}\right)\\ &=\left(\begin{array}{ccc}4x_1-x_2+x_3 & -x_1+4x_2-x_3 & x_1-x_2+4x_3\end{array}\right)\left(\begin{array}{c}x_1 \\x_2 \\x_3\end{array}\right)\\ &=4x_1^2+4x_2^2+4x_3^2-2x_1x_2+2x_1x_3-2x_2x_3\\ &=3(x_1^2+x_2^2+x_3^2)+(x_1-x_2+x_3)^2\ge0. \end{align*} Hence, $C_2$ is symmetric and positive semi-definite, so it is a valid covariance matrix.
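For $C_2$ you can also try a Cholesky factorization, which succeeds exactly for symmetric positive definite matrices. Note that this is a slightly stronger condition than positive semi-definite, so a singular PSD matrix would fail this particular test even though it is a valid covariance matrix. A sketch:

```python
import numpy as np

C2 = np.array([[ 4, -1,  1],
               [-1,  4, -1],
               [ 1, -1,  4]])
try:
    L = np.linalg.cholesky(C2)   # raises LinAlgError unless C2 is positive definite
    print("C2 is positive definite; Cholesky factor:\n", L)
except np.linalg.LinAlgError:
    print("C2 is not positive definite")
```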