Generalizing Cauchy-Schwarz for more than two vectors

For a complex inner product space $X$, the Cauchy-Schwarz inequality states $$ | \langle x,y \rangle |^2 \leq \langle x,x\rangle \cdot \langle y, y\rangle , $$ for any $x,y \in X$. Equality holds if and only if $x$ and $y$ are linearly dependent. I noticed that this can be restated as: $$ \left|\begin{array}{cc} \langle v_1, v_1 \rangle & \langle v_1, v_2\rangle \\ \langle v_2, v_1 \rangle & \langle v_2, v_2\rangle \\ \end{array}\right| \geq 0$$ with strict inequality if $\{v_i \}$ is linearly independent. Does this (somehow) generalize to $n$ vectors? That is, does the following hold: $$ \left|\begin{array}{cccc} \langle v_1, v_1 \rangle & \langle v_1, v_2\rangle & \cdots &\langle v_1, v_n \rangle \\ \langle v_2, v_1 \rangle & \langle v_2, v_2\rangle & \cdots &\langle v_2, v_n \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle v_n, v_1 \rangle & \langle v_n, v_2\rangle & \cdots &\langle v_n, v_n \rangle \end{array}\right| \geq 0$$
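To spell out the equivalence in the $2\times 2$ case: since $\langle v_2, v_1 \rangle = \overline{\langle v_1, v_2 \rangle}$, the determinant expands to $$ \langle v_1, v_1 \rangle \langle v_2, v_2 \rangle - |\langle v_1, v_2 \rangle|^2 \geq 0, $$ which is exactly the Cauchy-Schwarz inequality.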

At the very least, can we prove that the above determinant is non-zero if $\{v_i \}$ is linearly independent? I came across this while working on a functional analysis problem set, but it isn't a homework problem.


EDIT: For those tagging this as a duplicate, I see it as different because this question specifically concerns the inequality, not just proving that the determinant is non-zero when the vectors are linearly independent. Additionally, this post specifically suggests a connection to Cauchy-Schwarz that isn't mentioned in the other post.

As a commenter (Algebraic) pointed out, this matrix is called the Gram matrix of the vectors $\{v_i\}$; Wikipedia states that this matrix is positive semi-definite, and is positive definite when the vectors are linearly independent. This proves that the determinant is indeed greater than or equal to zero for arbitrary $\{v_i\}$ and is strictly positive when the $\{v_i\}$ are linearly independent.
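As a quick numerical sanity check (not a proof), here is a small NumPy sketch; the dimensions, seed, and vectors are illustrative choices, not anything from the original problem:

```python
import numpy as np

rng = np.random.default_rng(0)

# n random complex vectors in C^d; with d >= n they are almost surely independent
d, n = 5, 3
V = rng.standard_normal((d, n)) + 1j * rng.standard_normal((d, n))

# Gram matrix: G[i, j] = <v_i, v_j> = v_i^* v_j
G = np.array([[np.vdot(V[:, i], V[:, j]) for j in range(n)] for i in range(n)])

# The determinant of a Gram matrix is real and non-negative;
# for linearly independent vectors it should be strictly positive.
print(np.linalg.det(G).real)  # > 0 (any tiny imaginary part is rounding noise)
```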


Solution 1:

Here is an easy argument. Let $x$ be the matrix $$ x=\begin{bmatrix}v_1&v_2&\cdots&v_n\end{bmatrix}. $$ Then $$ x^*x=\begin{bmatrix} v_1^*v_1&v_1^*v_2&\cdots&v_1^*v_n\\ v_2^*v_1&v_2^*v_2&\cdots&v_2^*v_n\\ \vdots & \vdots & \ddots & \vdots \\ v_n^*v_1&v_n^*v_2&\cdots&v_n^*v_n\\ \end{bmatrix} =\begin{bmatrix} \langle v_1, v_1 \rangle & \langle v_1, v_2\rangle & \cdots &\langle v_1, v_n \rangle \\ \langle v_2, v_1 \rangle & \langle v_2, v_2\rangle & \cdots &\langle v_2, v_n \rangle \\ \vdots & \vdots & \ddots & \vdots \\ \langle v_n, v_1 \rangle & \langle v_n, v_2\rangle & \cdots &\langle v_n, v_n \rangle \end{bmatrix}. $$ Now $x^*x$ is positive-semidefinite, since $c^*x^*xc=\|xc\|^2\geq0$ for every vector $c$; its eigenvalues are therefore non-negative, and so $\det x^*x\geq0$.
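One can also check this construction numerically. The following self-contained sketch (with arbitrary illustrative vectors, as above) verifies that $x^*x$ assembled as a matrix product agrees entrywise with the Gram matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 5, 3
X = rng.standard_normal((d, n)) + 1j * rng.standard_normal((d, n))  # columns are v_1..v_n

# x^* x assembled directly, versus the entrywise Gram matrix
XhX = X.conj().T @ X
G = np.array([[np.vdot(X[:, i], X[:, j]) for j in range(n)] for i in range(n)])

print(np.allclose(XhX, G))           # True: x^* x is exactly the Gram matrix
print(np.linalg.det(XhX).real >= 0)  # True: positive semi-definite => det >= 0
```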

If $v_1,\ldots,v_n$ are linearly dependent, there exist coefficients $c_1,\ldots,c_n$, not all zero, with $c_1v_1+\cdots+c_nv_n=0$. Writing $c=(c_1,\ldots,c_n)^T$, this says $xc=0$ with $c\ne0$. But then $x^*xc=0$, so $x^*x$ has a nontrivial kernel, and $\det x^*x=0$.
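To illustrate this direction numerically (a sketch with a deliberately dependent third vector, chosen only for demonstration):

```python
import numpy as np

v1 = np.array([1.0, 2.0, 0.0])
v2 = np.array([0.0, 1.0, 1.0])
v3 = 2.0 * v1 - 3.0 * v2  # deliberately a linear combination of v1, v2

X = np.column_stack([v1, v2, v3])
G = X.T @ X  # real vectors, so x^* x = x^T x

print(np.linalg.det(G))  # ~0, up to floating-point rounding
```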

Conversely, if $\det x^*x=0$, there exists a nonzero $c$ with $x^*xc=0$. But then $\|xc\|^2=c^*x^*xc=0$, so $xc=0$ and $v_1,\ldots,v_n$ are linearly dependent.