Vandermonde Determinant

The hint that Stewart gives is basically asking you to prove that the Vandermonde determinant above is a scalar multiple of $$\prod_{1 \leq j < k \leq n } (z_k - z_j).$$

Once you have done this, there is no way for the product to be zero, as that would mean that $z_k = z_j$ for some $j < k$, contradicting the fact that your complex numbers were distinct to start with.


More on Stewart's hint: Can you see why the determinant above is a homogeneous polynomial in the variables $z_1,\ldots,z_n$ of degree $1 + 2 + \ldots + (n-1) = \frac{n(n-1)}{2}$? To see why, it suffices to add up the degrees of the entries along the main diagonal, since every term in the expansion of the determinant has the same total degree. Now suppose the Vandermonde determinant above is divisible by the product $\prod_{1 \leq j < k \leq n} (z_k - z_j)$.
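That homogeneity claim is easy to sanity-check symbolically. Here is a small SymPy script of my own (not part of Stewart's hint), for, say, $n = 4$: every monomial of the expanded determinant should have total degree $1+2+3 = 6$.

```python
import sympy as sp

n = 4
z = sp.symbols(f"z1:{n + 1}")                   # z1, z2, z3, z4
# rows (1, z_i, z_i^2, ..., z_i^{n-1})
V = sp.Matrix(n, n, lambda i, j: z[i] ** j)
d = sp.expand(V.det())
# collect the total degree of every monomial in the expansion
degs = {sum(m) for m in sp.Poly(d, *z).monoms()}
print(degs)  # {6}, i.e. homogeneous of degree n(n-1)/2
```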

Then, if the degree of the Vandermonde determinant equals the degree of that product, the quotient from the division algorithm is a scalar and your problem is done. Now what is the degree of the polynomial $\prod_{1 \leq j < k \leq n } (z_k - z_j)$?

The degree of that polynomial is the number of factors in the product. The number of factors is precisely the number of ways to choose two things from $n$ things without repeats (that's what the $j < k$ condition is saying). This is precisely $$\binom{n}{2} = \frac{n!}{(n-2)!2!} = \frac{n(n-1)}{2}.$$
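Both counts — that the product has $\binom{n}{2}$ linear factors, and in fact that the scalar multiple is exactly $1$ with the row convention $(1, z_i, \ldots, z_i^{n-1})$ — can be checked symbolically. A SymPy sketch of my own (the value of $n$ is an arbitrary choice):

```python
import sympy as sp

n = 4
z = sp.symbols(f"z1:{n + 1}")
V = sp.Matrix(n, n, lambda i, j: z[i] ** j)
# the product over 1 <= j < k <= n (0-based indices here)
prod = sp.prod(z[k] - z[j] for j in range(n) for k in range(j + 1, n))
# number of linear factors = number of pairs j < k = binomial(n, 2)
num_factors = sum(1 for j in range(n) for k in range(j + 1, n))
print(num_factors, sp.binomial(n, 2))  # 6 6
# the determinant equals the product exactly (scalar multiple 1)
print(sp.expand(V.det() - prod))       # 0
```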


If you're not entirely convinced by Stewart's approach above, here is a proof by induction. Replace all the $z_i$'s above with $x_i$'s (sorry about this). Let $V_n$ be the $n \times n$ Vandermonde matrix above. We claim that its determinant is given by $$ \det(V_n) = \prod_{1\leq i < j \leq n} (x_j - x_i). $$ We proceed by induction: the base case $n=2$ is easy enough. So suppose the product formula for $\det(V_n)$ holds for $n=k$. Then for $n=k+1$, we perform column operations (and later, row operations) to get

$$\begin{aligned} \det (V_{k+1}) &= \det \left[ \begin{array}{ccccc} 1 & x_1 & x_1^2 & \ldots & x_1^{k} \\ 1 & x_2 & x_2^2 & \ldots & x_2^{k} \\ \vdots & & & & \vdots \\ 1 & x_{k+1} & x_{k+1}^2 & \ldots & x_{k+1}^{k} \end{array} \right] \\ &= \det \left[ \begin{array}{ccccc} 1 & 0 & 0 & \ldots & 0 \\ 1 & x_2 - x_1 & x_2(x_2 - x_1) & \ldots & x_2^{k-1}(x_2 - x_1) \\ \vdots & & & & \vdots \\ 1 & x_{k+1} - x_1 & x_{k+1}(x_{k+1} - x_1) & \ldots & x_{k+1}^{k-1}(x_{k+1} - x_1) \end{array} \right] \\ &= \prod_{i=2}^{k+1}(x_i - x_1)\, \det \left[ \begin{array}{cccc} 1 & x_2 & \ldots & x_2^{k-1} \\ \vdots & & & \vdots \\ 1 & x_{k+1} & \ldots & x_{k+1}^{k-1} \end{array} \right] \\ &= \Bigg[ \prod_{i=2}^{k+1}(x_i - x_1) \Bigg] \prod_{2 \leq i<j \leq k+1} (x_j - x_i) \quad (\text{induction hypothesis}) \\ &= \prod_{1 \leq i<j \leq k+1} (x_j - x_i). \end{aligned}$$

Here the second equality comes from the column operations $C_j \mapsto C_j - x_1 C_{j-1}$, applied for $j = k+1, \ldots, 2$, which do not change the determinant. The third comes from expanding along the first row and then factoring $(x_i - x_1)$ out of row $i$ of the resulting $k \times k$ minor, which is the Vandermonde matrix in $x_2, \ldots, x_{k+1}$.
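The recursion in the display above, $\det V_{k+1} = \prod_{i=2}^{k+1}(x_i - x_1) \cdot \det V_k(x_2,\ldots,x_{k+1})$, can also be checked numerically. A NumPy sketch of my own (the sample points are arbitrary):

```python
import numpy as np

x = np.array([2.0, 3.0, 5.0, 7.0])
# rows (1, x_i, x_i^2, x_i^3)
V = np.vander(x, increasing=True)
lhs = np.linalg.det(V)
# recursion from the proof: peel off the factors (x_i - x_1),
# leaving the smaller Vandermonde determinant in x_2, ..., x_{k+1}
rhs = np.prod(x[1:] - x[0]) * np.linalg.det(np.vander(x[1:], increasing=True))
print(lhs, rhs)  # both approximately 240
```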


Consider the cofactor matrix of $V$, the Vandermonde matrix; call it $C$. Each column of $C$ is orthogonal to every column of $V$ except the column of the same index. This is easily seen since the transpose of the cofactor matrix is the adjugate, so $C^\top V = \operatorname{adj}(V)\, V = \det(V)\, I$.

Let $\mathbf{v}_i$ denote the $i$th column of $V$ and $\mathbf{c}_k$ the $k$th column of the cofactor matrix. (It is convenient here to take $V$ with columns $\mathbf{v}_i = (1, z_i, z_i^2, \ldots, z_i^{n-1})^\top$, i.e. the transpose of the matrix above; this does not change the determinant.) From the definition of the cofactor matrix we have

$$ \mathbf{v}_i^\top \mathbf{c}_k = \begin{cases} 0 & \text{for } k \ne i \\ \det(V) & \text{for } k = i \end{cases} \tag{1}$$

Consider the columns of $C$ as coefficient vectors of polynomials in a variable $x$: $$p_i(x) = C_{0i} + C_{1i}x + C_{2i}x^2 + \dots$$ From (1) we know that $p_i(z_k) = 0$ for all $k \ne i$, so $p_i$ has the factor $(x - z_k)$ for every $k \ne i$, or equivalently the single factor

$$\prod_{k\ne i}(x-z_k)$$

Thus, for some scalar $\Delta$, we may write it as $$p_i(x) = \Delta\prod_{k\ne i}(x-z_k).$$ We also know from (1) that $$p_i(z_i) = \det(V),$$ so $$p_i(z_i) = \det(V) = \Delta\prod_{k\ne i}(z_i - z_k).$$

This is sufficient to show that the determinant vanishes if $z_i = z_k$ for some $k \ne i$. If you further solve for $\Delta$, you will find a very familiar formula if you have ever looked at polynomial interpolation...
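To make that last remark concrete: dividing $p_i$ by $\det(V)$ gives the $i$th Lagrange basis polynomial $\ell_i$, which is $1$ at $z_i$ and $0$ at the other nodes. Equivalently, the columns of the inverse of the Vandermonde matrix $M$ with $M_{mj} = z_m^{\,j}$ are the coefficient vectors of the $\ell_i$. A NumPy sketch of my own (the nodes $z = (1, 2, 4)$ are an arbitrary choice):

```python
import numpy as np

z = np.array([1.0, 2.0, 4.0])
M = np.vander(z, increasing=True)        # M[m, j] = z_m ** j
L = np.linalg.inv(M)                     # column i = coefficients of ell_i(x)
# evaluating each ell_i at the nodes gives the identity: ell_i(z_k) = delta_{ik}
print(np.round(M @ L, 10))
# compare column 0 with the explicit Lagrange formula
i = 0
direct = np.poly(np.delete(z, i))[::-1]  # ascending coeffs of prod_{k != i}(x - z_k)
direct = direct / np.prod(z[i] - np.delete(z, i))
print(np.allclose(L[:, i], direct))      # True
```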