Proof of Vandermonde Matrix Inverse Formula

I'm working through Exercise 40 from Section 1.2.3 of Knuth's *The Art of Computer Programming*, Volume 1, but I find myself unable to produce a rigorous proof. The existing proof at ProofWiki seems suspect to me, and it is not quite clear enough (for me, at least) in some of its steps; in particular, I don't see how to "[identify] the $k$th order coefficient in [the] two polynomials."

The problem is this: given a Vandermonde matrix $[x_j^i]_n$, show that the inverse is given by

$$ [b_{ij}]_n = \left[ \frac{ \sum_{\substack{1 \leq k_1 < \dotsc < k_{n-j} \leq n\\k_1,\dotsc,k_{n-j} \neq i}} (-1)^{j-1} x_{k_1} \dotsc x_{k_{n-j}} }{ x_i \prod_{\substack{1 \leq k \leq n\\k \neq i}} (x_k - x_i) } \right]_n\text{.} $$
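As a sanity check (this is just my own SymPy script, not anything from Knuth, and the choice $n = 3$ is arbitrary), I verified the claimed formula symbolically for small $n$, so I'm fairly confident the statement itself is correct; it's a rigorous proof that I can't produce:

```
from itertools import combinations
from functools import reduce
from operator import mul

from sympy import Matrix, eye, simplify, symbols

n = 3
x = symbols(f'x1:{n + 1}')  # the symbols x1, ..., xn

def product(factors):
    """Product of an iterable of factors (an empty product is 1)."""
    return reduce(mul, factors, 1)

# Vandermonde matrix with entries x_j^i, indices 1 <= i, j <= n as in the exercise.
A = Matrix(n, n, lambda i, j: x[j] ** (i + 1))

def b(i, j):
    """Claimed inverse entry b_{ij} (1-based indices)."""
    others = [k for k in range(1, n + 1) if k != i]
    numerator = sum(
        (-1) ** (j - 1) * product(x[k - 1] for k in ks)
        for ks in combinations(others, n - j)
    )
    denominator = x[i - 1] * product(x[k - 1] - x[i - 1] for k in others)
    return numerator / denominator

B = Matrix(n, n, lambda i, j: b(i + 1, j + 1))

# Entry (i, j) of B*A is the sum over t of b_{it} x_j^t, so this checks
# the intermediate identity quoted below as well as B being a left inverse.
assert (B * A).applyfunc(simplify) == eye(n)
```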

The author gives a hint, stating that the sum in the numerator above is just the coefficient of $x^{j-1}$ in the polynomial $(x_1-x)\dotsc(x_n-x)/(x_i-x)$; and he gives an intermediate result, writing out the $(i, j)$ entry of the product of the inverse with the matrix as

$$ \sum_{1 \leq t \leq n}b_{it}x_j^t = \frac{ x_j \prod_{\substack{1 \leq k \leq n\\k \neq i}} (x_k - x_j) }{ x_i \prod_{\substack{1 \leq k \leq n\\k \neq i}} (x_k - x_i) } = \delta_{ij}\text{.} $$
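For what it's worth, here is how far I can get on my own, though I'm not at all sure it is the intended argument. The hinted polynomial is $(x_1-x)\dotsc(x_n-x)/(x_i-x) = \prod_{k \neq i}(x_k - x)$, and expanding this product of $n - 1$ factors in powers of $x$ gives

$$ \prod_{\substack{1 \leq k \leq n\\k \neq i}} (x_k - x) = \sum_{1 \leq t \leq n} \Biggl( (-1)^{t-1} \sum_{\substack{1 \leq k_1 < \dotsc < k_{n-t} \leq n\\k_1,\dotsc,k_{n-t} \neq i}} x_{k_1} \dotsc x_{k_{n-t}} \Biggr) x^{t-1}\text{,} $$

since a term in $x^{t-1}$ arises by taking $-x$ from $t-1$ of the factors and $x_k$ from the remaining $n-t$. The inner sum is exactly the numerator of $b_{it}$, so multiplying the coefficient of $x^{t-1}$ by $x_j^t$ and summing over $t$ amounts to evaluating $x \prod_{k \neq i}(x_k - x)$ at $x = x_j$ and dividing by the denominator $x_i \prod_{k \neq i}(x_k - x_i)$, which is the displayed intermediate result. The right-hand side vanishes for $j \neq i$, because the factor $x_j - x_j$ then occurs in the numerator product, and it equals $1$ for $j = i$. What I can't see is how to turn this into the kind of rigorous argument Knuth had in mind, or how it corresponds to the identification of coefficients in the ProofWiki proof.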

The only other hint as to the type of solution he was expecting is a reference to A. de Moivre's *The Doctrine of Chances*, 2nd edition, pp. 197–199, which deals with polynomial recurrence relations and difference products (available here).

At the least, I was hoping someone could either verify that the ProofWiki proof is correct and fill in exactly how one identifies the $k$th order coefficient there, or else explain a proof strategy in which the intermediate result above is obtained at some point before the final result.

Thanks so much for any help.


I've just done some editing on the proof of the inverse of the Vandermonde matrix on ProofWiki:

https://proofwiki.org/wiki/Inverse_of_Vandermonde_Matrix

I hope that was what you were asking for :)