Are translations of a polynomial linearly independent?
Solution 1:
Consider the matrix of coefficients: each row holds the coefficients of one translate $P(x + a_i)$, and each column corresponds to a power of $x$. Any linear dependency among the columns amounts to a nonzero polynomial of degree at most $n$ in the point, which would then have $n+1$ distinct roots. Since that is impossible, the columns are independent, so the matrix is nonsingular and the translates (its rows) are independent as well.
Example (almost Vandermonde): $P(x) = x^2$: $$\begin{pmatrix} 1 & 2a & a^2 \\ 1 & 2b & b^2 \\ 1 & 2c & c^2 \end{pmatrix}$$ A linear dependency among the columns would be a nonzero polynomial of degree at most $2$ having three distinct roots $a, b, c$.
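As a sanity check, this determinant can be computed symbolically. The sketch below (my own illustration, not part of the original answer; it assumes SymPy is available) confirms that the determinant is twice the Vandermonde determinant, hence non-zero whenever $a, b, c$ are distinct.

```python
# Illustrative check with SymPy (assumed available); not from the original answer.
import sympy as sp

a, b, c = sp.symbols('a b c')

# Rows: coefficients of P(x + t) = x^2 + 2 t x + t^2 for t = a, b, c,
# in the column order (coeff of x^2, coeff of x, constant term).
M = sp.Matrix([
    [1, 2*a, a**2],
    [1, 2*b, b**2],
    [1, 2*c, c**2],
])

det = sp.factor(M.det())
# The determinant equals 2 (b - a)(c - a)(c - b): twice the Vandermonde
# determinant, so it vanishes only when two of the points coincide.
assert sp.simplify(det - 2*(b - a)*(c - a)*(c - b)) == 0
```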
EDIT: Just to clarify what happens in the general case, here is the matrix corresponding to $P(x) = x^2 + \alpha x + \beta$: $$\begin{pmatrix} 1 & 2a + \alpha & a^2 + \alpha a + \beta \\ 1 & 2b + \alpha & b^2 + \alpha b + \beta \\ 1 & 2c + \alpha & c^2 + \alpha c + \beta \end{pmatrix}$$ Since the polynomial in column $t$ has degree exactly $t$ (counting from zero), any non-zero combination of the columns results in a non-zero polynomial (consider the rightmost column with a non-zero coefficient).
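One can also verify directly that $\alpha$ and $\beta$ drop out of the determinant: column operations (adding multiples of earlier columns) reduce the general matrix to the $P(x) = x^2$ case. A hedged SymPy sketch (my illustration, not from the original answer):

```python
# Illustrative check with SymPy (assumed available); not from the original answer.
import sympy as sp

a, b, c, alpha, beta = sp.symbols('a b c alpha beta')

def row(t):
    # Coefficients of P(x + t) for P(x) = x^2 + alpha*x + beta,
    # ordered as (coeff of x^2, coeff of x, constant term).
    return [1, 2*t + alpha, t**2 + alpha*t + beta]

M = sp.Matrix([row(a), row(b), row(c)])

# Adding multiples of earlier columns to later ones eliminates alpha and
# beta without changing the determinant, so det M is the same as for x^2.
det = sp.factor(M.det())
assert alpha not in det.free_symbols and beta not in det.free_symbols
assert sp.simplify(det - 2*(b - a)*(c - a)*(c - b)) == 0
```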
Solution 2:
Here is a fancy proof that uses operators and differentiation.
Let $T_a$ be the translation operator which maps polynomials to their translates
$$ T_a p(x) = p(x + a) .$$
Restricting attention to polynomials of degree $n$ or lower, this is a map from an $(n+1)$-dimensional vector space to itself. The question to be resolved is whether the $n+1$ translates $T_{a_0} p, T_{a_1} p, \dots, T_{a_n} p$ of a polynomial $p$ of degree $n$ are linearly independent.
It is well-known that the translation operator can be written as the exponentiation of the differentiation operator $Dp(x) := p'(x)$ as follows
$$ T_a = e^{aD} = 1 + \frac1{1!} aD + \frac1{2!} a^2D^2 + \dots + \frac1{n!} a^n D^n .$$
This is Taylor's theorem. Note that $D$ is nilpotent on the space of polynomials of degree at most $n$, i.e. $D^{n+1} = 0$, so the series is actually a finite sum.
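This identity can be checked directly for a sample polynomial. The sketch below (my illustration, with an arbitrarily chosen $p$ of degree $4$; SymPy assumed available) sums the truncated exponential series and compares it with the direct translation:

```python
# Illustrative check with SymPy (assumed available); p is an arbitrary example.
import sympy as sp

x, a = sp.symbols('x a')
n = 4
p = x**4 - 3*x**2 + x + 7  # some polynomial of degree n = 4

# Truncated exponential of the differentiation operator:
# (e^{aD} p)(x) = sum_{k=0}^{n} a^k / k! * (D^k p)(x); terms with k > n vanish.
translate = sum(a**k / sp.factorial(k) * sp.diff(p, x, k) for k in range(n + 1))

# This agrees with the direct translation p(x + a), i.e. Taylor's theorem.
assert sp.expand(translate - p.subs(x, x + a)) == 0
```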
Now, the point is that for any polynomial $p(x)$ of degree $n$, its derivatives $p(x), Dp(x), D^2p(x), \dots, D^n p(x)$ are linearly independent because the degree decreases by $1$ with each differentiation. Hence, they form a basis of our vector space of polynomials. The expansion of the translates $T_a p(x)$ in terms of this new basis is given by the expression above.
To show that the translates $T_{a_k} p(x)$ are linearly independent, we only have to check that their matrix with respect to the new basis is nonsingular. It reads
$$\begin{pmatrix} 1 & 1 & \dots & 1 \\ a_0 & a_1 & \dots & a_n \\ \frac1{2!} a_0^2 & \frac1{2!} a_1^2 & \dots & \frac1{2!} a_n^2 \\ \vdots & \vdots & & \vdots \\ \frac1{n!} a_0^n & \frac1{n!} a_1^n & \dots & \frac1{n!} a_n^n \end{pmatrix}$$
But its determinant is a non-zero multiple of the Vandermonde determinant, namely $\left(\prod_{k=0}^{n} \frac1{k!}\right) \prod_{i < j} (a_j - a_i)$, which is non-zero if and only if the values $a_0, a_1, \dots, a_n$ are pairwise distinct.
(Another way to see that this matrix is non-singular is to note that the Vandermonde determinant is related to the interpolation problem, whose solution is unique.)
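For a small case this determinant identity can be verified symbolically. The sketch below (my illustration, with $n = 3$ chosen for speed; SymPy assumed available) compares the determinant with $\prod_k \frac1{k!}$ times the Vandermonde determinant:

```python
# Illustrative check with SymPy (assumed available); n = 3 chosen for speed.
import sympy as sp

n = 3
a = sp.symbols('a0:%d' % (n + 1))  # the translation amounts a_0, ..., a_n

# Entry (i, j) is a_j^i / i!, matching the matrix in the answer.
M = sp.Matrix(n + 1, n + 1, lambda i, j: a[j]**i / sp.factorial(i))

# Vandermonde determinant: product of (a_j - a_i) over i < j.
vandermonde = sp.Mul(*[a[j] - a[i]
                       for i in range(n + 1) for j in range(i + 1, n + 1)])
factor = sp.Mul(*[sp.Rational(1, sp.factorial(k)) for k in range(n + 1)])

# det M = (prod_k 1/k!) * Vandermonde: non-zero iff the a_k are distinct.
assert sp.simplify(M.det() - factor * vandermonde) == 0
```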