$\sum_i x_i^n = 0$ for all $n$ implies $x_i = 0$

Apply Newton's identities. They show that each elementary symmetric polynomial can be written as a polynomial (with rational coefficients and zero constant term) in the power sums. Since every power sum vanishes at the $x_i$, every elementary symmetric polynomial does too. Now consider the monic polynomial with the $x_i$ as roots. Its coefficients are, up to sign, the elementary symmetric polynomials in the $x_i$, so they are all zero. The polynomial is then $x^k$, whose only root is $0$, which shows $x_i = 0$ for all $i$.
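Concretely, Newton's identities read (in one standard sign convention)

$$p_j = e_1 p_{j-1} - e_2 p_{j-2} + \cdots + (-1)^{j-2} e_{j-1} p_1 + (-1)^{j-1} j\, e_j, \qquad j \geq 1,$$

where $p_j$ is the $j$-th power sum and $e_j$ the $j$-th elementary symmetric polynomial. If all the $p_j$ vanish, then solving for $e_j$ and inducting on $j$ gives $e_j = 0$ for every $j$, since each term on the right contains some $p_i$ as a factor. The monic polynomial with roots $x_1, \dots, x_k$ then collapses:

$$\prod_{i=1}^k (x - x_i) = x^k - e_1 x^{k-1} + e_2 x^{k-2} - \cdots + (-1)^k e_k = x^k.$$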


For a slightly different method from Potato's second answer (though the idea is essentially the same):

Without loss of generality, the system of equations can be written as: $$\left\{ \begin{array}{lcl} \lambda_1x_1 + \lambda_2x_2+ \dots + \lambda_k x_k &= &0 \\ \lambda_1x_1^2 + \lambda_2x_2^2+ \dots + \lambda_k x_k^2 & = & 0 \\ & \vdots & \\ \lambda_1x_1^k + \lambda_2x_2^k+ \dots + \lambda_k x_k^k & = & 0 \end{array} \right.$$

where $\lambda_i > 0$, $x_i \neq 0$ and $x_i \neq x_j$ for $i \neq j$. Indeed, if $x_i = x_j$, replace the terms $x_i^n + x_j^n$ with $2x_i^n$, and if $x_i = 0$, just remove it. Suppose, for contradiction, that $k \geq 1$.
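For instance (with hypothetical values $a$ and $b$), if the original tuple is $(a, a, b, 0)$ with $a, b$ nonzero and $a \neq b$, then $\sum_i x_i^n = 2a^n + b^n$, so the reduced system has $k = 2$, $(\lambda_1, \lambda_2) = (2, 1)$ and $(x_1, x_2) = (a, b)$.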

Now, for each $1 \leq j \leq k$, the $j$-th equation says precisely that the vector $(\lambda_1 x_1^j, \dots, \lambda_k x_k^j)$ lies in the hyperplane $H = \{(y_1, \dots, y_k) \mid y_1 + \dots + y_k = 0\}$, which has dimension $k-1$. Hence the family $\{ (\lambda_1 x_1^j, \dots , \lambda_k x_k^j) \mid 1 \leq j \leq k \}$ of $k$ vectors cannot be linearly independent. Therefore, the matrix

$$A:=\left( \begin{matrix} \lambda_1x_1 & \lambda_2x_2 & \dots & \lambda_k x_k \\ \lambda_1x_1^2 & \lambda_2x_2^2 & \dots & \lambda_k x_k^2 \\ \vdots & \vdots & & \vdots \\ \lambda_1x_1^k & \lambda_2x_2^k & \dots & \lambda_k x_k^k \end{matrix} \right)$$

is not invertible. On the other hand, factoring $\lambda_j x_j$ out of the $j$-th column and applying the Vandermonde determinant formula gives $$0 = \det(A) = \prod\limits_{i=1}^k \lambda_i \cdot \prod\limits_{i=1}^k x_i \cdot \prod\limits_{i<j} (x_j-x_i),$$

which is nonzero by assumption: $\lambda_i > 0$, $x_i \neq 0$, and the $x_i$ are pairwise distinct. This contradiction shows $k = 0$, i.e. all the $x_i$ were zero.
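In more detail, $A$ factors as $A = VD$, where $V$ is the Vandermonde matrix with entries $V_{ij} = x_j^{i-1}$ and $D = \operatorname{diag}(\lambda_1 x_1, \dots, \lambda_k x_k)$; indeed $(VD)_{ij} = x_j^{i-1} \cdot \lambda_j x_j = \lambda_j x_j^i$. Hence $$\det(A) = \det(V)\det(D) = \prod_{i<j} (x_j - x_i) \cdot \prod_{i=1}^k \lambda_i x_i.$$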