This is an immediate consequence of the Cauchy-Schwarz inequality, which states that if $\vec{a},\vec{b} \in \mathbb{R}^m$, then $$\left \vert \vec{a} \cdot \vec{b} \right \vert \leq \left \Vert \vec{a} \right \Vert \left \Vert \vec{b} \right \Vert,$$ with equality iff $\vec{a}$ and $\vec{b}$ are linearly dependent. In your case, take $\vec{a} = \begin{bmatrix} x_1 & x_2 & x_3 & \cdots & x_n\end{bmatrix}^T$ and $\vec{b} = \begin{bmatrix} 1 & 1 & 1 & \cdots & 1\end{bmatrix}^T$.
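Spelling out the step: with these choices, $\vec{a}\cdot\vec{b}=x_1+\cdots+x_n$, $\Vert\vec{a}\Vert=\sqrt{x_1^2+\cdots+x_n^2}$, and $\Vert\vec{b}\Vert=\sqrt{n}$, so Cauchy-Schwarz gives $$(x_1+x_2+\cdots+x_n)^2\le n\,(x_1^2+x_2^2+\cdots+x_n^2),$$ and dividing both sides by $n^2$ yields $$\left(\frac{x_1+x_2+\cdots+x_n}{n}\right)^2\le \frac{x_1^2+x_2^2+\cdots+x_n^2}{n}.$$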


There are many ways to prove this. It is called the QM-AM inequality (you may often see it stated for $x_i\ge 0$, but that case implies it for all real $x_i$: replacing each $x_i$ with $|x_i|$ leaves the squares unchanged and, since $|x_i|\ge x_i$, does not decrease the square of the mean).

In general, for $a_i>0$ and $k_2>k_1$, the Power Mean Inequality states: $$\sqrt[k_2]{\frac{a_1^{k_2}+\cdots+a_n^{k_2}}{n}}\ge \sqrt[k_1]{\frac{a_1^{k_1}+\cdots+a_n^{k_1}}{n}}$$
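Concretely, taking $k_1=1$ and $k_2=2$ gives $$\sqrt{\frac{a_1^2+\cdots+a_n^2}{n}}\ge \frac{a_1+\cdots+a_n}{n},$$ and squaring both (nonnegative) sides yields $$\frac{a_1^2+\cdots+a_n^2}{n}\ge \left(\frac{a_1+\cdots+a_n}{n}\right)^2,$$ which is the claim for positive $a_i$; the absolute-value argument above extends it to all reals.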

$1)$ Cauchy-Schwarz inequality. Multiply both sides of the claim by $n^2$; the result is Cauchy-Schwarz applied to $(1,1,\ldots,1)$ and $(x_1,x_2,\ldots,x_n)$: $$(1+1+\cdots+1)(x_1^2+x_2^2+\cdots+x_n^2)\ge (x_1+x_2+\cdots+x_n)^2$$

$2)$ Jensen's inequality. Let $f(x)=x^2$. $f$ is convex (since $f''(x)=2\ge 0$), so $$\frac{f(x_1)+f(x_2)+\cdots+f(x_n)}{n}\ge f\left(\frac{x_1+x_2+\cdots+x_n}{n}\right)$$
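For reference, the finite form of Jensen's inequality used here: if $f$ is convex and $\lambda_1,\ldots,\lambda_n\ge 0$ with $\lambda_1+\cdots+\lambda_n=1$, then $$\lambda_1 f(x_1)+\cdots+\lambda_n f(x_n)\ge f\left(\lambda_1 x_1+\cdots+\lambda_n x_n\right).$$ With $\lambda_i=\frac1n$ and $f(x)=x^2$, the two sides are exactly $\frac{x_1^2+\cdots+x_n^2}{n}$ and $\left(\frac{x_1+\cdots+x_n}{n}\right)^2$.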

$3)$ Chebyshev's sum inequality. Assume wlog $x_1\ge x_2\ge \cdots \ge x_n$. Multiply both sides of the claim by $n^2$: $$n(x_1x_1+x_2x_2+\cdots+x_nx_n)\ge (x_1+x_2+\cdots+x_n)(x_1+x_2+\cdots+x_n)$$
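The underlying statement is: if $a_1\ge a_2\ge\cdots\ge a_n$ and $b_1\ge b_2\ge\cdots\ge b_n$, then $$n(a_1b_1+a_2b_2+\cdots+a_nb_n)\ge (a_1+a_2+\cdots+a_n)(b_1+b_2+\cdots+b_n).$$ Taking $a_i=b_i=x_i$ (both sequences are sorted the same way by assumption) and dividing by $n^2$ gives the claim.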


Let $X$ be a random variable uniform on $\{x_1,x_2, \ldots,x_n \}$.

Then notice that $$ V(X)=\dfrac{x_1^2+ x_2^2 + \cdots + x_n^2}n -\left(\dfrac{x_1+x_2+\cdots+x_n}n\right)^2 . $$ Since the variance is always non-negative, we conclude.
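Spelling out the identity: since $P(X=x_i)=\frac1n$ for each $i$, $$E[X]=\frac{x_1+x_2+\cdots+x_n}{n},\qquad E[X^2]=\frac{x_1^2+x_2^2+\cdots+x_n^2}{n},$$ so the displayed expression is just $V(X)=E[X^2]-(E[X])^2$, and $V(X)\ge 0$ rearranges to the claim.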


It's just $$\sum_{1\leq i<j\leq n}(x_i-x_j)^2\geq0$$
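Expanding makes the connection explicit: $$\sum_{1\leq i<j\leq n}(x_i-x_j)^2 = (n-1)\sum_{i=1}^n x_i^2 - 2\sum_{1\leq i<j\leq n} x_ix_j = n\sum_{i=1}^n x_i^2 - \left(\sum_{i=1}^n x_i\right)^2,$$ so nonnegativity of the left side, after dividing by $n^2$, is exactly the claim.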