Cauchy-Schwarz inequality proof (but not the usual one)

If I remember correctly, Cauchy proved the inequality you don't want proven (the finite-dimensional sum version), and the generalisation you do want proven was proven later by Schwarz. So I think the former inequality ought to be called the Cauchy inequality. Oh well.

The proof of the (general) Cauchy-Schwarz inequality essentially comes down to orthogonally decomposing $x$ into a component parallel to $y$ and a component perpendicular to $y$, and using the fact that the perpendicular component has a non-negative squared norm. Assuming $y \neq 0$ (if $y = 0$ the inequality is trivial), we start from the statement: $$\left\|x - \frac{\langle x, y \rangle}{\langle y, y \rangle} y\right\|^2 \ge 0.$$ The rest is just expanding the inner product. We have, \begin{align*} 0 &\le \left\langle x - \frac{\langle x, y \rangle}{\langle y, y \rangle} y, x - \frac{\langle x, y \rangle}{\langle y, y \rangle} y\right\rangle \\ &= \langle x, x \rangle - \frac{\overline{\langle x, y \rangle}}{\langle y, y \rangle} \langle x, y \rangle - \frac{\langle x, y \rangle}{\langle y, y \rangle} \langle y, x \rangle + \frac{\langle x, y \rangle}{\langle y, y \rangle} \frac{\overline{\langle x, y \rangle}}{\langle y, y \rangle} \langle y, y\rangle \\ &= \langle x, x \rangle - 2\frac{|\langle x, y \rangle|^2}{\langle y, y \rangle} + \frac{|\langle x, y \rangle|^2}{\langle y, y \rangle} \\ &= \langle x, x \rangle - \frac{|\langle x, y \rangle|^2}{\langle y, y \rangle}. \end{align*} Rearranging yields the inequality.
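As a sanity check (not part of the proof itself), here is a minimal NumPy sketch that verifies the decomposition numerically, assuming the convention used above (linear in the first slot, conjugate-linear in the second); the helper name `inner` and the vector size are arbitrary:

```python
import numpy as np

rng = np.random.default_rng(0)

def inner(u, v):
    # <u, v> = sum_i u_i * conj(v_i): linear in u, conjugate-linear in v,
    # matching the convention used in the expansion above.
    return np.vdot(v, u)

# Random complex vectors (y != 0 with probability 1).
x = rng.standard_normal(5) + 1j * rng.standard_normal(5)
y = rng.standard_normal(5) + 1j * rng.standard_normal(5)

# Component of x perpendicular to y.
r = x - (inner(x, y) / inner(y, y)) * y

# ||r||^2 should equal <x,x> - |<x,y>|^2 / <y,y>, and be >= 0.
lhs = inner(r, r).real
rhs = (inner(x, x) - abs(inner(x, y)) ** 2 / inner(y, y)).real
assert lhs >= 0 and np.isclose(lhs, rhs)

# Rearranged, this is Cauchy-Schwarz: |<x,y>|^2 <= <x,x><y,y>.
assert abs(inner(x, y)) ** 2 <= (inner(x, x) * inner(y, y)).real + 1e-12
```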


Since $\langle tx+y,tx+y\rangle\ge0$ for all real $t$, we have (for a real inner product) $\langle x,x\rangle t^2+2\langle x,y\rangle t+\langle y, y\rangle\ge0$ for all $t$. A quadratic that is nonnegative everywhere has non-positive discriminant, so

$(2\langle x,y\rangle)^2-4\langle x,x\rangle\langle y,y\rangle\le0\implies \langle x,y\rangle^2\le\langle x,x\rangle\langle y,y\rangle=\lVert x\rVert^2 \lVert y\rVert^2$.
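As an aside (not in the original answer), the discriminant step is easy to test numerically for real vectors; this is just a sketch, with an arbitrary vector dimension:

```python
import numpy as np

rng = np.random.default_rng(1)

for _ in range(1000):
    x = rng.standard_normal(4)
    y = rng.standard_normal(4)
    # q(t) = <x,x> t^2 + 2 <x,y> t + <y,y> is nonnegative for all t,
    # so its discriminant (2<x,y>)^2 - 4<x,x><y,y> must be <= 0.
    disc = (2 * (x @ y)) ** 2 - 4 * (x @ x) * (y @ y)
    assert disc <= 1e-9
    # Equivalently, <x,y>^2 <= ||x||^2 ||y||^2.
    assert (x @ y) ** 2 <= (x @ x) * (y @ y) + 1e-9
```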


Hint: $\|x-y\|^2=\langle x-y,x-y\rangle=\langle x,x\rangle-2\langle x,y\rangle+\langle y,y\rangle=\|x\|^2-2\langle x,y\rangle+\|y\|^2$.
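One standard way to finish from this hint (not spelled out in the original, and assuming a real inner product and nonzero $x,y$): since $\|x-y\|^2\ge0$, the identity gives $2\langle x,y\rangle\le\|x\|^2+\|y\|^2$. Applying this to the unit vectors $x/\|x\|$ and $y/\|y\|$ yields $$2\,\Bigl\langle \frac{x}{\|x\|},\frac{y}{\|y\|}\Bigr\rangle\le 2 \implies \langle x,y\rangle\le\|x\|\,\|y\|,$$ and replacing $y$ by $-y$ upgrades this to $|\langle x,y\rangle|\le\|x\|\,\|y\|$.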


A proof of the inequality mimics the proof used for $\mathbf R^n$: for $\lambda\in \mathbf R$ (and $x\neq0$; the case $x=0$ is trivial) consider the inner product: $$\langle\lambda x-y,\lambda x-y\rangle=\lambda^2\langle x,x\rangle-2\lambda\langle x,y\rangle+\langle y,y\rangle $$ This is a quadratic polynomial in $\lambda$, which is nonnegative for every $\lambda$, hence its reduced discriminant is $\le 0$. Now $\langle x,x\rangle=\lVert x\rVert^2$ and similarly for $y$, so: $$\Delta'=\langle x,y\rangle^2- \langle x,x\rangle\langle y,y\rangle\le0\iff\langle x,y\rangle^2\le\lVert x\rVert^2\lVert y\rVert^2\iff\lvert\langle x,y\rangle\rvert\le \lVert x\rVert \lVert y\rVert$$ This proves the inequality.
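Aside (not part of the original answer): up to the factor $-\langle x,x\rangle$, the reduced discriminant is exactly the minimum value of this quadratic, which the following NumPy sketch checks; the helper names `q` and `lam_star` are just for illustration:

```python
import numpy as np

rng = np.random.default_rng(3)

x = rng.standard_normal(5)
y = rng.standard_normal(5)

# q(lam) = <x,x> lam^2 - 2 <x,y> lam + <y,y>
def q(lam):
    return (x @ x) * lam ** 2 - 2 * (x @ y) * lam + (y @ y)

# The vertex lam* = <x,y>/<x,x> minimizes q, and its value is
# -Delta'/<x,x>, so q(lam*) >= 0 is exactly Delta' <= 0.
lam_star = (x @ y) / (x @ x)
delta_prime = (x @ y) ** 2 - (x @ x) * (y @ y)
assert np.isclose(q(lam_star), -delta_prime / (x @ x))
assert q(lam_star) >= 0
```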

Furthermore, if the inequality is an equality, i.e. if $\Delta'=0$, the quadratic polynomial has a double root, which is equal to $\lambda=\dfrac{\langle x,y\rangle}{\langle x,x\rangle}$; in other words $$\Bigl\langle\frac{\langle x,y\rangle}{\langle x,x\rangle}x-y,\;\frac{\langle x,y\rangle}{\langle x,x\rangle}x-y\Bigr\rangle=0$$

and as an inner product is a positive definite bilinear form, this implies $$y=\frac{\langle x,y\rangle}{\langle x,x\rangle}x. $$ The converse (if $x$ and $y$ are linearly dependent, the inequality is an equality) is trivial.
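And a quick numerical illustration of the equality case (again just a sketch, not from the answer, with arbitrary values): when $y$ is a scalar multiple of $x$, equality holds and the double root recovers the multiple.

```python
import numpy as np

rng = np.random.default_rng(2)

x = rng.standard_normal(6)
y = 3.7 * x  # linearly dependent, so equality should hold

# Equality in Cauchy-Schwarz: <x,y>^2 == <x,x><y,y>.
assert np.isclose((x @ y) ** 2, (x @ x) * (y @ y))

# The double root lambda = <x,y>/<x,x> recovers the dependence y = lambda x.
lam = (x @ y) / (x @ x)
assert np.allclose(y, lam * x)
```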