Rudin Theorem 1.35 - Cauchy Schwarz Inequality

He does it because it works. With $A=\sum_{j=1}^n |a_j|^2$, $B=\sum_{j=1}^n |b_j|^2$ and $C=\sum_{j=1}^n a_j\bar b_j$, the sum $$\sum_{j=1}^n |Ba_j-Cb_j|^{2}$$ is always greater than or equal to zero, and he then shows that $$\tag 1 \sum_{j=1}^n |Ba_j-Cb_j|^{2}=B(AB-|C|^2)$$

Having assumed $B>0$, equation $(1)$ forces $AB-|C|^2\geq 0$, which is the Cauchy-Schwarz inequality.
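As a quick numerical sanity check (not part of Rudin's proof), we can verify identity $(1)$ for random complex vectors, with $A$, $B$, $C$ as defined above:

```python
# Numerical check of the identity  sum_j |B a_j - C b_j|^2 = B(AB - |C|^2)
# with A = sum |a_j|^2, B = sum |b_j|^2, C = sum a_j * conj(b_j).
import random

random.seed(0)
n = 5
a = [complex(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(n)]
b = [complex(random.uniform(-1, 1), random.uniform(-1, 1)) for _ in range(n)]

A = sum(abs(z) ** 2 for z in a)
B = sum(abs(w) ** 2 for w in b)
C = sum(z * w.conjugate() for z, w in zip(a, b))

lhs = sum(abs(B * z - C * w) ** 2 for z, w in zip(a, b))  # a sum of squares, so >= 0
rhs = B * (A * B - abs(C) ** 2)

assert abs(lhs - rhs) < 1e-9           # identity (1) holds
assert A * B - abs(C) ** 2 >= -1e-12   # hence Cauchy-Schwarz, since B > 0
```

Since the left side is visibly a sum of squares, $B>0$ immediately gives $AB-|C|^2\geq 0$.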

ADD: Let's compare two different proofs of Cauchy-Schwarz in $\Bbb R^n$.

PROOF1. The Cauchy-Schwarz inequality clearly holds whenever ${\bf x}={\bf 0}$ or ${\bf y}={\bf 0}$, so assume both vectors are nonzero. Let ${\bf x}=(x_1,\dots,x_n)$ and ${\bf y}=(y_1,\dots,y_n)$, so that $${\bf x}\cdot {\bf y}=\sum_{i=1}^n x_iy_i$$

We wish to show that $$|{\bf x}\cdot {\bf y}|\leq ||{\bf x}||\cdot ||{\bf y}||$$

Define $$X_i=\frac{x_i}{||{\bf x}||},\qquad Y_i=\frac{y_i}{||{\bf y}||}$$

Because $(x-y)^2\geq 0$ for any real $x,y$, we have $x^2+y^2\geq 2xy$. Applying this to $X_i,Y_i$ for $i=1,\dots,n$ gives $$X_i^2 + Y_i^2 \geqslant 2X_iY_i$$

and summing over $i=1,\dots,n$ gives $$\eqalign{ \frac{\sum_{i=1}^n x_i^2}{||{\bf x}||^2} + \frac{\sum_{i=1}^n y_i^2}{||{\bf y}||^2} &\geqslant \frac{2\sum_{i=1}^n x_iy_i}{||{\bf x}||\cdot||{\bf y}||} \cr \frac{||{\bf x}||^2}{||{\bf x}||^2} + \frac{||{\bf y}||^2}{||{\bf y}||^2} &\geqslant \frac{2\sum_{i=1}^n x_iy_i}{||{\bf x}||\cdot||{\bf y}||} \cr 2 &\geqslant \frac{2\sum_{i=1}^n x_iy_i}{||{\bf x}||\cdot||{\bf y}||} \cr ||{\bf x}||\cdot||{\bf y}|| &\geqslant \sum_{i=1}^n x_iy_i \cr}$$

NOTE: How may we add the absolute value signs to conclude?
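The steps of PROOF1 can be walked through numerically. One standard way to recover the absolute value (an observation of mine, not claimed from the source) is to run the same argument with $-{\bf x}$ in place of ${\bf x}$, which bounds $-{\bf x}\cdot{\bf y}$ as well:

```python
# Numerical walk-through of PROOF1 for random real vectors.
import math
import random

random.seed(1)
n = 4
x = [random.uniform(-1, 1) for _ in range(n)]
y = [random.uniform(-1, 1) for _ in range(n)]

nx = math.sqrt(sum(t * t for t in x))   # ||x||
ny = math.sqrt(sum(t * t for t in y))   # ||y||

X = [t / nx for t in x]                 # X_i = x_i / ||x||
Y = [t / ny for t in y]                 # Y_i = y_i / ||y||

# termwise: X_i^2 + Y_i^2 >= 2 X_i Y_i, from (X_i - Y_i)^2 >= 0
assert all(Xi**2 + Yi**2 >= 2 * Xi * Yi - 1e-12 for Xi, Yi in zip(X, Y))

# summing gives  x.y <= ||x|| ||y||;  the same argument with -x gives
# -x.y <= ||x|| ||y||, and the two together yield the absolute value.
dot = sum(a * b for a, b in zip(x, y))
assert abs(dot) <= nx * ny + 1e-12
```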

PROOF2.

The Cauchy-Schwarz inequality clearly holds (with equality) whenever ${\bf x}={\bf 0}$, ${\bf y}={\bf 0}$, or ${\bf y}=\lambda{\bf x}$ for some scalar $\lambda$, so discard those cases. Then consider the polynomial (here $\cdot$ is the inner product) $$\displaylines{ P(\lambda ) = \left\| {{\bf x} - \lambda {\bf{y}}} \right\|^2 \cr = ({\bf x} - \lambda {\bf{y}})\cdot({\bf x} - \lambda {\bf{y}}) \cr = \left\| {\bf x} \right\|^2 - 2\lambda\, {\bf x} \cdot {\bf{y}} + \lambda^2 \left\| {\bf{y}} \right\|^2 \cr}$$

Since ${\bf x}\neq \lambda{\bf y}$ for any $\lambda \in \Bbb R$, we have $P(\lambda)>0$ for every $\lambda\in\Bbb R$. It follows that the discriminant is negative, that is, $$\Delta = b^2-4ac={\left( {-2\left( {{\bf x} \cdot {\bf y}} \right)} \right)^2} - 4{\left\| {\bf x} \right\|^2}{\left\| {\bf{y}} \right\|^2} <0$$ so that $$\displaylines{ {\left( {{\bf x}\cdot {\bf{y}}} \right)^2} <{\left\| {\bf x} \right\|^2}{\left\| {\bf{y}} \right\|^2} \cr \left| {{\bf x} \cdot {\bf{y}}} \right| <\left\| {\bf x}\right\| \cdot \left\| {\bf{y}} \right\| \cr} $$ which is Cauchy-Schwarz, with equality if and only if ${\bf x}=\lambda {\bf y}$ for some $0\neq \lambda \in\Bbb R$ or either vector is null.
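PROOF2 can also be checked numerically: for generic random vectors, the quadratic $P(\lambda)$ is nonnegative and its discriminant is nonpositive, which is exactly the squared form of Cauchy-Schwarz. This is only a sanity check of the algebra, not a proof:

```python
# Sanity check of PROOF2: P(lambda) = ||x - lambda y||^2 is a nonnegative
# quadratic in lambda, so its discriminant b^2 - 4ac is <= 0.
import random

random.seed(2)
n = 4
x = [random.uniform(-1, 1) for _ in range(n)]
y = [random.uniform(-1, 1) for _ in range(n)]

dot = sum(a * b for a, b in zip(x, y))
xx = sum(a * a for a in x)              # ||x||^2  (the constant term c)
yy = sum(b * b for b in y)              # ||y||^2  (the leading coefficient a)

# P(lambda) = xx - 2*lambda*dot + lambda^2*yy, so b = -2*dot
disc = (-2 * dot) ** 2 - 4 * yy * xx

assert disc <= 1e-12                    # nonpositive discriminant
assert dot ** 2 <= xx * yy + 1e-12      # (x.y)^2 <= ||x||^2 ||y||^2

# P(lambda) itself is nonnegative at sample points, as a norm squared must be
for lam in (-2.0, -0.5, 0.0, 0.5, 2.0):
    P = sum((a - lam * b) ** 2 for a, b in zip(x, y))
    assert P >= -1e-12
```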


One proof shows the Cauchy-Schwarz inequality is a direct consequence of the known fact that $x^2\geq 0$ for each real $x$. The other is shorter and sweeter: it uses the fact that a norm is always nonnegative, properties of the inner product of vectors in $\Bbb R^n$, plus the fact that a quadratic polynomial over $\Bbb R$ with no real roots must have negative discriminant.