Orthonormal sets, linear independence, and proofs.

Could you help me with this? It is in all likelihood a duplicate question, but the matter is not the solution itself; it is whether my solution is correct. Whenever I am proving something in linear algebra, I always have the feeling that either my proof is incorrect or that a more sophisticated proof exists.

For example, take the question: given a set of orthonormal vectors $\{x_1,\dots, x_n\}$, prove that they are linearly independent.

My proof: Suppose they are orthonormal but linearly dependent. Then $\langle x_i, x_j\rangle = \delta_{ij}$, and $a_1x_1 +\dots + a_n x_n = 0$ where not all the $a_i$ are zero, say $a_j \neq 0$.

Then $$(-a_1/a_j)\, x_1+\dots+(-a_{j-1}/a_j)\, x_{j-1} +(-a_{j+1}/a_j)\, x_{j+1} +\dots+(-a_n/a_j)\, x_n = x_j.$$ Now, assuming $j \neq n$, $$\langle x_j, x_n\rangle =(-a_1/a_j) \langle x_1,x_n\rangle +\dots+(-a_n/a_j)\langle x_n, x_n\rangle = -a_n/a_j.$$ But $\langle x_j, x_n\rangle = 0$ by orthonormality, so $a_n = 0$; the same computation with any $x_m$, $m \neq j$, gives $a_m = 0$. The dependence relation then reduces to $a_j x_j = 0$ with $a_j \neq 0$, so $x_j = 0$, contradicting $\|x_j\| = 1$. Hence the set is independent.
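For concreteness, here is how the same computation plays out in a minimal case of my own, $n = 2$ with $a_2 \neq 0$: from $a_1 x_1 + a_2 x_2 = 0$ we get $x_2 = (-a_1/a_2)\, x_1$, and then $$0 = \langle x_2, x_1\rangle = (-a_1/a_2)\,\langle x_1, x_1\rangle = -a_1/a_2,$$ so $a_1 = 0$, leaving $a_2 x_2 = 0$ with $a_2 \neq 0$, i.e. $x_2 = 0$, contradicting $\|x_2\| = 1$.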

My questions: is this proof incorrect, and do more sophisticated proofs exist? And how do I master proofs in general?


Solution 1:

Let $(V, \langle\cdot,\cdot\rangle)$ be an inner product space.

Suppose $S= \{e_1 ,e_2,\dots,e_n\}$ is an orthonormal set of vectors in $V$, i.e. $\langle e_i, e_j\rangle = \delta_{ij}$.

Claim: $S\subset V$ is linearly independent.

To prove that a set of vectors is linearly independent, we must show that the only linear combination of them equal to the zero vector is the trivial one (no nontrivial combination exists).
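For example (a small illustration of the definition): in $\mathbb{R}^2$ the set $\{(1,0),\,(1,1)\}$ is linearly independent, because $c_1(1,0) + c_2(1,1) = (c_1 + c_2,\, c_2) = (0,0)$ forces $c_2 = 0$ and then $c_1 = 0$.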

Let $$0=\sum_{k=1}^{n} c_k e_k.$$

\begin{align} 0=\Big\|\sum_{k=1}^{n} c_k e_k\Big\|^2 &=\Big\langle \sum_{k=1}^{n} c_k e_k,\ \sum_{l=1}^{n} c_l e_l\Big\rangle\\ &= \sum_{k=1}^{n}\sum_{l=1}^{n} c_k \overline{c_l}\,\langle e_k, e_l\rangle\\ &= \sum_{k=1}^{n} |c_k|^2, \end{align}

since $\langle e_k, e_l\rangle = \delta_{kl}$.

Hence, $c_k = 0$ for all $k\in\{1,\dots,n\}$, and $S$ is linearly independent.
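Alternatively, a shorter route to the same conclusion: take the inner product of $0=\sum_{k=1}^{n} c_k e_k$ with a fixed $e_j$ and use linearity in the first slot together with orthonormality: $$0 = \Big\langle \sum_{k=1}^{n} c_k e_k,\ e_j \Big\rangle = \sum_{k=1}^{n} c_k \langle e_k, e_j\rangle = c_j,$$ so each $c_j$ vanishes directly.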