Minimum of a quadratic form
If $\mathbf{A}$ is a real symmetric matrix, we know that it has orthogonal eigenvectors. Now, say we want to find a unit vector $\mathbf{n}$ that minimizes the form: $$\mathbf{n}^T\mathbf{A}\,\mathbf{n}$$ How can one prove that this vector is given by the eigenvector corresponding to the minimum eigenvalue of $\mathbf{A}$?
I have a proof of my own, but it's rather inelegant - how would you go about proving this?
Solution 1:
Proof that, over unit vectors $n$, the minimum value of the quadratic form $n^T A n$ is the minimum eigenvalue of the real symmetric matrix $A$:
Let $A=UDU^T$ be an eigendecomposition of $A$. Then $D$ is a diagonal matrix with the eigenvalues as diagonal entries; write $D_{ii} = \lambda_i$. Then we have \begin{align} \min_{|n|=1}n^TAn&=\min_{|n|=1}n^T(UDU^T)n\\&=\min_{|n|=1}(U^Tn)^TD(U^Tn)\\&=\min_{|U^Tn|=1}(U^Tn)^TD(U^Tn)\tag{1}\\&=\min_{|y|=1}y^TDy,\quad y:=U^Tn\\&=\min_{|y|=1}\sum_{i}y_i^2D_{ii}\\&=\min_{|y|=1}\sum_{i}y_i^2\lambda_{i}\end{align}
Let $\lambda_j := \min\{\lambda_1, \dots, \lambda_n\} = \min_i \lambda_i$ for some index $j \in \{1, \dots, n\}$.
Observe the following lower bound for the quadratic form
$$\sum_{i}y_i^2 \lambda_{i} \ge \sum_{i}y_i^2 \lambda_j = \lambda_j \sum_{i}y_i^2 = \lambda_j (1) = \lambda_j \tag{2}$$
Since the RHS of $(2)$ does not depend on $y$, this shows that $\lambda_j$ is a lower bound: for every unit vector $y$, the quadratic form is at least $\lambda_j$.
This lower bound is attained, so it is in fact the minimum: take $y$ with $y_j=1$ and $y_i=0$ for all $i \ne j$; then $\sum_i y_i^2\lambda_i = \lambda_j$.
QED
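A quick numerical sanity check of this result (a sketch using NumPy, not part of the proof): for a random symmetric matrix, the quadratic form at the eigenvector for the smallest eigenvalue equals that eigenvalue, and no random unit vector does better.

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random real symmetric matrix A.
M = rng.standard_normal((5, 5))
A = (M + M.T) / 2

# For symmetric matrices, eigh returns eigenvalues in ascending order.
eigvals, eigvecs = np.linalg.eigh(A)
lam_min = eigvals[0]
n_min = eigvecs[:, 0]  # unit eigenvector for the smallest eigenvalue

# The quadratic form at that eigenvector equals the smallest eigenvalue...
assert np.isclose(n_min @ A @ n_min, lam_min)

# ...and random unit vectors never beat it, matching the bound (2).
for _ in range(1000):
    v = rng.standard_normal(5)
    v /= np.linalg.norm(v)
    assert v @ A @ v >= lam_min - 1e-12
```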
By the way, if $A^T=A^{-1}$ (i.e. $A$ is also orthogonal), then every eigenvalue satisfies $\lambda_i=\pm 1$, so $\lambda_j\in\{-1,1\}$; see: Can we prove that the eigenvalues of an $n\times n$ orthogonal matrix are $\pm 1$ by only using the definition of orthogonal matrix?
$(1)$ Observe that $|U^Tn|=1$ because $|n|=1$.
Pf: $$|U^Tn|^2=(U^Tn)^T (U^Tn) = n^T U U^T n = n^T (I) n = n^T n = |n|^2 = 1$$ QED
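The lemma can also be checked numerically (a small NumPy sketch): multiplying a unit vector by $U^T$ for orthogonal $U$ preserves the norm.

```python
import numpy as np

rng = np.random.default_rng(1)

# An orthogonal U from the QR decomposition of a random matrix.
U, _ = np.linalg.qr(rng.standard_normal((4, 4)))
n = rng.standard_normal(4)
n /= np.linalg.norm(n)  # unit vector

# U U^T = I, so multiplying by U^T preserves the Euclidean norm.
assert np.allclose(U @ U.T, np.eye(4))
assert np.isclose(np.linalg.norm(U.T @ n), 1.0)
```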
Solution 2:
Let $A=UDU^T$ be an eigendecomposition of $A$. Then $D$ is a diagonal matrix with the eigenvalues as diagonal entries. Then we have \begin{align} \min_{n^Tn=1}n^TAn&=\min_{u^Tu=1}u^TDu \quad \{u:=U^Tn\} \\ &=\min_{x\in\mathbb{S}}\sum_{i}x_iD_{ii} \quad \{x_i:=u_i^2\} \end{align} where $$\mathbb{S}=\{(x_1,\dots,x_N)\in\mathbb{R}^N\mid x_i\geq 0,~~\sum_{i}x_i=1\}$$ is the standard simplex. The last step minimizes a linear function over the simplex, which is attained at a vertex: \begin{align} \min_{x\in\mathbb{S}}\sum_{i}x_iD_{ii}=\min_{i}D_{ii} \end{align} and likewise note that \begin{align} \max_{x\in\mathbb{S}}\sum_{i}x_iD_{ii}=\max_{i}D_{ii} \end{align}
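The simplex step above can be illustrated numerically (a NumPy sketch): for any $x\in\mathbb{S}$, the linear function $\sum_i x_i D_{ii}$ stays between the smallest and largest $D_{ii}$, and the extremes are attained at the vertices $e_j$.

```python
import numpy as np

rng = np.random.default_rng(2)
d = rng.standard_normal(6)  # plays the role of the diagonal entries D_ii

# Sample points on the simplex S = {x_i >= 0, sum_i x_i = 1}
# via a Dirichlet distribution.
X = rng.dirichlet(np.ones(6), size=2000)
vals = X @ d  # sum_i x_i * D_ii for each sample

# Every value lies between min_i D_ii and max_i D_ii...
assert np.all(vals >= d.min() - 1e-12)
assert np.all(vals <= d.max() + 1e-12)

# ...and the extremes are attained at the vertices e_j of the simplex.
e = np.eye(6)
assert np.isclose(e[d.argmin()] @ d, d.min())
assert np.isclose(e[d.argmax()] @ d, d.max())
```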
Solution 3:
For a symmetric matrix $A$, there exist a diagonal matrix $D$ and an orthogonal matrix $S$ such that $S^{-1}AS=D$, where the diagonal entries of $D$ are the eigenvalues of $A$ and the columns of $S$ are the corresponding eigenvectors of $A$. So if we let $n$ be a unit eigenvector corresponding to the smallest eigenvalue $\lambda$ of $A$, then $n^TAn=\lambda$.
To push this proof further: any unit eigenvector $v$ with corresponding eigenvalue $\lambda$ satisfies
$$Av=\lambda v \Rightarrow v^TAv=\lambda v^Tv=\lambda$$
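This identity is easy to verify numerically as well (a NumPy sketch): for each unit eigenvector $v$ of a symmetric matrix, the quadratic form $v^TAv$ equals the corresponding eigenvalue.

```python
import numpy as np

rng = np.random.default_rng(3)
M = rng.standard_normal((4, 4))
A = (M + M.T) / 2  # random real symmetric matrix

# eigh returns unit-norm eigenvectors as the columns of eigvecs.
eigvals, eigvecs = np.linalg.eigh(A)
for lam, v in zip(eigvals, eigvecs.T):
    assert np.allclose(A @ v, lam * v)  # Av = lam v
    assert np.isclose(v @ A @ v, lam)   # hence v^T A v = lam
```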