Norm of a symmetric matrix?

Given a symmetric matrix, you have a complete set of real eigenvalues and an orthonormal basis of eigenvectors. Any vector can be written as a linear combination of those eigenvectors. Apply the matrix to an arbitrary unit vector expanded in that basis, and note that the length of the result is largest when the input vector lies along an eigenvector whose eigenvalue has the largest absolute value.
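
To spell out that last step: if $v_1,\dots,v_n$ are the orthonormal eigenvectors with eigenvalues $\lambda_1,\dots,\lambda_n$, and $x=\sum_i c_i v_i$ is a unit vector, so that $\sum_i c_i^2=1$, then

$$\|Ax\|^2=\Big\|\sum_i c_i\lambda_i v_i\Big\|^2=\sum_i \lambda_i^2 c_i^2\le\max_i\lambda_i^2,$$

with equality when all the weight is placed on an eigenvector attaining $\max_i|\lambda_i|$.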


Here is a simple explanation that does not rely purely on linear algebra. We have

$$\|A\|_2=\max_{\|x\|=1}\|Ax\|$$

where $\|\cdot\|$ is the ordinary Euclidean norm. This is a constrained optimisation problem with Lagrange function:

$$L(x,\lambda)=\|Ax\|^2-\lambda(\|x\|^2-1)=x'A^2x-\lambda(x'x-1)$$

Here I squared both the objective and the constraint, which changes nothing but makes the next step easier. Taking the derivative with respect to $x$ and setting it equal to zero, we get

$$A^2x-\lambda x=0$$

so the solution of this problem is an eigenvector of $A^2$. Since $A^2$ is symmetric, all its eigenvalues are real, and $x'A^2x$ attains its maximum on the set $\|x\|^2=1$ at the largest eigenvalue of $A^2$ (at the corresponding unit eigenvector, $x'A^2x$ equals that eigenvalue). Now, since $A$ is symmetric, it admits the representation

$$A=Q\Lambda Q'$$

with $Q$ an orthogonal matrix and $\Lambda$ diagonal with the eigenvalues of $A$ on its diagonal. For $A^2$ we get

$$A^2=Q\Lambda^2 Q'$$

so the eigenvalues of $A^2$ are the squares of the eigenvalues of $A$. The norm $\|A\|_2$ is the square root of the maximum of $x'A^2x$ over $x'x=1$, i.e. the square root of the largest eigenvalue of $A^2$, which is the largest absolute value of an eigenvalue of $A$.
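
As a quick numerical sanity check of this conclusion (a minimal sketch assuming NumPy is available; the matrix size and random seed are arbitrary choices of mine), one can compare the spectral norm of a random symmetric matrix with its largest absolute eigenvalue:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric matrix (the size is arbitrary for this check).
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2

# Spectral norm computed directly as the Euclidean operator norm.
spectral_norm = np.linalg.norm(A, 2)

# Largest absolute eigenvalue; eigvalsh applies because A is symmetric.
max_abs_eig = np.max(np.abs(np.linalg.eigvalsh(A)))

print(spectral_norm, max_abs_eig)   # the two numbers should agree
assert np.isclose(spectral_norm, max_abs_eig)
```

Here `np.linalg.norm(A, 2)` returns the matrix 2-norm (largest singular value), which for a symmetric matrix is exactly the quantity $\|A\|_2$ discussed above.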


Using the spectral theorem to obtain an orthogonal basis of eigenvectors for the matrix is probably the best approach, similar to what Ross said, given your criterion of "simple concepts of linear algebra". Here is another approach I thought worth mentioning.

The largest of the absolute values of the eigenvalues of a matrix $A$ is called the spectral radius of $A$, which I'll denote $r(A)$. In general one allows complex eigenvalues (in part because there are real matrices with no real eigenvalues, like $\begin{bmatrix} 0 & 1 \\ -1 & 0 \end{bmatrix}$), but fortunately it turns out that real symmetric matrices can have only real eigenvalues, so it makes no difference here. The inequality $r(A)\leq \|A\|$ is easy, because if $\lambda$ is an eigenvalue for $A$ and $v$ is a corresponding (nonzero) eigenvector, then $\|Av\|=\|\lambda v\| = |\lambda|\|v\|$, which shows that $\|A\|\geq |\lambda|$.
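
For a matrix that is *not* symmetric, the inequality $r(A)\leq\|A\|$ can be strict. A tiny numerical illustration (this particular matrix is my own example, not part of the answer above):

```python
import numpy as np

# A non-symmetric matrix whose spectral radius is strictly smaller
# than its operator norm: both eigenvalues are 0, yet the matrix
# still stretches some unit vectors.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])

spectral_radius = np.max(np.abs(np.linalg.eigvals(N)))   # 0.0
operator_norm = np.linalg.norm(N, 2)                     # 1.0

print(spectral_radius, operator_norm)
```

Both eigenvalues of $N$ are $0$, so $r(N)=0$, but $\|N\|=1$ because $N$ sends the unit vector $(0,1)$ to $(1,0)$.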

Remarkably, there is a formula for the spectral radius in terms of the norm, $r(A)=\lim_{k\to\infty}\|A^k\|^{1/k}$. I'll note before going on that everything I've said so far is independent of the norm on the space on which the matrices act (that is, independent of which operator norm you choose for the matrices). But now, assuming the Euclidean norm (which is implicit in your question), the matrix norm has the property that $\|A^2\|=\|A\|^2$ for a real symmetric matrix $A$ (one way to see this: $\|A\|^2=\sup_{\|x\|=1}\langle Ax,Ax\rangle=\sup_{\|x\|=1}\langle A^2x,x\rangle\leq\|A^2\|$ by Cauchy-Schwarz, while $\|A^2\|\leq\|A\|^2$ holds for any operator norm). Inductively you get $\|A^{2^k}\|=\|A\|^{2^k}$, and applying the spectral radius formula along this subsequence yields $r(A)=\lim_{k\to\infty}\|A^{2^k}\|^{1/2^k}=\lim_{k\to\infty}\|A\|=\|A\|$.
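
A short numerical check of both ingredients of this argument (again only a sketch assuming NumPy; the matrix size and seed are arbitrary):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                        # random real symmetric matrix

def opnorm(M):
    """Euclidean operator norm (largest singular value)."""
    return np.linalg.norm(M, 2)

# Property used above: ||A^2|| = ||A||^2 for symmetric A.
print(opnorm(A @ A), opnorm(A) ** 2)

# Spectral radius formula: ||A^k||^(1/k) -> r(A) as k grows.
# For a symmetric A the two sides already agree at every k.
r = np.max(np.abs(np.linalg.eigvalsh(A)))
for k in (1, 2, 4, 8):
    print(k, opnorm(np.linalg.matrix_power(A, k)) ** (1.0 / k), r)
```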