The norm of a matrix is defined as \begin{equation} \|A\| = \sup_{\|u\| = 1} \|Au\| \end{equation} Taking the singular value decomposition of the matrix $A$, we have \begin{equation} A = VD W^T \end{equation} where $V$ and $W$ are orthogonal and $D$ is a diagonal matrix. Since $V$ and $W$ are orthogonal, we have $\|V\| = 1$ and $\|W\| = 1$. Then $\|Av\| = \|D v\|$ for any vector $v$, so we can maximize the norm of $Av$ by maximizing the norm of $Dv$.
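
As a quick numerical illustration of this definition (a sketch assuming NumPy; the variable names are mine, not from the answer), one can approximate the supremum by sampling many random unit vectors and compare against NumPy's built-in operator 2-norm:

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))

# Sample many random unit vectors u and record ||Au||.
us = rng.standard_normal((4, 100_000))
us /= np.linalg.norm(us, axis=0)               # normalize each column to a unit vector
sup_estimate = np.linalg.norm(A @ us, axis=0).max()

# np.linalg.norm(A, 2) is the operator 2-norm (the largest singular value).
print(sup_estimate, np.linalg.norm(A, 2))      # the estimate approaches the norm from below
```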

By the definition of the singular value decomposition, $D$ will have the singular values of $A$ on its main diagonal and zeros everywhere else. Let $\lambda_1, \ldots, \lambda_n$ denote these diagonal entries, so that

\begin{equation} D = \left(\begin{array}{cccc} \lambda_1 & 0 & \ldots & 0 \\ 0 & \lambda_2 & \ldots & 0 \\ \vdots & & \ddots & \vdots \\ 0 & 0 & \ldots & \lambda_n \end{array}\right) \end{equation}

Taking some $v = (v_1, v_2, \ldots, v_n)^T$, the product $Dv$ takes the form \begin{equation} Dv = \left(\begin{array}{c} \lambda_1v_1 \\ \vdots \\ \lambda_nv_n \end{array}\right) \end{equation} Maximizing the norm of this is the same as maximizing the norm squared. Then we are trying to maximize the sum \begin{equation} S = \sum_{i=1}^{n} \lambda_i^2v_i^2 \end{equation} under the constraint that $v$ is a unit vector (i.e., $\sum_i v_i^2 = 1$). The maximum is attained by finding the largest $\lambda_i^2$, setting its corresponding $v_i$ to $1$, and setting every other $v_j$ to $0$. Then the maximum of $S$ (which is the norm squared) is the square of the largest singular value of $A$. Taking the square root, we get that $\|A\|$ is the largest singular value of $A$; when $A$ is symmetric, this equals the absolute value of its largest-magnitude eigenvalue.
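
To see this conclusion numerically, here is a hedged sketch (assuming NumPy and a symmetric test matrix, so that singular values and absolute eigenvalues coincide):

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2                                  # symmetrize so eigenvalues are real

sigma_max = np.linalg.svd(A, compute_uv=False)[0]  # largest singular value
lam_max = np.abs(np.linalg.eigvalsh(A)).max()      # absolutely largest eigenvalue

# All three agree for a symmetric matrix.
print(np.linalg.norm(A, 2), sigma_max, lam_max)
```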


But the key point is exactly that your matrix is diagonalizable, or more specifically that you can find an orthonormal basis of eigenvectors $e_i$ for which $A e_i=\lambda_i e_i$.

Then, writing $x=\sum x_ie_i$, you have $Ax=\sum x_i\lambda_i e_i$, so that $\|Ax\|^2=\sum\lambda_i^2x_i^2\leq\lambda_{i_0}^2\sum x_i^2=\lambda_{i_0}^2\|x\|^2$, where $\lambda_{i_0}$ is the eigenvalue of greatest absolute value. By the definition of the matrix norm, this gives $$\frac{\|Ax\|}{\|x\|}\leq|\lambda_{i_0}|,$$ and therefore $\|A\|\leq|\lambda_{i_0}|$. Finally, the identity $\|Ae_{i_0}\|=\|\lambda_{i_0}e_{i_0}\|=|\lambda_{i_0}|$ gives you the reverse inequality $\|A\|\geq|\lambda_{i_0}|$.
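
Both inequalities are easy to check numerically; here is a small sketch (assuming NumPy, with a symmetric $A$ so that `np.linalg.eigh` provides the orthonormal eigenbasis used above):

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                              # symmetric => orthonormal eigenbasis exists

vals, vecs = np.linalg.eigh(A)                 # eigenvalues and eigenvector columns
i0 = np.abs(vals).argmax()                     # index of lambda_{i_0}

# ||Ax|| / ||x|| <= |lambda_{i_0}| for an arbitrary x ...
x = rng.standard_normal(4)
print(np.linalg.norm(A @ x) / np.linalg.norm(x), np.abs(vals[i0]))

# ... and the eigenvector e_{i_0} attains the bound.
e_i0 = vecs[:, i0]
print(np.linalg.norm(A @ e_i0))                # equals |lambda_{i_0}|
```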


I don't have enough reputation to comment, but I'd like to fix a mistake in @yoknapatawpha's answer. The overall idea is correct.

Take the singular value decomposition of $A$ as $$A=VDW^T$$ and then we are looking for $$\max_{\|u\|=1}\|Au\|=\max_{\|u\|=1}\|VDW^Tu\|.$$

Here $V$ and $W$ are both orthogonal matrices. Because $V$ is orthogonal, we have $$\|VDW^Tu\|=\|DW^Tu\|.$$ Now denote $$x=W^Tu;$$ because $W$ is orthogonal, we have $$\|x\|=\|W^Tu\|=\|u\|=1,$$ and there is also a one-to-one mapping between $x$ and $u$. Thus the problem is equivalent to $$\max_{\|x\|=1} \|Dx\|,$$ and the rest of the answer follows @yoknapatawpha's.
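
The two facts used here, namely that multiplying by an orthogonal matrix preserves norms, can also be checked numerically. A sketch assuming NumPy, with random orthogonal matrices drawn via a QR decomposition:

```python
import numpy as np

rng = np.random.default_rng(3)
V, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # random orthogonal V
W, _ = np.linalg.qr(rng.standard_normal((3, 3)))   # random orthogonal W
D = np.diag([3.0, 2.0, 1.0])

u = rng.standard_normal(3)
u /= np.linalg.norm(u)                             # unit vector

print(np.linalg.norm(V @ D @ W.T @ u),             # ||V D W^T u|| ...
      np.linalg.norm(D @ W.T @ u))                 # ... equals ||D W^T u||
print(np.linalg.norm(W.T @ u))                     # ||W^T u|| = ||u|| = 1
```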

However, in general it is not right to say that $\|Av\| = \|D v\|$ holds for every $v$. For example, when $$V=\left[\begin{matrix}1& 0\\0 & 1\end{matrix}\right], \quad D=\left[\begin{matrix}2& 0\\0 & 1\end{matrix}\right], \quad W=\left[\begin{matrix}0& 1\\1 & 0\end{matrix}\right], \quad v=\left[\begin{matrix}1\\0\end{matrix}\right],$$ you can check that $\|Av\|=1$ but $\|Dv\|=2$.
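
The counterexample is easy to confirm numerically (a sketch assuming NumPy):

```python
import numpy as np

V = np.eye(2)
D = np.diag([2.0, 1.0])
W = np.array([[0.0, 1.0], [1.0, 0.0]])
v = np.array([1.0, 0.0])

A = V @ D @ W.T
print(np.linalg.norm(A @ v))   # 1.0
print(np.linalg.norm(D @ v))   # 2.0 -- so ||Av|| != ||Dv|| in general
```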