Expressing a matrix as an expansion of its eigenvalues
The proof using $AU = U\Lambda$ is not tedious. Since $U$ is orthogonal, you have $U^{-1} = U^T$, so $A = U \Lambda U^T$. Then $$Ax = U \Lambda U^T x = U \Lambda \begin{bmatrix} u_1^T x \\ \vdots \\ u_n^T x \end{bmatrix} = U \begin{bmatrix} \lambda_1 u_1^T x \\ \vdots \\ \lambda_n u_n^T x \end{bmatrix} = \sum_k (\lambda_k u_k^T x) u_k = \sum_k \lambda_k u_k u_k^T x = \left(\sum_k \lambda_k u_k u_k^T\right)x$$
Hence $A=\sum_k \lambda_k u_k u_k^T$.
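A quick NumPy sanity check of this expansion (the symmetric matrix $A$ below is my own example, not from the question):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # arbitrary symmetric example

# eigh returns the eigenvalues and an orthogonal U whose columns are eigenvectors
eigvals, U = np.linalg.eigh(A)

# Rebuild A as the sum of rank-one terms lambda_k * u_k u_k^T
A_rebuilt = sum(lam * np.outer(u, u) for lam, u in zip(eigvals, U.T))

assert np.allclose(A, A_rebuilt)
```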
Assuming $A$ is invertible (equivalently, that all $\lambda_k \neq 0$), inverting both sides of $AU = U\Lambda$ gives $U^T A^{-1} = \Lambda^{-1} U^T$, and hence $A^{-1} = U\Lambda^{-1} U^T$. Applying the above result to $A^{-1}$, and noting that $\Lambda^{-1}$ is just the diagonal matrix of the reciprocals of the diagonal elements of $\Lambda$, we have $A^{-1} = \sum_k \frac{1}{\lambda_k} u_k u_k^T$.
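The same kind of check works for the inverse (again, the example matrix is my own; both of its eigenvalues are nonzero, so it is invertible):

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 3.0]])  # symmetric and invertible

eigvals, U = np.linalg.eigh(A)

# Same rank-one expansion, with 1/lambda_k in place of lambda_k
A_inv = sum((1.0 / lam) * np.outer(u, u) for lam, u in zip(eigvals, U.T))

assert np.allclose(A_inv, np.linalg.inv(A))
```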
To address your other question: the same result holds for Hermitian matrices ($A^* = A$), with the proviso that $U$ will be unitary rather than orthogonal (i.e., it may be complex).
A normal matrix ($A A^* = A^* A$) can also be expressed as above, except that the eigenvalues (and, of course, the eigenvectors) may be complex.
The matrix $\begin{bmatrix} 0 & 1 \\ 0 & 0 \end{bmatrix} $ is real but not symmetric, and it does not have a basis of eigenvectors (hence it cannot be expressed as above).
The matrix $\begin{bmatrix} 0 & i \\ i & 0 \end{bmatrix} $ is symmetric but not real (it is normal). It can be unitarily diagonalized, but the eigenvalues and eigenvectors are complex.
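Both counterexamples are easy to probe numerically (a sketch, not part of the original answer):

```python
import numpy as np

# Real but not symmetric, and defective: eigenvectors do not span R^2.
N = np.array([[0.0, 1.0],
              [0.0, 0.0]])
nvals, nvecs = np.linalg.eig(N)
# Both eigenvalues are 0 and the two returned eigenvectors are (numerically)
# parallel, so the eigenvector matrix is singular: no eigenbasis exists.
assert abs(np.linalg.det(nvecs)) < 1e-8

# Complex symmetric (and normal): unitarily diagonalizable, complex eigenvalues.
S = np.array([[0, 1j],
              [1j, 0]])
svals, V = np.linalg.eig(S)  # the eigenvalues are +i and -i
# S is normal with distinct eigenvalues, so V is unitary and S = V diag(svals) V^*
assert np.allclose(V @ np.diag(svals) @ V.conj().T, S)
```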
The official solution (from the Tutor's edition of the solutions manual) is as follows (the matrix is called $\Sigma$ instead of $A$):
We can rewrite the r.h.s. of (2.48) in matrix form as $$\sum_{i=1}^D \lambda_i \mathbf{u}_i \mathbf{u}_i^T = \mathbf{U} \mathbf{\Lambda} \mathbf{U}^T = \mathbf{M}$$
where $\mathbf{U}$ is a $D \times D$ matrix with the eigenvectors $\mathbf{u}_1, \dots, \mathbf{u}_D$ as its columns and $\mathbf{\Lambda}$ is a diagonal matrix with the eigenvalues $\lambda_1, \dots, \lambda_D$ along its diagonal.
Thus we have
$$ \mathbf{U}^T \mathbf{M} \mathbf{U} = \mathbf{U}^T \mathbf{U} \mathbf{\Lambda} \mathbf{U}^T \mathbf{U} = \mathbf{\Lambda}$$
However, from (2.45)-(2.47), we also have that
$$ \mathbf{U}^T \mathbf{\Sigma} \mathbf{U} = \mathbf{U}^T \mathbf{U} \mathbf{\Lambda} = \mathbf{\Lambda}$$
and so, multiplying on the left by $\mathbf{U}$ and on the right by $\mathbf{U}^T$, $\mathbf{M} = \mathbf{\Sigma}$ and (2.48) holds.
Moreover, since $\mathbf{U}$ is orthogonal, $\mathbf{U}^{-1} = \mathbf{U}^T$ and so:
$$ \mathbf{\Sigma}^{-1} = (\mathbf{U} \mathbf{\Lambda} \mathbf{U}^T)^{-1} = (\mathbf{U}^T)^{-1} \mathbf{\Lambda}^{-1} \mathbf{U}^{-1} = \mathbf{U} \mathbf{\Lambda}^{-1} \mathbf{U}^T = \sum_{i=1}^D \dfrac{1}{\lambda_i} \mathbf{u}_i \mathbf{u}_i^T $$
Where (2.45) is
$$ \mathbf{\Sigma} \mathbf{u}_i = \lambda_i \mathbf{u}_i $$
so that $ \mathbf{\Sigma} \mathbf{U} = \mathbf{U} \mathbf{\Lambda} $ is the eigenvector equation in matrix form (note that $\mathbf{\Lambda}$ multiplies on the right, so that each column $\mathbf{u}_i$ is scaled by its own $\lambda_i$).
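The whole chain can be verified numerically on a symmetric positive-definite "covariance" matrix (the matrix $\Sigma$ below is randomly generated for illustration, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)
B = rng.standard_normal((3, 3))
Sigma = B @ B.T + 3.0 * np.eye(3)   # symmetric positive definite, hence invertible

eigvals, U = np.linalg.eigh(Sigma)  # Sigma U = U Lambda, columns of U orthonormal
Lam = np.diag(eigvals)

# U^T Sigma U = Lambda: U diagonalizes Sigma
assert np.allclose(U.T @ Sigma @ U, Lam)

# Inverse expansion: Sigma^{-1} = sum_i (1/lambda_i) u_i u_i^T
Sigma_inv = sum((1.0 / lam) * np.outer(u, u) for lam, u in zip(eigvals, U.T))
assert np.allclose(Sigma_inv @ Sigma, np.eye(3))
```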