Showing that $x^{\top}Ax$ is maximized at $\max \lambda(A)$ over unit vectors $x$, for symmetric $A$
There is a cleaner proof altogether that circumvents the need to consider the square root of $A$. Consider the spectral decomposition $A = V\Lambda V^{\top}$, where $V$ is orthogonal and $\Lambda = \operatorname{diag}(\lambda_1, \dots, \lambda_n)$. For a unit vector $x \in \mathbb{R}^n$, define $\tilde{x} = V^{\top}x$; since $V$ is orthogonal, $\|\tilde{x}\| = \|x\| = 1$. Then,
$$ x^{\top}Ax = x^{\top}V\Lambda V^{\top}x = \tilde{x}^{\top}\Lambda \tilde{x} = \sum_{i=1}^n \lambda_i \tilde{x}_i^2. $$
Clearly,
$$ \min \lambda(A) \sum_{i=1}^n \tilde{x}_i^2 \leq \sum_{i=1}^n \lambda_i \tilde{x}_i^2 \leq \max \lambda(A)\sum_{i=1}^n \tilde{x}_i^2 $$
and since $\sum_{i=1}^n \tilde{x}_i^2 = \|\tilde{x}\|^2 = 1$, it follows that
$$ \min \lambda(A) \leq x^{\top}Ax \leq \max \lambda(A). $$
(Note that since $V$ is an orthogonal matrix, every unit vector $x$ can be written as $x = V\tilde{x}$ for some unit vector $\tilde{x}$, and so the above argument applies to any unit $x \in \mathbb{R}^n$.)
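The bounds above are easy to sanity-check numerically. The following sketch (using numpy; the matrix size, seed, and tolerance are arbitrary choices) draws a random symmetric matrix and verifies that the Rayleigh quotient $x^{\top}Ax$ of every random unit vector lies between the smallest and largest eigenvalues:

```python
import numpy as np

rng = np.random.default_rng(0)

# Build a random symmetric matrix A = (B + B^T)/2
B = rng.standard_normal((5, 5))
A = (B + B.T) / 2

# eigvalsh returns eigenvalues of a symmetric matrix in ascending order
eigvals = np.linalg.eigvalsh(A)
lam_min, lam_max = eigvals[0], eigvals[-1]

# Check min λ(A) ≤ x^T A x ≤ max λ(A) for many random unit vectors
for _ in range(1000):
    x = rng.standard_normal(5)
    x /= np.linalg.norm(x)       # normalize so that ||x|| = 1
    q = x @ A @ x                # the quadratic form x^T A x
    assert lam_min - 1e-10 <= q <= lam_max + 1e-10
```

The small tolerance only absorbs floating-point rounding; the mathematical inequality is exact.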
Yes and yes.
The fix would be to add a multiple of the identity so that $A+cI$ is positive definite, since this just shifts both sides of the claimed inequality by $c$. (I didn't check your proof in detail; it seems you're missing square roots where it says $\lambda(A^\top A)$?)
The more concise proof would be to write
$$ x^\top Ax=x^\top V\Lambda V^\top x=(V^\top x)^\top\Lambda (V^\top x)=\sum_i\lambda_i(V^\top x)_i^2 $$
and note that since $V^\top$ is orthogonal, $V^\top x$ runs through all unit vectors as $x$ does, so the sum is clearly maximized by taking $(V^\top x)_i=\delta_{i,i_\text{max}}$, where $i_\text{max}$ indexes the largest eigenvalue, i.e. by choosing $x$ to be the corresponding eigenvector.
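The maximizer identified above can also be checked numerically. This sketch (again with numpy; matrix size and seed are arbitrary) confirms that evaluating the quadratic form at the eigenvector of the largest eigenvalue attains exactly $\max \lambda(A)$:

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4))
A = (B + B.T) / 2                # random symmetric matrix

# eigh returns ascending eigenvalues and orthonormal eigenvectors as columns of V
eigvals, V = np.linalg.eigh(A)
v_max = V[:, -1]                 # unit eigenvector for the largest eigenvalue

# The Rayleigh quotient at v_max equals the largest eigenvalue
assert np.isclose(v_max @ A @ v_max, eigvals[-1])
```

Choosing $x = v_{\max}$ makes $V^\top x$ the standard basis vector $e_{i_\text{max}}$, which is exactly the $\delta_{i,i_\text{max}}$ configuration from the argument above.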