Are the eigenvectors of a real symmetric matrix always an orthonormal basis without change?
I was reading the Wikipedia page for symmetric matrices, and I noticed this part:
a real $n\times n$ matrix $A$ is symmetric if and only if there is an orthonormal basis of $\mathbb{R}^n$ consisting of eigenvectors for $A$
Does this mean that the eigenvectors of a real symmetric matrix always form an orthonormal basis, i.e. that without changing them at all they are always orthogonal and always have norm $1$?
Or does it mean that, starting from the eigenvectors, we can manipulate them (e.g. divide each one by its norm) and turn them into vectors of norm $1$?
There is no such thing as "the" eigenvectors of a matrix. That's why the statement in Wikipedia says "there is" an orthonormal basis...
What is uniquely determined are the eigenspaces. But you can make different choices of eigenvectors from the eigenspaces and make them orthogonal or not (and of course you can go in and out of "orthonormal" by multiplying by scalars). In the special case where all the eigenvalues are different (i.e. all multiplicities are $1$), any set of eigenvectors corresponding to different eigenvalues will be orthogonal.
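Here is a minimal NumPy sketch illustrating this (the matrix $A$ is an arbitrary symmetric example of mine, not from the question):

```python
import numpy as np

# An arbitrary real symmetric matrix with three distinct eigenvalues.
A = np.array([[2.0, 1.0, 0.0],
              [1.0, 3.0, 1.0],
              [0.0, 1.0, 4.0]])

# np.linalg.eigh is meant for symmetric/Hermitian matrices; it returns
# eigenvalues w and a matrix V whose columns are orthonormal eigenvectors.
w, V = np.linalg.eigh(A)
print(np.allclose(V.T @ V, np.eye(3)))   # True: this particular choice is orthonormal

# Scaling an eigenvector keeps it an eigenvector but changes its norm,
# so "orthonormal" is a property of the choice, not of eigenvectors as such.
v = 5 * V[:, 0]
print(np.allclose(A @ v, w[0] * v))      # True: still an eigenvector
print(np.linalg.norm(v))                 # 5.0, no longer unit length
```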
As a side note, there is a small language issue that appears often: matrices have eigenvalues, but to talk about eigenvectors you are viewing your matrix as a linear operator on a vector space (and that is of course where the notion of eigenvalue comes from).
To see a concrete example, consider the matrix $$ \begin{bmatrix}1&0&0\\ 0&0&0\\ 0&0&0\end{bmatrix} $$ The orthonormal basis the Wikipedia article is talking about is $\begin{bmatrix}1\\0\\0\end{bmatrix}$, $\begin{bmatrix}0\\1\\0\end{bmatrix}$, $\begin{bmatrix}0\\0\\1\end{bmatrix}$.
But since the eigenvalue $0$ has multiplicity $2$, we can choose a different basis for its eigenspace, and then $\begin{bmatrix}1\\0\\0\end{bmatrix}$, $\begin{bmatrix}0\\1\\1\end{bmatrix}$, $\begin{bmatrix}0\\2\\1\end{bmatrix}$ is another (not orthogonal) basis of eigenvectors.
Finally, if you don't restrict yourself to a basis, there are infinitely many eigenvectors: for instance all vectors of the form $\begin{bmatrix}t\\0\\0\end{bmatrix}$, for any $t\neq 0$, are eigenvectors. And all vectors $\begin{bmatrix}0\\t\\s\end{bmatrix}$, with $t$ and $s$ not both zero, are eigenvectors.
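A short NumPy check of this example (just a sketch; the names u1, u2, u3 are mine):

```python
import numpy as np

A = np.diag([1.0, 0.0, 0.0])

# The alternative basis of eigenvectors from above.
u1 = np.array([1.0, 0.0, 0.0])   # eigenvector for eigenvalue 1
u2 = np.array([0.0, 1.0, 1.0])   # eigenvector for eigenvalue 0
u3 = np.array([0.0, 2.0, 1.0])   # eigenvector for eigenvalue 0

# They are all eigenvectors ...
print(np.allclose(A @ u1, 1.0 * u1),
      np.allclose(A @ u2, 0.0 * u2),
      np.allclose(A @ u3, 0.0 * u3))                          # True True True

# ... and linearly independent, hence a basis of R^3 ...
print(np.linalg.matrix_rank(np.column_stack([u1, u2, u3])))   # 3

# ... but not an orthogonal basis: u2 and u3 are not orthogonal.
print(np.dot(u2, u3))                                         # 3.0, not 0
```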
There are two different things that might go wrong:
1) Eigenvectors can always be scaled: if $v$ is an eigenvector then so is $av$ for any $a\in k^\ast$. So the norm-one part is not automatic, but it can be forced easily, as you said yourself, by dividing each eigenvector by its length.
2) More importantly, linearly independent eigenvectors for the same eigenvalue do not need to be orthogonal. What is true, however, is that two eigenvectors for different eigenvalues of a symmetric matrix are orthogonal. So if each eigenvalue has multiplicity one, a basis of eigenvectors is automatically orthogonal (and can be made orthonormal as above). In general we first need to find an orthogonal basis of each eigenspace, e.g. by Gram-Schmidt (a small numerical sketch of this step is given after the edit below).
Edit: Part two is illustrated in @Martin's answer. The eigenvectors for the eigenvalue $1$ are always orthogonal to the eigenvectors for the eigenvalue $0$. However, we can choose many different non-orthogonal bases of the eigenspace for $0$.
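As a sketch of the Gram-Schmidt step, here is how it looks numerically on @Martin's example, using NumPy's QR factorization (which orthonormalizes the columns, essentially Gram-Schmidt):

```python
import numpy as np

A = np.diag([1.0, 0.0, 0.0])                 # the matrix from @Martin's answer

# A non-orthogonal basis of the eigenspace for eigenvalue 0, as columns.
E0 = np.column_stack([[0.0, 1.0, 1.0],
                      [0.0, 2.0, 1.0]])

# The columns of Q form an orthonormal basis of the same column space,
# i.e. an orthonormal basis of the eigenspace.
Q, _ = np.linalg.qr(E0)

print(np.allclose(Q.T @ Q, np.eye(2)))       # True: orthonormal
print(np.allclose(A @ Q, np.zeros_like(Q)))  # True: still inside the 0-eigenspace
```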
No: if $v$ is an eigenvector then $\alpha v$ is also an eigenvector for any nonzero $\alpha\in F$.
In order for $v$ to be part of an orthonormal basis it must hold that $\|v\|=1$, but multiplying by $\alpha$ changes the norm of the vector (unless $|\alpha|=1$).
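A tiny sketch of this point (the matrix and vector are arbitrary examples of mine): dividing by the norm forces the length back to $1$ while keeping an eigenvector.

```python
import numpy as np

A = np.diag([1.0, 0.0, 0.0])
v = np.array([3.0, 0.0, 0.0])        # an eigenvector for eigenvalue 1

print(np.linalg.norm(v))             # 3.0, so v itself cannot belong to an orthonormal basis

u = v / np.linalg.norm(v)            # rescale to norm 1
print(np.allclose(A @ u, 1.0 * u))   # True: still an eigenvector
print(np.linalg.norm(u))             # 1.0
```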