Recovering eigenvectors from SVD
Your premise, that the singular values are the absolute values of the eigenvalues, only makes sense for normal matrices, e.g., Hermitian ones. Say $A\in\mathbb{C}^{n\times n}$ is normal; then there exists a unitary $U$ and a diagonal $\Lambda$ (with the eigenvalues on the diagonal) such that $A=U\Lambda U^*$. The SVD can be obtained simply as $$A=WSV^*=(UD_1^*)(D_1\Lambda D_2^*)(UD_2^*)^*,\tag{1}$$ where $D_1$ and $D_2$ are diagonal matrices such that $D_i^*D_i=I$ (that is, their diagonal entries have unit absolute value) and such that $D_1\Lambda D_2^*$ has a non-negative diagonal (for simplicity, I assume the eigen/singular values are distinct). Therefore, the singular vectors are scalar multiples of the eigenvectors.
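As a quick numerical check of (1), here is a NumPy sketch (the matrix, its size, and the seed are arbitrary choices of mine): for a random Hermitian (hence normal) matrix, the singular values are $|\lambda_i|$ and each left singular vector is a unit-modulus scalar multiple of an eigenvector.

```python
import numpy as np

rng = np.random.default_rng(0)
# Random Hermitian (hence normal) matrix with (almost surely) distinct |eigenvalues|.
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (B + B.conj().T) / 2

lam, U = np.linalg.eigh(A)       # eigenvalues ascending, orthonormal eigenvectors
W, s, Vh = np.linalg.svd(A)      # singular values descending

# Reorder the eigenpairs by |lambda| descending to match the SVD ordering.
order = np.argsort(-np.abs(lam))
lam, U = lam[order], U[:, order]

# Singular values are the absolute values of the eigenvalues.
assert np.allclose(s, np.abs(lam))

# Each left singular vector is a scalar multiple (the scalar being a diagonal
# entry of D_1, of unit modulus) of the corresponding eigenvector.
for i in range(4):
    c = W[:, i].conj() @ U[:, i]
    assert np.isclose(abs(c), 1.0)
    assert np.allclose(U[:, i], c * W[:, i])
```

Note that `eigh` returns the eigenvalues in ascending order while `svd` sorts singular values in descending order, hence the reordering step before comparing.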
Now suppose the SVD $A=WSV^*$ is given and let $w_i=We_i$ be the $i$th column of $W$. What happens if you compute the Rayleigh quotient $w_i^*Aw_i$? Using (1), we have $$ \begin{split} w_i^*Aw_i&=e_i^TW^*WSV^*We_i=e_i^T(D_1\Lambda D_2^*)(D_2U^*)(UD_1^*)e_i\\ &=e_i^TD_1\Lambda D_1^*e_i=e_i^TD_1^*D_1\Lambda e_i=e_i^T\Lambda e_i=\lambda_i, \end{split} $$ that is, the Rayleigh quotient associated with the $i$th left singular vector (the right one would work as well) gives you the eigenvalue $\lambda_i$. That is not surprising at all, since the eigenspace associated with the eigenvalue $\lambda_i$ is the same as the singular vector space associated with the singular value $|\lambda_i|$, and hence the Rayleigh quotients of singular vectors must equal the eigenvalues.
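The computation above can be checked numerically; here is a small NumPy sketch (again with an arbitrary Hermitian test matrix of my choosing). The Rayleigh quotients of the left singular vectors recover the eigenvalues with their signs, even though the singular values alone only give $|\lambda_i|$.

```python
import numpy as np

rng = np.random.default_rng(1)
B = rng.standard_normal((4, 4)) + 1j * rng.standard_normal((4, 4))
A = (B + B.conj().T) / 2                 # Hermitian, hence normal

W, s, Vh = np.linalg.svd(A)              # left singular vectors in columns of W
lam = np.sort(np.linalg.eigvalsh(A))     # real eigenvalues, ascending

# Rayleigh quotient w_i^* A w_i for each left singular vector; real for Hermitian A.
rq = np.array([(W[:, i].conj() @ A @ W[:, i]).real for i in range(4)])

# The Rayleigh quotients are exactly the eigenvalues (compare as sorted sets).
assert np.allclose(np.sort(rq), lam)
# Their absolute values are the singular values.
assert np.allclose(np.sort(np.abs(rq)), np.sort(s))
```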
This of course holds only for normal matrices, not for general ones.
Here is the question I believe Mhenni is asking; it is more nuanced than the previous discussion acknowledges. I give an answer in the form of a proof.
Theorem [Mhenni]: A symmetric matrix $R$ has the property that its left singular vectors equal its eigenvectors, and its right singular vectors equal the eigenvectors up to right-multiplication by a diagonal matrix $T$ whose diagonal entries are $\pm 1$, corresponding to the signs of the eigenvalues of $R$.
Proof: Let $R=U\Lambda U^T$, with $UU^T=I$ and $\Lambda=\operatorname{diag}(\lambda_1,\dots,\lambda_N)$, be the eigendecomposition of an $N\times N$ symmetric matrix $R$, possibly sign-indefinite. Note first that $R$ shares its eigenvectors with $R-\lambda_{\min}I$, since subtracting a multiple of the identity shifts the eigenvalues but leaves the eigenvectors unchanged; the shifted matrix has non-negative eigenvalues, so its eigendecomposition is already an SVD, and its left singular vectors equal the eigenvectors of $R$. Now, eigenvectors are not unique: if $U$ is a matrix of orthonormal eigenvectors of $R$, then so is $UT$ for any diagonal $T$ with entries $\pm 1$, since flipping the sign of an eigenvector leaves it an eigenvector and $(UT)(UT)^T=UTT^TU^T=I$. (By contrast, $TU$ is in general not a matrix of eigenvectors, since left-multiplication by $T$ rescales rows rather than columns.) Take $T=\operatorname{diag}(\operatorname{sign}(\lambda_i))$ and set the right singular vectors to $V=UT$, so that $S=\Lambda T=|\Lambda|$ is non-negative. Then, using $T^2=I$, $$USV^T=U(\Lambda T)(UT)^T=U\Lambda TT U^T=U\Lambda U^T=R,$$ which is an SVD of $R$ with left singular vectors $U$ and right singular vectors $UT$. The result follows.
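The theorem can be verified numerically; the following NumPy sketch (with a random symmetric indefinite matrix of my choosing) checks both that $U\,|\Lambda|\,(UT)^T$ reconstructs $R$ and that the computed SVD factors agree with the eigenvectors up to the predicted signs.

```python
import numpy as np

rng = np.random.default_rng(2)
B = rng.standard_normal((5, 5))
R = (B + B.T) / 2                    # symmetric, generally sign-indefinite

lam, U = np.linalg.eigh(R)
order = np.argsort(-np.abs(lam))     # match the SVD's descending singular values
lam, U = lam[order], U[:, order]

T = np.diag(np.sign(lam))            # diagonal of +-1, the signs of the eigenvalues
S = np.diag(np.abs(lam))             # S = Lambda T is non-negative

# R = U S (U T)^T is a valid SVD: S non-negative, U and UT orthogonal.
assert np.allclose(U @ S @ (U @ T).T, R)

W, s, Vh = np.linalg.svd(R)
assert np.allclose(s, np.abs(lam))
# Each left singular vector matches an eigenvector up to sign c = +-1,
# and the right singular vector picks up the extra factor sign(lambda_i).
for i in range(5):
    c = W[:, i] @ U[:, i]
    assert np.isclose(abs(c), 1.0)
    assert np.allclose(Vh[i], c * np.sign(lam[i]) * U[:, i])
```

The per-column sign `c` appears because an SVD routine is free to flip the sign of each singular-vector pair; the theorem's statement is about the factors up to exactly this ambiguity.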
Cheers, Allan Steinhardt, Chief Scientist, Booz Allen Hamilton