Solution 1:

There seem to be several points of confusion here, so let me set your questions aside for a moment and start from the top.

The eigenvalues of a matrix $A$ are defined as the set of values $\lambda$ for which the matrix $A-\lambda I$ is singular. Put another way, the eigenvalues of $A$ are the values $\lambda$ for which $$p(\lambda) = \det(A-\lambda I) = 0$$ The expression $\det(A-\lambda I)$ is called the characteristic polynomial of $A$, and the eigenvalues are defined to be the roots of this polynomial. In general, the characteristic polynomial of an $n\times n$ matrix is a degree-$n$ polynomial, which means that it has at most $n$ distinct roots (exactly $n$ if you count them with multiplicity over the complex numbers). The set of eigenvalues is what we call the spectrum of $A$. The spectrum is exactly the set of values that appears on the diagonal of your diagonal matrix, and these values are uniquely determined, but only up to the order in which they appear.
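As a concrete illustration (not part of the original answer), here is a minimal NumPy sketch comparing the roots of the characteristic polynomial with the eigenvalues reported by `numpy.linalg.eigvals`; the matrix `A` is just an arbitrary example.

```python
import numpy as np

# An arbitrary 2x2 example matrix (chosen only for illustration).
A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# For a 2x2 matrix, det(A - lambda*I) = lambda^2 - tr(A)*lambda + det(A).
# The roots of this characteristic polynomial are the eigenvalues.
char_poly = [1.0, -np.trace(A), np.linalg.det(A)]
print(np.sort(np.roots(char_poly)))      # [1. 3.]

# NumPy's eigenvalue routine finds the same spectrum directly.
print(np.sort(np.linalg.eigvals(A)))     # [1. 3.]
```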

So let me now address your question: "Are the eigenvalues of a matrix unique?" Well, that's a bit difficult to answer, because the question is not well formulated. If a matrix has $n$ distinct eigenvalues, would you consider each eigenvalue to be unique (in the sense of multiplicity one)? If a matrix has only a single eigenvalue of multiplicity $n$, would you consider that to be unique?

In either case, the answer to your question would be no. A matrix does not necessarily have distinct eigenvalues (although almost all matrices do), and a matrix does not necessarily have a single eigenvalue of multiplicity $n$. In fact, given any set of $n$ values, you can construct a matrix with those values as its eigenvalues (just take the corresponding diagonal matrix).
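A quick numerical sketch of that last remark (the particular values below are hypothetical, chosen only for the example):

```python
import numpy as np

# Hypothetical target spectrum: any values will do, repeats included.
desired_eigenvalues = [5.0, 5.0, -2.0]

# The corresponding diagonal matrix has exactly these values as its eigenvalues.
D = np.diag(desired_eigenvalues)
print(np.sort(np.linalg.eigvals(D)))   # [-2.  5.  5.]
```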

Now on to eigenvectors. For each eigenvalue $\lambda$, there exists a subspace of vectors $E_\lambda$, called the eigenspace of $\lambda$, consisting of the vectors $\mathbf{v}$ that satisfy the equation $$A\mathbf{v} = \lambda\mathbf{v}$$ This eigenspace $E_\lambda$ is unique, but the vectors in the space, the eigenvectors, are not unique. It is analogous to the fact that you can talk about the unique $x$-axis, but it makes no sense to talk about a unique point on the $x$-axis.
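To make the non-uniqueness concrete, here is a small sketch (again not from the original answer) using the same example matrix as above; the eigenvector `v` and the eigenvalue 3 are picked by hand for this particular `A`.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# v is an eigenvector of A for the eigenvalue 3 (picked by hand for this example).
v = np.array([1.0, 1.0])
print(np.allclose(A @ v, 3.0 * v))      # True

# Any nonzero scalar multiple of v lies in the same eigenspace E_3,
# so it is also an eigenvector for the eigenvalue 3.
w = -7.5 * v
print(np.allclose(A @ w, 3.0 * w))      # True
```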

What is true is that the eigenspaces of different eigenvalues are independent, so eigenvectors corresponding to different eigenvalues are linearly independent. When your matrix is diagonalizable, the collection (or direct sum, if you are familiar with the term) of these eigenspaces is your entire vector space. This means that there exists a basis consisting solely of eigenvectors, and your matrix $S$ is formed by taking such a basis of eigenvectors as its columns. Of course, each eigenspace is in fact a subspace, so linear combinations of eigenvectors for the same eigenvalue remain eigenvectors. This is why you can multiply your eigenvectors by nonzero scalars and still have them remain eigenvectors. In fact, you are free to choose any basis for each eigenspace, and your matrix $S$ will be modified correspondingly.
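Here is a hedged sketch of that freedom in the choice of $S$, using `numpy.linalg.eig`; rescaling the columns of $S$ (i.e., choosing different basis vectors in each one-dimensional eigenspace of this example) still diagonalizes $A$ with the same $D$.

```python
import numpy as np

A = np.array([[2.0, 1.0],
              [1.0, 2.0]])

# Columns of S are eigenvectors; D carries the eigenvalues on its diagonal.
eigenvalues, S = np.linalg.eig(A)
D = np.diag(eigenvalues)
print(np.allclose(S @ D @ np.linalg.inv(S), A))     # True

# Rescaling the columns of S amounts to choosing a different basis vector in
# each eigenspace; the rescaled S still diagonalizes A with the same D.
S_rescaled = S @ np.diag([3.0, -0.5])
print(np.allclose(S_rescaled @ D @ np.linalg.inv(S_rescaled), A))   # True
```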

Solution 2:

1. Given a matrix, the multiset (a set that allows repeated elements) of its eigenvalues is unique. That is, you cannot find a different multiset of eigenvalues for the same matrix.

2. Eigenvectors corresponding to an eigenvalue of multiplicity one are parametrized by a scalar coefficient: they are of the form $c\,\mathbf{x}$. In this case the unit eigenvector $\mathbf{x}$ is unique up to sign (it can be multiplied by $-1$).

3. For an eigenvalue of multiplicity $n$, the dimension $m$ of its eigenspace satisfies $m \leq n$. In this case we can find different sets of unit vectors which span the same eigenspace; in other words, the set of eigenvectors is not unique here. (We assume $m>1$; if $m=1$, the unit eigenvector is again unique up to sign, as noted in 2.) A small numerical sketch of this point follows below.
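The sketch referenced in point 3 (not part of the original answer): a hypothetical $3\times 3$ example in which the eigenvalue 2 has a two-dimensional eigenspace, spanned equally well by two different orthonormal bases.

```python
import numpy as np

# Hypothetical 3x3 example: eigenvalue 2 has multiplicity 2 (a 2-dimensional
# eigenspace), eigenvalue 5 has multiplicity 1.
A = np.diag([2.0, 2.0, 5.0])

# Two different orthonormal bases of the same eigenspace for lambda = 2,
# stored as the columns of B1 and B2.
B1 = np.array([[1.0, 0.0],
               [0.0, 1.0],
               [0.0, 0.0]])
B2 = np.array([[1.0,  1.0],
               [1.0, -1.0],
               [0.0,  0.0]]) / np.sqrt(2.0)

# Every column of both bases is a unit eigenvector of A with eigenvalue 2.
print(np.allclose(A @ B1, 2.0 * B1))    # True
print(np.allclose(A @ B2, 2.0 * B2))    # True
```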