If $v$ is an eigenvector of $A^T A$ with corresponding eigenvalue $\lambda$, then $$ \lambda \|v\|^2 = \lambda \langle v, v \rangle = \langle \lambda v, v \rangle = \langle A^T A v, v \rangle = \langle Av, Av \rangle = \|Av\|^2. $$ Since $v$ is an eigenvector, it is nonzero, so $\|v\|^2 > 0$ and therefore $\lambda = \|Av\|^2 / \|v\|^2 \geq 0$. The same argument shows that the eigenvalues of $AA^T$ are nonnegative.
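If you want a quick numerical sanity check of this fact, here is a minimal sketch (assuming NumPy and a randomly generated matrix) that computes the eigenvalues of $A^T A$ and $AA^T$ and confirms they are nonnegative up to floating-point error:

```python
import numpy as np

# Sanity check: for a random real matrix A, the eigenvalues of A^T A
# (and of A A^T) should all be nonnegative, up to rounding error.
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))

# Both products are symmetric, so eigvalsh is appropriate.
eig_AtA = np.linalg.eigvalsh(A.T @ A)  # 3 eigenvalues
eig_AAt = np.linalg.eigvalsh(A @ A.T)  # 4 eigenvalues (one near zero)

print(eig_AtA)  # all >= 0 (the smallest may be ~1e-16 due to rounding)
print(eig_AAt)  # same nonzero values, plus an extra (near-)zero eigenvalue
```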

It's certainly also possible to express matrices in the form $U \Sigma V^T$ with $U$ and $V$ real orthogonal and $\Sigma$ diagonal with entries that are not all nonnegative. For example, in the $1 \times 1$ case we have $1 = (-1) \cdot (-1) \cdot 1$, and in the $2 \times 2$ case we have $$ \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix} = \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} -1 & 0 \\ 0 & 1 \end{pmatrix} \begin{pmatrix} 1 & 0 \\ 0 & 1 \end{pmatrix}^T, $$ and there are many other variations and generalizations of this theme. Depending on your application, such a factorization might serve a specific purpose just as well as the SVD. But the SVD shows that it is always possible to choose a factorization in which the diagonal matrix $\Sigma$ has nonnegative entries. Speaking informally, if you have a factorization where $\Sigma$ is diagonal but has some negative entries, you can "absorb" those negative signs into the choice of either $U$ or $V$, and for some applications it is useful to be able to do so. I hope this helps.
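To make the sign-absorption step concrete, here is a small sketch (assuming NumPy) built on the $2 \times 2$ example above: flipping the sign of a negative diagonal entry of $\Sigma$ together with the corresponding column of $U$ leaves the product unchanged, keeps $U$ orthogonal, and makes the diagonal nonnegative.

```python
import numpy as np

# The 2x2 example above: U @ S @ V.T equals the identity,
# but S has a negative diagonal entry.
U = np.array([[-1.0, 0.0], [0.0, 1.0]])
S = np.array([[-1.0, 0.0], [0.0, 1.0]])
V = np.eye(2)

A = U @ S @ V.T

# Absorb the negative signs: flip each negative diagonal entry of S and,
# at the same time, the corresponding column of U.  The product is
# unchanged because (-u_i) * (-s_i) = u_i * s_i.
signs = np.sign(np.diag(S))
signs[signs == 0] = 1.0               # leave any zero entries alone
U_new = U * signs                     # scales column i of U by signs[i]
S_new = np.diag(np.abs(np.diag(S)))

assert np.allclose(U_new @ S_new @ V.T, A)       # same matrix as before
assert np.all(np.diag(S_new) >= 0)               # diagonal now nonnegative
assert np.allclose(U_new.T @ U_new, np.eye(2))   # U_new is still orthogonal
```

The same column-flipping trick could just as well be applied to $V$ instead of $U$; either choice works because the sign changes cancel in the product.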