The singular values of a matrix $A$ (not necessarily square) are the square roots of the eigenvalues of $A^TA$. Squaring the entries of $S$ will NOT give the eigenvalues of $A$, but those of $A^TA$. Additionally, the singular vectors (the columns of $U$ and $V$) are the eigenvectors of $AA^T$ and $A^TA$, respectively. A priori, there is no direct connection between the eigenvalues of $A$ and the singular values of $A$, or between eigenvectors and singular vectors.
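Here is a quick NumPy check of this relationship (a sketch; the $3\times 2$ matrix is an arbitrary example):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0],
              [5.0, 6.0]])  # not square, on purpose

s = np.linalg.svd(A, compute_uv=False)  # singular values, descending
w = np.linalg.eigvalsh(A.T @ A)         # eigenvalues of A^T A, ascending

# singular values of A = square roots of eigenvalues of A^T A
print(np.allclose(s, np.sqrt(w[::-1])))  # True
```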
Of course, there are some relations. Because a diagonal matrix with nonnegative entries scales the length of a vector by at most its largest entry, and orthogonal matrices preserve length, the largest singular value is the largest factor by which $A$ can scale the length of a vector. If $A$ is symmetric, the same description holds for the largest eigenvalue in absolute value. In general, however, if $\lambda$ is the eigenvalue of $A$ of largest absolute value, $|\lambda|$ will be smaller than the largest singular value. For example, with the matrix
$\begin{pmatrix}1 & 2\\ 3 & 4\end{pmatrix}$
the eigenvalues are $5.372$ and $-0.372$, while the singular values are $5.465$ and $0.366$.
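This is easy to reproduce numerically (a small NumPy sketch; the values above are rounded to three decimals):

```python
import numpy as np

A = np.array([[1.0, 2.0],
              [3.0, 4.0]])

print(np.linalg.eigvals(A))                # approx [-0.372, 5.372] (order not guaranteed)
print(np.linalg.svd(A, compute_uv=False))  # approx [ 5.465, 0.366]
```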
If $A$ is symmetric and positive semidefinite, the eigen-decomposition is already an SVD; if $A$ is symmetric with negative eigenvalues, one must additionally absorb the signs of those eigenvalues into one of the orthogonal factors. Unfortunately, I know of no good way to go from one decomposition to the other in general.
There is a nice relationship for normal matrices, that is, matrices which satisfy $AA^*=A^*A$ (or, in the case of real matrices, $AA^T=A^TA$). A matrix is normal if and only if it can be diagonalized by a unitary matrix, that is, $A=USU^*$ where $UU^*=I$ and $S$ is diagonal. In this case, $AA^*=(USU^*)(US^*U^*)=U(SS^*)U^*$, and $SS^*$ is diagonal with the squared absolute values of the eigenvalues of $A$ on its diagonal, so the singular values of $A$ are exactly the absolute values of its eigenvalues. Explicitly, writing each eigenvalue as $\lambda_j=|\lambda_j|e^{i\theta_j}$ and letting $\Theta$ be the diagonal matrix with entries $e^{i\theta_j}$, we have $S=\Theta|S|$ and hence the SVD $A=(U\Theta)\,|S|\,U^*$.
Unfortunately, this makes it easy to go from an eigenvalue decomposition of $A$ to an SVD, but not the other way around.
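Here is a minimal NumPy sketch of that easy direction; the rotation matrix is an arbitrary choice of normal matrix, and sorting the singular values into descending order is omitted for brevity:

```python
import numpy as np

A = np.array([[0.0, -1.0],
              [1.0,  0.0]])   # real and normal: A A^T = A^T A = I

# For a normal matrix with distinct eigenvalues, eig returns a unitary U
# with A = U diag(lam) U^*.
lam, U = np.linalg.eig(A)

sigma = np.abs(lam)            # singular values = |eigenvalues|
Theta = np.diag(lam / sigma)   # diagonal matrix of phases e^{i theta_j}

U_svd = U @ Theta              # left singular vectors: U Theta
V_svd = U                      # right singular vectors: U itself

# Verify A = (U Theta) |S| U^*
print(np.allclose(U_svd @ np.diag(sigma) @ V_svd.conj().T, A))  # True
```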