Matrix inverses can be generalized from non-singular square matrices to rectangular matrices in general, the best-known example being the Moore–Penrose pseudoinverse. I am wondering whether the same can be done for eigenvalues and eigenvectors.
Though $\det(A-\lambda I)=0$ can no longer be used when $A$ is not square, nothing prevents one from considering $Av=\lambda v$ for a non-zero vector $v$, except the possibility of ending up with an inconsistent linear system.
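To illustrate the obstruction mentioned above, here is a minimal NumPy sketch (the specific matrix and vector are just an example): for a rectangular $A$, the vector $Av$ does not even live in the same space as $\lambda v$, so the system $Av=\lambda v$ cannot be satisfied component-wise.

```python
import numpy as np

A = np.arange(6.0).reshape(3, 2)  # a 3x2 rectangular matrix
v = np.array([1.0, 2.0])          # v lives in R^2

# A v lands in R^3 while lambda*v stays in R^2, so the equations
# A v = lambda v cannot all be satisfied for any scalar lambda.
print((A @ v).shape)  # (3,)
print(v.shape)        # (2,)
```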
Please share your comments, and provide references if you know of any.
Many thanks.
Edit
I know the SVD, but it does not seem to be what I want. For the SVD of a real matrix $A$, we have $A=UDV^T$, where $U, V$ are orthogonal matrices and $D$ is diagonal (possibly with zeros on the diagonal). This gives only $AV_{*k}=\sigma_{k}U_{*k}$, where $V_{*k}$ is the $k^\text{th}$ column of $V$. Since $V_{*k}$ and $U_{*k}$ are in general different, this does not resemble $Av=\lambda v$ for a non-zero vector $v$, as in the definition of eigenvectors. Also, although we do get $A^TAV_{*k}=\sigma_k^2 V_{*k}$, this is an eigenvector equation for the (square) matrix $A^TA$, rather than for $A$ itself.
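The two relations discussed above can be checked numerically. A short NumPy sketch (the random matrix is just an arbitrary example): the singular-vector relation involves two different vectors $V_{*k}$ and $U_{*k}$, while the genuine eigenvector equation holds only for the square matrix $A^TA$.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 3))   # rectangular: 4x3

# Thin SVD: A = U @ diag(s) @ Vh, with V = Vh.T
U, s, Vh = np.linalg.svd(A, full_matrices=False)
V = Vh.T

k = 0
# A V_{*k} = sigma_k U_{*k}: the input and output vectors differ,
# so this is not of the eigenvector form A v = lambda v.
assert np.allclose(A @ V[:, k], s[k] * U[:, k])

# A^T A V_{*k} = sigma_k^2 V_{*k}: V_{*k} is an eigenvector,
# but of the square matrix A^T A, not of A itself.
assert np.allclose(A.T @ A @ V[:, k], s[k] ** 2 * V[:, k])
```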