We can generalize matrix inverses from non-singular square matrices to rectangular matrices in general, for example, the well-known Moore–Penrose pseudoinverse. I am wondering how this can be done for eigenvalues and eigenvectors.

Though $\det(A-\lambda I)=0$ cannot be used any more when $A$ is not square, there is nothing that prevents one from considering $Av=\lambda v$ for a non-zero vector $v$, except the possibility of an inconsistent linear system.

Please share your comments, and provide references if there are any.

Many thanks.

Edit

I know the SVD, but it does not seem to be what I want. For the SVD of a real matrix $A$, we have $A=UDV^T$, where $U, V$ are orthogonal matrices and $D$ is diagonal (possibly with zeros on the diagonal). We only have $AV_{*k}=\sigma_{k}U_{*k}$, where $V_{*k}$ is the $k^\text{th}$ column of $V$. Since $V_{*k}$ and $U_{*k}$ are in general different, this does not resemble $Av=\lambda v$ for a non-zero vector $v$, as in the definition of eigenvectors. Also, while $A^TAV_{*k}=\sigma_k^2 V_{*k}$ does hold, that is an eigenvalue relation for the (square) matrix $A^TA$, not for $A$ itself.
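To make the point above concrete, here is a quick numerical check with NumPy (a sketch; the matrix `A` is an arbitrary rectangular example, not anything from the question): the SVD gives $AV_{*k}=\sigma_k U_{*k}$ with $U_{*k}\ne V_{*k}$ in general, and the genuine eigenvector relation holds only for $A^TA$.

```python
import numpy as np

# An arbitrary 4x3 example matrix (any rectangular matrix works).
A = np.array([[1., 2., 0.],
              [0., 1., 3.],
              [2., 0., 1.],
              [1., 1., 1.]])

# Thin SVD: A = U @ np.diag(s) @ Vt, with U of shape (4, 3), s of shape (3,), Vt of shape (3, 3).
U, s, Vt = np.linalg.svd(A, full_matrices=False)
V = Vt.T

for k in range(len(s)):
    # A V_{*k} = sigma_k U_{*k}: the input and output vectors differ in general.
    assert np.allclose(A @ V[:, k], s[k] * U[:, k])
    # (A^T A) V_{*k} = sigma_k^2 V_{*k}: V_{*k} is an eigenvector of A^T A, not of A.
    assert np.allclose(A.T @ A @ V[:, k], s[k] ** 2 * V[:, k])
```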

  • But isn't the problem that when you write $Av$, this is a vector of a different dimension than $v$ itself? Eigenvectors are kind of like fixed points. If you think of a (non-singular) linear map as really acting on projective space, the eigenvectors are just the fixed points of the map. But fixed points are all about mapping a space to itself. If you have an $m \times n$ matrix, it is a linear map from one space to a completely different space. That said, I'm sure there is something one can say about some kind of analogous ideas, but I do not know about them. – 2011-04-19
  • As a concrete example, if you multiply a 4-by-3 matrix and a 3-vector, you get a 4-vector. I thus don't see how you can make sense of $\mathbf A\mathbf v=\lambda \mathbf v$ if $\mathbf A$ is non-square. – 2011-04-19
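The dimension mismatch raised in the comments can be verified directly (a minimal NumPy sketch; the matrices are arbitrary placeholders):

```python
import numpy as np

A = np.ones((4, 3))   # an arbitrary 4-by-3 matrix
v = np.ones(3)        # an arbitrary 3-vector

w = A @ v             # the product is a 4-vector
assert w.shape == (4,)

# So A v = lambda v cannot hold literally: the two sides
# live in spaces of different dimensions.
assert w.shape != v.shape
```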

2 Answers