
I have a random matrix (with linearly independent rows and columns). I want to linearly scale the matrix so that its eigenvalues lie between $-1$ and $1$. Is this possible? If so, how can I do it?

  • 1
    If $\lambda$ is an eigenvalue of $A$, then $\mu\lambda$ is an eigenvalue of $\mu A$. On the other hand, $A$ may have both real and complex eigenvalues---for example, $\pmatrix{1&&\\&&-1\\&1&}$---in which case all of the eigenvalues $\mu \lambda$ are real iff $\mu = 0$. (2017-02-23)
  • 1
    As @Travis notes, your matrix may have nonreal eigenvalues. At that point, you either have to give up, or you have to settle for scaling so the eigenvalues have modulus at most $1$. (2017-02-23)
  • 0
    On the other hand, my first comment tells you that if all of your eigenvalues are real (and not all are zero), you can scale $A$ by $|\nu|^{-1}$, where $\nu$ is the eigenvalue of largest absolute value. (2017-02-23)
  • 1
    @Travis, of course that assumes you *know* the eigenvalue of largest absolute value. Perhaps OP wants something like, "divide by the sum of the absolute values of the entries in the matrix", something you can do without finding the eigenvalues first. (2017-02-23)
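A minimal NumPy sketch of the two approaches from the comments: dividing by the spectral radius (the largest eigenvalue modulus, which requires computing the eigenvalues), and the eigenvalue-free alternative of dividing by the sum of absolute values of the entries, which upper-bounds the spectral radius. All variable names here are illustrative, not from the question.

```python
import numpy as np

rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))  # a random matrix; generically full rank

# Approach 1: scale by the spectral radius. Eigenvalues may be complex,
# so we can only guarantee modulus <= 1, not that they land in [-1, 1].
rho = max(abs(np.linalg.eigvals(A)))
B = A / rho
assert np.all(np.abs(np.linalg.eigvals(B)) <= 1 + 1e-12)

# Approach 2 (no eigenvalue computation): the sum of absolute values of
# the entries is at least the max absolute row sum (the infinity norm),
# which in turn bounds the spectral radius.
C = A / np.abs(A).sum()
assert np.all(np.abs(np.linalg.eigvals(C)) <= 1 + 1e-12)
```

Approach 2 is cheaper but typically shrinks the matrix far more than necessary, since the entrywise sum can greatly exceed the spectral radius.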

0 Answers