
In general I know that the eigenvalues of $A$ are not the same as those of $U$ in an LU decomposition, but for one matrix I had earlier in the year they were. Is there a special reason this happened, or was it just a coincidence? The matrix was

$A = \begin{bmatrix}-1& 3 &-3 \\0 &-6 &5 \\-5& -3 &1\end{bmatrix}$

with

$U = \begin{bmatrix}-1& 3 &-3 \\0 &-6 &5 \\0& 0 &1\end{bmatrix}$

and, if needed,

$L = \begin{bmatrix}1& 0 &0 \\0 &1 &0 \\5& 3 &1\end{bmatrix}$

The eigenvalues of $A$ are the same as those of $U$, namely $-1$, $-6$ and $1$. When I tried to do it the normal way I ended up with a not-so-nice algebra problem to work on which took way too long. Is there some special property I am missing here? If not, is there an easy way to simplify $\mathrm{det}(A-\lambda I)$ that I am missing? Thank you!
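For reference, here is a quick numerical sketch (using NumPy, with nothing beyond the matrices above) checking that $A = LU$ and that $A$ and $U$ really do share the eigenvalues $-1$, $-6$ and $1$:

```python
import numpy as np

# The matrices from the question.
A = np.array([[-1.0,  3.0, -3.0],
              [ 0.0, -6.0,  5.0],
              [-5.0, -3.0,  1.0]])
L = np.array([[1.0, 0.0, 0.0],
              [0.0, 1.0, 0.0],
              [5.0, 3.0, 1.0]])
U = np.array([[-1.0,  3.0, -3.0],
              [ 0.0, -6.0,  5.0],
              [ 0.0,  0.0,  1.0]])

# The factorization holds (no pivoting was needed for this matrix).
print(np.allclose(A, L @ U))                # True

# Eigenvalues of the triangular U are its diagonal entries; compare with A.
print(np.sort(np.linalg.eigvals(A).real))   # approx. [-6. -1.  1.]
print(np.sort(np.diag(U)))                  # [-6. -1.  1.]
```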

  • Maybe it has something to do with the fact that the diagonal did not change, and we know that the sum of the eigenvalues of a matrix is its trace and the product of the eigenvalues is its determinant, so $\det(A)=\det(U)$? (2012-12-16)

1 Answer


It's hard to say if this is mere coincidence or part of a larger pattern. This is like asking someone to infer the next number in a finite sequence of given numbers: whatever number you say, there is always some way to explain it.

Anyway, here's the "pattern" I see. Suppose $ A = \begin{pmatrix}B&u\\ v^T&\gamma\end{pmatrix}, $ where

  1. $B$ is a $2\times2$ upper triangular matrix;
  2. the two eigenvalues of $B$, say $\lambda$ and $\mu$, are distinct and $\neq\gamma$;
  3. $u$ is a right eigenvector of $B$ corresponding to the eigenvalue $\mu$;
  4. $v$ is a left eigenvector of $B$ corresponding to the eigenvalue $\lambda$.

Then $A$ has the following LU decomposition:
$ A = \begin{pmatrix}B&u\\ v^T&\gamma\end{pmatrix}
=\underbrace{\begin{pmatrix}I_2&0\\ kv^T&1\end{pmatrix}}_{L}\;
\underbrace{\begin{pmatrix}B&u\\0&\gamma\end{pmatrix}}_{U} $
where $k=\frac1\lambda$ if $\lambda\neq0$ and $k=0$ otherwise. The eigenvalues of $U$ are clearly $\lambda$, $\mu$ and $\gamma$. Since $u$ and $v$ are right and left eigenvectors of $B$ corresponding to different eigenvalues, we have $v^Tu=0$. Therefore
\begin{align}
(v^T, 0)A &=(v^T, 0)\begin{pmatrix}B&u\\ v^T&\gamma\end{pmatrix}
=(v^TB,\, v^Tu)=\lambda(v^T,0),\\
A\begin{pmatrix}u\\0\end{pmatrix}
&=\begin{pmatrix}B&u\\ v^T&\gamma\end{pmatrix}\begin{pmatrix}u\\0\end{pmatrix}
=\begin{pmatrix}Bu\\v^Tu\end{pmatrix}
=\mu\begin{pmatrix}u\\0\end{pmatrix},\\
A\begin{pmatrix}\frac{1}{\gamma-\mu}u\\1\end{pmatrix}
&=\begin{pmatrix}B&u\\ v^T&\gamma\end{pmatrix}
\begin{pmatrix}\frac{1}{\gamma-\mu}u\\1\end{pmatrix}
=\gamma\begin{pmatrix}\frac{1}{\gamma-\mu}u\\1\end{pmatrix}.
\end{align}
So the eigenvalues of $U$ are also the eigenvalues of $A$.
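For what it's worth, here is a small numerical sketch of this construction (the particular numbers below are made up purely for illustration, and NumPy is assumed): build an upper triangular $B$, take $u$ and $v$ to be the right and left eigenvectors described above, assemble $A$, and check that the stated $L$ and $U$ factor $A$ and have the same eigenvalues as $A$.

```python
import numpy as np

# Hypothetical instance of the pattern: B is 2x2 upper triangular with
# distinct eigenvalues lam and mu, both different from gamma.
lam, mu, gamma = -1.0, -6.0, 1.0
b = 3.0
B = np.array([[lam,  b],
              [0.0, mu]])

# u: right eigenvector of B for mu  (B @ u == mu * u).
u = np.array([b, mu - lam])
# v: left eigenvector of B for lam  (v @ B == lam * v).
v = np.array([1.0, b / (lam - mu)])

# Assemble A = [[B, u], [v^T, gamma]] and the claimed factors L and U.
A = np.block([[B,          u[:, None]],
              [v[None, :], np.array([[gamma]])]])
k = 1.0 / lam                                    # lam != 0 in this example
L = np.block([[np.eye(2),      np.zeros((2, 1))],
              [k * v[None, :], np.array([[1.0]])]])
U = np.block([[B,                u[:, None]],
              [np.zeros((1, 2)), np.array([[gamma]])]])

print(np.allclose(A, L @ U))                # True: the LU factorization holds
print(np.sort(np.linalg.eigvals(A).real))   # approx. [-6. -1.  1.]
print(np.sort(np.linalg.eigvals(U).real))   # approx. [-6. -1.  1.]
```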

  • Lol! Thanks, I will keep that in mind ;) Yeah, I hadn't either, but I figured I would check. I will just keep doing it the good old way that always works; it's just that sometimes I don't like the nasty characteristic equation I end up with. Thanks again though! (2012-12-16)