
I have written a linear solver in ANSI C that uses Householder reflections/transformations to solve $Ax=b$ given $A$ and $b$. I want to use it to find the eigenvector associated with a known eigenvalue, i.e. to solve

$(A-\lambda I)x = 0$

The problem is that the zero vector is always the solution I get (and before someone says it: yes, I have the correct eigenvalue with 100% certainty).

Here's an example that illustrates the issue pretty accurately:

Given $A-\lambda I$ (this example just happens to be Hermitian):

1 2 0 | 0
2 1 4 | 0
0 4 1 | 0

Householder reflections will yield something like this:

# # # | 0
0 # # | 0
0 0 # | 0

Back substitution will then find that the solution is {0,0,0}, obviously.
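To make the failure mode concrete, here is a minimal sketch of that back substitution in C; the upper-triangular entries are made up for illustration, and the point is only that a zero right-hand side forces every component to zero:

```c
#include <stdio.h>

/* Back substitution on an upper-triangular augmented system U x = b.
 * With b identically zero, s stays 0 on every row, so each x[i] is
 * 0 / U[i][i] = 0 and the zero vector is the inevitable result. */
static void back_substitute(int n, double U[][3], const double *b, double *x)
{
    int i, j;
    for (i = n - 1; i >= 0; --i) {
        double s = b[i];
        for (j = i + 1; j < n; ++j)
            s -= U[i][j] * x[j];
        x[i] = s / U[i][i];
    }
}

int main(void)
{
    /* Made-up upper-triangular factor standing in for the "# # #" rows. */
    double U[3][3] = { { 2.0, 1.0, 3.0 },
                       { 0.0, 1.5, 0.5 },
                       { 0.0, 0.0, 4.0 } };
    double b[3] = { 0.0, 0.0, 0.0 };    /* augmented column: the zero vector */
    double x[3];

    back_substitute(3, U, b, x);
    printf("x = {%g, %g, %g}\n", x[0], x[1], x[2]);  /* prints x = {0, 0, 0} */
    return 0;
}
```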

  • If you know that $\lambda$ is an eigenvalue, then there must be a nonzero solution to the equation, or you're doing something wrong. Back-substitution has nothing to do with this issue. If $A$ is upper triangular, then $\lambda$ must be one of the entries on the diagonal. Then $A-\lambda I$ will have 0 in that position, and when you back-substitute, the variable corresponding to that position will not be forced to be 0 (see the sketch after these comments). (2011-10-25)
  • The zero vector is the *only* solution unless $\lambda$ is an eigenvalue, so the first question is: are you sure that the values of $\lambda$ you are using are eigenvalues? And, if so, is it possible that something is rounding $\lambda$, which would turn it into a non-eigenvalue? Beyond that, there's probably not much to be said without knowing how your particular linear solver works. (2011-10-25)
  • The linear solver works by performing Householder reflections on an augmented matrix containing the "b" vector. A back substitution then occurs, starting with the bottommost entry, which requires only division to extract the column variable. (2011-10-25)
  • Since the Householder reflections operating on the augmented matrix do not affect the last augmented column, I think one would always obtain the zero vector for a solution this way... Another option is that you have to be dead on (within 1e-6 of the normalized eigenvalue) with what eigenvalue you have, but I really don't like the idea of that; it doesn't seem right. (2011-10-25)
  • Something is obviously wrong with the numbers in your example. That matrix has full rank, so $\lambda$ can't have been an eigenvalue of $A$. (2011-10-25)
  • What's your $A$ to begin with, and what's your $\lambda$? (2011-10-25)
  • Any $A$ that I choose is a good example. I can find the eigenvalues (which agree with Mathematica) and then perform this method. I always get the 0 vector. I generate matrices with random entries. Occasionally I'll seed only Hermitian matrices. (2011-10-25)
  • If you like examples, here's one: {{0,1,1},{1,0,1},{1,1,0}}, eigenvalues -1, -1, 2. (2011-10-25)
  • OK, so for $\lambda=-1$ you get that $A-\lambda I$ is {{1,1,1},{1,1,1},{1,1,1}}, which won't give you only the zero solution. What's the problem? What I was wondering was what the numbers were for the example you give in the question, because there's clearly something wrong there. (2011-10-25)
  • It's possible that you aren't understanding the question. Householder transformations are applied after forming $A-\lambda I$; this is a critical step... Thanks for your help in any case. (2011-10-25)
  • Very well, what do you get from your Householder transformations on {{1,1,1},{1,1,1},{1,1,1}} then? (By the way, is it so difficult to just tell us what your numbers were for that example? This is the third time I'm asking, because simply looking at those numbers, it seems that there's something that *you* have misunderstood...) (2011-10-25)
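Picking up the first comment above, here is a hedged sketch of how the back substitution can be adapted so that the variable sitting on a (numerically) zero pivot is not forced to zero; the tolerance `tol` is my own assumption and should be scaled with $\|A\|$ and machine epsilon in practice:

```c
#include <math.h>

/* 3x3 sketch of back substitution on (A - lambda*I) x = 0 implementing
 * the fix from the first comment: when a diagonal entry of U is
 * numerically zero, the corresponding variable is free, so set it to 1
 * instead of dividing by (near) zero. */
static void back_substitute_eig(int n, double U[][3], double *x, double tol)
{
    int i, j;
    for (i = n - 1; i >= 0; --i) {
        double s = 0.0;                 /* right-hand side is the zero vector */
        for (j = i + 1; j < n; ++j)
            s -= U[i][j] * x[j];
        if (fabs(U[i][i]) <= tol)
            x[i] = 1.0;                 /* free variable: pick any nonzero value */
        else
            x[i] = s / U[i][i];
    }
}
```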

2 Answers


It looks to me like you're not doing inverse iteration properly. Usually one already has a pivoted(!) LU decomposition of the matrix $\mathbf A-\lambda\mathbf I=\mathbf P^\top\mathbf L\mathbf U$ available when doing this; I'll assume that as a given. (I am not assuming that your matrix is Hermitian/symmetric; in that case one can definitely do better than the general method I am about to describe.)

The traditional way of doing inverse iteration, due to Jim Wilkinson, builds up an initial eigenvector estimate by solving the system $\mathbf U\mathbf x_0=\mathbf e$, where $\mathbf e=(1 \dots 1)^\top$. In this operation, it is assumed that any zero diagonal elements in $\mathbf U$ have already been replaced by quantities around the size of $\|\mathbf A\|\varepsilon$, where $\|\cdot\|$ is any convenient norm and $\varepsilon$ is machine epsilon. One then proceeds with the cycle of solving $(\mathbf A-\lambda\mathbf I)\mathbf x_{k+1}=\mathbf x_k$ (using the LU decomposition, of course!) and then normalizing the $\mathbf x_{k+1}$ so obtained with some convenient vector norm. In practice, the method usually converges within two to three iterations, and a fourth iteration is rarely needed, if ever.
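As a rough illustration (not the answer's own code), the cycle might look like this in C; the routine `lu_solve` and the fixed-size work buffer are assumptions on my part:

```c
#include <math.h>

/* Assumed interface (not part of the original post): solves
 * (A - lambda*I) y = x using a precomputed pivoted LU factorization. */
void lu_solve(int n, const double *lu, const int *piv,
              const double *x, double *y);

/* Wilkinson-style inverse iteration as described above.  Assumes the LU
 * factors of A - lambda*I are available and that any zero diagonal
 * entries of U were already replaced by roughly ||A||*eps.  The initial
 * x should come from solving U x0 = (1,...,1)^T, not shown here. */
static void inverse_iteration(int n, const double *lu, const int *piv,
                              double *x /* in: x0, out: eigenvector */)
{
    double y[64];                        /* sketch only: assumes n <= 64 */
    int iter, i;
    for (iter = 0; iter < 4; ++iter) {   /* 2-3 passes usually suffice */
        double norm = 0.0;
        lu_solve(n, lu, piv, x, y);      /* (A - lambda*I) x_{k+1} = x_k */
        for (i = 0; i < n; ++i)          /* normalize in the max norm */
            if (fabs(y[i]) > norm)
                norm = fabs(y[i]);
        for (i = 0; i < n; ++i)
            x[i] = y[i] / norm;
    }
}
```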

One might think it is perverse to be solving a system whose underlying matrix is nearly singular; it turns out that the growth of the unnormalized $\mathbf x_k$ is in fact vital for the method to work in inexact arithmetic. See Ilse Ipsen's article on this method for details.

  • Inverse iteration was a much better method. This was exactly the sort of thing I was looking for. Thank you. (2011-10-25)
  • I should add for anyone reading this in the future: Householder transformations on an augmented matrix, followed by a linear solve with back substitution, is one way of accomplishing this, but it appears to have large numerical error for matrices larger than 5×5. (2011-10-27)

I have coded this before, but cannot comment on posts yet, so I must 'answer'. If you are trying to do QR factorization with Householder reflections, I think you might have something wrong in your code. Find a problem you can solve with them by hand first. Mathematica uses these reflections in QRDecomposition, so you might be able to use it to check your work as well. I hope you get things working soon.
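For what it's worth, here is a minimal sketch of building a single Householder vector in C, which can be checked by hand on a small column or against QRDecomposition; the function name and sign convention are my own choices:

```c
#include <math.h>

/* Build the Householder vector v for a column x of length m, so that
 * (I - 2 v v^T / (v^T v)) x = (alpha, 0, ..., 0)^T with |alpha| = ||x||.
 * The sign of alpha is chosen opposite to x[0] to avoid cancellation.
 * Returns alpha, which is what should land on the diagonal of R. */
static double householder_vector(int m, const double *x, double *v)
{
    int i;
    double norm = 0.0, alpha;

    for (i = 0; i < m; ++i)
        norm += x[i] * x[i];
    norm = sqrt(norm);

    alpha = (x[0] >= 0.0) ? -norm : norm;
    v[0] = x[0] - alpha;
    for (i = 1; i < m; ++i)
        v[i] = x[i];
    return alpha;
}
```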

  • I have already verified that my QR factorization, QR algorithm, and Lanczos algorithm function together. They can produce the eigenvalues of a matrix with 100,000 entries very quickly and produce the correct values. I'm simply looking at getting the eigenvectors back out afterwards. (2011-10-25)