
Given eigenvectors $v_1, v_2, \dots, v_n$ and eigenvalues $\lambda_1,\lambda_2,\dots,\lambda_n$, how do I construct a matrix whose eigenvectors and eigenvalues are $v$ and $\lambda$?

The straightforward way of doing this is to encode all $n^2$ constraints as a single linear system and solve for the entries $M_1, \dots, M_{n^2}$ of the matrix, i.e.,

$$ \begin{bmatrix} v_{11} & v_{12} & \dots & v_{1n} & 0 & 0 &\dots & 0 & \dots & 0 & 0 &\dots & 0 \\ 0 & 0 &\dots & 0 & v_{11} & v_{12} & \dots & v_{1n} & \dots & 0 & 0 &\dots & 0\\ & & & & & & \vdots \\ 0 & 0 &\dots & 0 & 0 & 0 &\dots & 0 & \dots & v_{n1} & v_{n2} & \dots & v_{nn}\\ \end{bmatrix} \begin{bmatrix} M_1 \\ M_2 \\ \vdots \\ M_{n^2} \end{bmatrix} = \begin{bmatrix} \lambda_1 v_{11} \\ \lambda_1 v_{12} \\ \vdots \\\lambda_n v_{nn} \end{bmatrix} $$
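As a concrete sketch of this brute-force approach (assuming NumPy; the variable names and sample values are only illustrative), the system could be assembled and solved like so:

```python
import numpy as np

# Sketch of the brute-force n^2-by-n^2 linear system above, assuming NumPy.
# eigvecs[i] holds v_i (as a row), eigvals[i] holds lambda_i.
n = 3
rng = np.random.default_rng(0)
eigvecs = rng.standard_normal((n, n))      # assumed linearly independent
eigvals = np.array([1.0, 2.0, 3.0])

# Each eigenvector v_i contributes n rows: block-diagonal copies of v_i^T,
# one block per row of M (the unknowns M_1, ..., M_{n^2} are M's rows, stacked).
A = np.vstack([np.kron(np.eye(n), v.reshape(1, -1)) for v in eigvecs])
b = np.concatenate([lam * v for lam, v in zip(eigvals, eigvecs)])

M = np.linalg.solve(A, b).reshape(n, n)

# Check: M v_i = lambda_i v_i for every i
assert np.allclose(M @ eigvecs.T, eigvecs.T * eigvals)
```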

Is there a more elegant way?

  • In the basis consisting of the eigenvectors, the matrix is diagonal, with the $\lambda_i$ as the diagonal entries; call it $D$. Next you write down the matrix whose columns are the coordinates of the $v_i$, call it $P$, and the matrix you are looking for is $PDP^{-1}$. (2011-07-31)
  • This is pretty standard: http://en.wikipedia.org/wiki/Eigendecomposition_of_a_matrix (2011-07-31)
  • @anon: Of course. I should've seen that. (2011-07-31)
  • @Jacob: you have asked us to delete this question, but it even has an answer you accepted! It is generally not a good idea to delete a question, since that would imply deleting the work contributed by others. (2011-08-05)
  • @Mariano: Agreed. I accepted it because it was correct, but wanted it deleted because it seemed trivial in hindsight and I realized I should've thought about it a bit more. (2011-08-06)
  • @Jacob: it is *always* the case that one should have thought about everything better, and with time (hopefully) everything becomes, if not entirely trivial, at least easier. (2011-08-06)
  • @Mariano: Quite true :) (2011-08-06)

1 Answer


Your system of equations is $Mv_1=\lambda_1v_1,\ldots,Mv_n=\lambda_nv_n$. Or equivalently, $M(v_1,\ldots,v_n)=(\lambda_1 v_1,\ldots,\lambda_n v_n)$, where $V:=(v_1,\ldots,v_n)$ is the $n\times n$-matrix with columns $v_1,\ldots,v_n$. You can write this as $MV=VD$ where $D$ is the diagonal matrix with diagonal entries $\lambda_1,\ldots,\lambda_n$. So, assuming $V$ is invertible, that is, that your given eigenvectors are linearly independent, you get $M=VDV^{-1}$. Thus to calculate $M$ this way, all you need to do is to find the inverse of the matrix of eigenvectors, and multiply three matrices together.
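As a minimal numerical sketch of this recipe (assuming NumPy; the concrete eigenvalues and eigenvectors are just an example):

```python
import numpy as np

# M = V D V^{-1}, where the columns of V are the prescribed eigenvectors.
eigvals = np.array([1.0, 2.0, 3.0])
V = np.array([[1.0, 1.0, 0.0],
              [0.0, 1.0, 1.0],
              [0.0, 0.0, 1.0]])            # columns v_1, v_2, v_3, linearly independent

D = np.diag(eigvals)
M = V @ D @ np.linalg.inv(V)

# Sanity check: M V = V D, i.e. each column of V is an eigenvector of M
assert np.allclose(M @ V, V @ D)
```

(In numerical practice one would typically avoid the explicit inverse and solve $MV = VD$ for $M$ directly, but the explicit form mirrors the formula.)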

  • Also, if you've been given *orthonormal* eigenvectors, computing $V^{-1}$ is particularly simple... (see the sketch below) (2011-07-31)
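A sketch of that shortcut (again assuming NumPy): with orthonormal eigenvectors $V$ is orthogonal, so $V^{-1} = V^{\mathsf T}$ and no inverse needs to be computed.

```python
import numpy as np

# With orthonormal eigenvectors, V is orthogonal and V^{-1} = V^T.
eigvals = np.array([1.0, 2.0, 3.0])
V, _ = np.linalg.qr(np.random.default_rng(1).standard_normal((3, 3)))  # orthonormal columns
M = V @ np.diag(eigvals) @ V.T             # no explicit inverse needed

assert np.allclose(M @ V, V * eigvals)     # column i of V has eigenvalue eigvals[i]
```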