
Could someone please expand on

Method 9. Lagrange interpolation (page 17) at

http://www.cs.cornell.edu/cv/researchpdf/19ways+.pdf

because the summation runs from $0$ to $n-1$ but the eigenvalues are indexed from $1$ to $n$.
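
(For reference, I assume the intended formula, with everything indexed from $1$ to $n$ and the eigenvalues $\lambda_1,\dots,\lambda_n$ of $A$ assumed distinct, is

$$\exp(tA) \;=\; \sum_{j=1}^{n} e^{\lambda_j t} \prod_{\substack{k=1\\ k\neq j}}^{n} \frac{A - \lambda_k I}{\lambda_j - \lambda_k},$$

but I may be misreading the paper's indexing.)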

Also, is it true that this is an analytic solution? Is it practical for numerical methods? I have to compute $\exp(tA)$ for several values of $t$, and it is expensive. I thought that if I can precompute the product part of the equation (in the paper), then I only need to "plug and chug" the $\exp(\lambda_j t)$ part (which is really cheap) for each $t$.

Alternatively, I could do a symbolic calculation in $t$. Ultimately I have to compute $\exp(tA)v$ and then plot the elements of the resulting vector against time.
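
To make the "plug and chug" idea concrete, here is a minimal numerical sketch of what I have in mind (Python/NumPy; it assumes $A$ is diagonalizable with distinct eigenvalues, and the matrix `A`, vector `v`, and time grid below are just placeholders):

```python
import numpy as np

def expm_times_v_over_time(A, v, ts):
    """Evaluate exp(t*A) @ v on a grid of times t.

    Assumes A has n distinct eigenvalues, so the Lagrange
    interpolation formula
        exp(tA) = sum_j exp(lam_j * t) * P_j,
        P_j = prod_{k != j} (A - lam_k I) / (lam_j - lam_k),
    applies.  The vectors P_j @ v are computed once; each new t
    then costs only n scalar exponentials and a weighted sum of
    n precomputed vectors.
    """
    n = A.shape[0]
    lam = np.linalg.eigvals(A)           # eigenvalues lam_1, ..., lam_n
    I = np.eye(n)

    # One-time (expensive) part: w[j] = P_j @ v.
    w = []
    for j in range(n):
        Pv = v.astype(complex)
        for k in range(n):
            if k != j:
                Pv = (A - lam[k] * I) @ Pv / (lam[j] - lam[k])
        w.append(Pv)
    w = np.array(w)                      # shape (n, n)

    # Cheap part: for each t, combine with exp(lam_j * t).
    out = np.array([np.exp(lam * t) @ w for t in ts])
    return out.real if np.allclose(out.imag, 0) else out

# Example usage with a small random matrix (placeholder data).
rng = np.random.default_rng(0)
A = rng.standard_normal((4, 4))
v = rng.standard_normal(4)
ts = np.linspace(0.0, 2.0, 50)
trajectory = expm_times_v_over_time(A, v, ts)    # shape (50, 4)
```

The double loop is $O(n^3)$ work in total, but it happens once; after that, each new $t$ costs only $n$ scalar exponentials and an $n$-term weighted sum of the precomputed vectors. (This is of course one of the paper's "dubious ways", so I would expect it to lose accuracy when eigenvalues are close together.)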

  • If your matrix is large, you should look at Krylov methods (method 20 in the paper); this is what is used in the Expokit package that J.M. refers to. If you do a (Schur) decomposition, then you do not need to re-compute the decomposition when you vary $t$; you only need to re-compute $\exp(tU)$. If your matrix is symmetric, then $U$ is diagonal, so $\exp(tU)$ is very easy to compute. – 2012-04-27
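
As a rough illustration of the Schur suggestion in that comment (my own sketch, not the commenter's code; SciPy is assumed to be available and `A` is a placeholder):

```python
import numpy as np
from scipy.linalg import schur, expm

A = np.random.default_rng(1).standard_normal((5, 5))  # placeholder matrix

# One-time cost: Schur decomposition A = Q U Q^H (U upper triangular).
U, Q = schur(A, output="complex")

# Per-t cost: only exp(t*U) changes; Q is reused for every t.
for t in (0.1, 0.5, 1.0):
    etA = (Q @ expm(t * U) @ Q.conj().T).real
    # ... use etA, e.g. etA @ v ...

# If A is symmetric, U is diagonal, so expm(t*U) reduces to
# np.exp of the diagonal entries.
```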

0 Answers