
I'm trying to calculate the Chebyshev polynomial $T_n(A)$ of a matrix $A$. Since $A$ is symmetric, my idea is to diagonalise $A$ as $X=U^\dagger A U$, then use the explicit expressions mentioned here:

$ T_n(x) = \begin{cases} \cos(n\arccos(x)), & \ x \in [-1,1] \\ \cosh(n \, \mathrm{arccosh}(x)), & \ x \ge 1 \\ (-1)^n \cosh(n \, \mathrm{arccosh}(-x)), & \ x \le -1 \\ \end{cases} \,\! $

where $x$ runs over the eigenvalues of $A$, and then transform back as $U T_n(X) U^\dagger$. Will this work?
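For concreteness, here is a sketch of that eigendecomposition idea in NumPy (the function names `cheb_T_scalar` and `cheb_T_matrix` are my own, not standard library functions):

```python
import numpy as np

def cheb_T_scalar(n, x):
    """Piecewise evaluation of T_n at a real scalar x, per the cases above."""
    if -1.0 <= x <= 1.0:
        return np.cos(n * np.arccos(x))
    elif x > 1.0:
        return np.cosh(n * np.arccosh(x))
    else:  # x < -1
        return (-1.0) ** n * np.cosh(n * np.arccosh(-x))

def cheb_T_matrix(n, A):
    """T_n(A) for symmetric A via A = U diag(w) U^T, then U T_n(diag(w)) U^T."""
    w, U = np.linalg.eigh(A)  # symmetric A: real eigenvalues, orthogonal U
    Tw = np.array([cheb_T_scalar(n, x) for x in w])
    return U @ np.diag(Tw) @ U.T
```

As a sanity check, for $A = \begin{pmatrix}2&1\\1&2\end{pmatrix}$ this agrees with $T_2(A) = 2A^2 - I$.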

2 Answers


The key is that if $B = P A P^{-1}$, then $B^m = P A^m P^{-1}$.

For instance, $T_4(x) = 8x^4-8x^2+1$. Hence, with $A = P \Lambda P^{-1}$, $T_4(A) = 8A^4-8A^2 + I = 8(P \Lambda^4 P^{-1})-8(P \Lambda^2 P^{-1}) + (P P^{-1}) = P\, T_4(\Lambda)\, P^{-1}$.

EDIT: The same holds true if you define $T_k(x)$ through a trigonometric approach, since the same argument applies to infinite series as well; for instance, $e^{A} = P e^{\Lambda} P^{-1}$. You are better off using the recurrence $T_{k+1}(\Lambda) = 2 \Lambda T_k(\Lambda) - T_{k-1}(\Lambda)$ and then computing $T_k(A)$ as $P T_k(\Lambda) P^{-1}$.

  • Better off in a computational-efficiency and accuracy sense: if you are interested in $T_k(\Lambda)$ at each $k$ and $\Lambda \in \mathbb{R}^{N \times N}$ is diagonal, then the cost of constructing $T_k(\Lambda)$ is just $\mathcal{O}(N)$ at each step $k$. – 2012-11-06
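The recurrence can also be applied directly to $A$, with no diagonalisation at all. A minimal NumPy sketch (my own code, illustrating the answer's recurrence):

```python
import numpy as np

def cheb_T_recurrence(n, A):
    """T_n(A) via T_{k+1} = 2 A T_k - T_{k-1}, with T_0 = I, T_1 = A."""
    N = A.shape[0]
    Tprev = np.eye(N)        # T_0
    if n == 0:
        return Tprev
    Tcur = A.copy()          # T_1
    for _ in range(n - 1):
        Tprev, Tcur = Tcur, 2.0 * A @ Tcur - Tprev
    return Tcur
```

Each step costs one matrix multiplication, $\mathcal{O}(N^3)$ for dense $A$, which is why working with the diagonal $\Lambda$ (cost $\mathcal{O}(N)$ per step) and conjugating back at the end is cheaper when the eigendecomposition is available.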

I would suggest using the generating function $\sum_{n=0}^\infty T_n(x) t^n = \dfrac{1-tx}{1-2tx+t^2}$, which for a matrix becomes $ \sum_{n=0}^\infty T_n(A) t^n = (I - t A) ((1 + t^2)I - 2 t A)^{-1}$ and so $T_n(A)$ is the coefficient of $t^n$ in the Maclaurin expansion of the right side.
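A quick scalar sanity check of this generating function (my own sketch): the Maclaurin coefficients of $(1-tx)/(1-2tx+t^2)$ in $t$ can be extracted by power-series division, and should reproduce $T_n(x)$.

```python
def chebyshev_coeffs_from_gf(x, nmax):
    """Maclaurin coefficients in t of (1 - t*x) / (1 - 2*t*x + t^2),
    computed by recursive power-series division; c[n] should equal T_n(x)."""
    num = [1.0, -x]            # 1 - t x
    den = [1.0, -2.0 * x, 1.0] # 1 - 2 t x + t^2
    c = []
    for n in range(nmax + 1):
        s = num[n] if n < len(num) else 0.0
        for k in range(1, n + 1):
            dk = den[k] if k < len(den) else 0.0
            s -= dk * c[n - k]
        c.append(s / den[0])
    return c
```

For $x = 1/2$ this yields $1, \tfrac12, -\tfrac12, -1, \dots$, matching $T_0, T_1, T_2, T_3$ at that point.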

EDIT: Maybe better: $ \pmatrix{T_{n+1}(A)\cr T_n(A)} = \pmatrix{2A & -I\cr I & 0\cr}^n \pmatrix{A\cr I\cr} $

  • From the recurrence relation $T_{n+1} (A) = 2 A T_n(A) - T_{n-1}(A)$ with $T_0(A) = I$, $T_1(A) = A$. – 2012-11-06
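The block-matrix form can be checked numerically; here is a sketch in NumPy (`cheb_T_blockpower` is my own name), taking the bottom block of $M^n \binom{A}{I}$ as $T_n(A)$:

```python
import numpy as np

def cheb_T_blockpower(n, A):
    """T_n(A) from [[T_{n+1}],[T_n]] = M^n [[A],[I]], M = [[2A, -I],[I, 0]]."""
    N = A.shape[0]
    I = np.eye(N)
    M = np.block([[2.0 * A, -I],
                  [I, np.zeros((N, N))]])
    v = np.vstack([A, I])
    r = np.linalg.matrix_power(M, n) @ v
    return r[N:, :]  # bottom N-by-N block is T_n(A)
```

This costs one $2N \times 2N$ matrix power, so it is mainly of interest when repeated squaring of $M$ (giving $T_n$ in $\mathcal{O}(\log n)$ multiplications) beats $n$ steps of the plain recurrence.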