
Given the tridiagonal symmetric infinite matrix of 0 and 1's

$ \left( \begin{matrix} 0&1&0&0&\ldots&0\\ 1&0&1&0&\ldots&0\\ 0&1&0&1&\ldots&0\\ \ldots&\ldots&\ldots&\ldots&\ldots&\ldots\\ 0&0&0&\ldots&0&1\\ 0&0&0&\ldots&1&0\\ \end{matrix}\right) $

How do you go about solving for the largest eigenvalues/eigenvectors? From a physical perspective, this is analogous to coupled harmonic oscillators along a linear chain, so I expect the eigenvectors to look like the fundamental modes of a string with both ends fixed (i.e. in the continuum limit with scaled coordinates they look something like $\sin(nx)$).

I can solve for any finite matrix, but I'd like to understand how to solve this problem when the number of rows $N \rightarrow \infty$.
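For the finite case the closed form is standard: the $N\times N$ version of this matrix has eigenvalues $\lambda_k = 2\cos\bigl(k\pi/(N+1)\bigr)$ with eigenvector entries $\sin\bigl(nk\pi/(N+1)\bigr)$, which matches the string-mode intuition above. A quick numerical sketch (using numpy; not part of the original question) checks this:

```python
import numpy as np

# Finite N x N tridiagonal matrix with 1's on the off-diagonals.
N = 50
A = np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)

# Compare numerical eigenvalues against the closed form 2*cos(k*pi/(N+1)).
evals = np.sort(np.linalg.eigvalsh(A))
k = np.arange(1, N + 1)
closed_form = np.sort(2 * np.cos(k * np.pi / (N + 1)))
assert np.allclose(evals, closed_form)

# The largest eigenvalue approaches 2 (but never reaches it) as N grows.
print(evals[-1])
```

As $N\to\infty$ the eigenvalues fill the interval $[-2,2]$ densely, which already hints at why the infinite matrix has a continuous spectrum rather than genuine eigenvalues.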

  • See http://qchu.wordpress.com/2009/06/07/the-catalan-numbers-regular-languages-and-orthogonal-polynomials/ (2012-02-27)

1 Answer


If $H$ is a Hilbert space with orthonormal basis $\{e_n\}_{n\ge1}$ and $S$ is the shift operator given by $Se_n=e_{n+1}$, then your matrix is the operator $S+S^*$. This operator has no eigenvalues: if $x=\sum_n\alpha_ne_n$ were an eigenvector with eigenvalue $\lambda$, you would have $ (S+S^*)x=\lambda x, $ which translates into $ \sum_n\lambda\alpha_ne_n=\lambda x=(S+S^*)x=\sum_{n=1}^\infty\alpha_ne_{n+1}\ + \ \sum_{n=2}^\infty\alpha_ne_{n-1}=\alpha_2e_1+\sum_{n=2}^\infty(\alpha_{n-1}+\alpha_{n+1})e_n. $ So the coefficients $\alpha_n$ have to satisfy the recursion $ \alpha_2=\lambda \alpha_1, \ \ \alpha_{n+1}=\lambda \alpha_n-\alpha_{n-1}. $ We can always assume $\alpha_1=1$ (if $\alpha_1=0$ the recursion forces $x=0$), and so we have $ \alpha_1=1,\ \alpha_2=\lambda,\ \alpha_3=\lambda^2-1,\ \alpha_4=\lambda^3-2\lambda, \ \ldots $ The recursion is solved by its characteristic equation $r^2=\lambda r-1$, whose two roots $r$ and $1/r$ satisfy $r+1/r=\lambda$; for $r\neq\pm1$ this gives $ \alpha_n=\frac{r^n-r^{-n}}{r-r^{-1}}. $ If $|r|\neq1$, one of $r^n$, $r^{-n}$ grows geometrically and $|\alpha_n|\to\infty$. If $|r|=1$, write $\lambda=2\cos\theta$ with $\theta$ real; then $\alpha_n=\sin(n\theta)/\sin\theta$, which oscillates with fixed amplitude. In the degenerate case $r=\pm1$ (i.e. $\lambda=\pm2$) one gets $\alpha_n=n(\pm1)^{n-1}$. In every case $\alpha_n\not\to0$, so no choice of $\lambda$ will make the sequence $\{\alpha_n\}$ lie in $\ell^2(\mathbb{N})$.
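The recursion above can be checked numerically. Assuming $\lambda=2\cos\theta$ with $\theta$ real (so $|\lambda|\le2$), a standard Chebyshev-type identity gives $\alpha_n=\sin(n\theta)/\sin\theta$; the sketch below (not part of the original answer) iterates the recursion and confirms the terms never decay:

```python
import numpy as np

# Iterate a_{n+1} = lam*a_n - a_{n-1} with a_1 = 1, a_2 = lam,
# for lam = 2*cos(theta), and compare with sin(n*theta)/sin(theta).
theta = 1.0
lam = 2 * np.cos(theta)
a = [1.0, lam]
for _ in range(2, 200):
    a.append(lam * a[-1] - a[-2])
a = np.array(a)

# The closed form matches the recursion term by term.
n = np.arange(1, 201)
assert np.allclose(a, np.sin(n * theta) / np.sin(theta))

# The amplitude never decays, so the sequence cannot be square-summable.
print(np.max(np.abs(a[-50:])))
```

For $|\lambda|>2$ the same iteration blows up geometrically, illustrating the other branch of the argument.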

(Since you are not putting your matrix in the context of operators on a Hilbert space, one could argue that the computation above does let you find eigenvectors without the $\ell^2$ restriction; but then every complex number would be an eigenvalue, and in particular you cannot expect to diagonalize your matrix.)

  • @QiaochuYuan: I guess, but I know nothing about rigged Hilbert spaces. Can you use them to find eigenvectors and eigenfunctions in the example from the OP? (2012-02-27)