
Given the infinite symmetric tridiagonal matrix of 0's and 1's

$$ \left( \begin{matrix} 0&1&0&0&\ldots&0\\ 1&0&1&0&\ldots&0\\ 0&1&0&1&\ldots&0\\ \ldots&\ldots&\ldots&\ldots&\ldots&\ldots\\ 0&0&0&\ldots&0&1\\ 0&0&0&\ldots&1&0\\ \end{matrix}\right) $$

How do you go about solving for the largest eigenvalues/eigenvectors? From a physical perspective, this is analogous to coupled harmonic oscillators along a linear chain, so I expect the eigenvectors to look like the fundamental modes of a string with both ends fixed (i.e. in the continuum limit with scaled coordinates they look something like $\sin(nx)$).

I can solve for any finite matrix, but I'd like to understand how to solve this problem when the number of rows $N \rightarrow \infty$.
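
For concreteness, here is a small numerical sketch (not part of the original question; it assumes numpy) checking that for finite $N$ the eigenvalues are $2\cos(k\pi/(N+1))$ and the eigenvectors sample $\sin(nk\pi/(N+1))$, the discrete analogue of the string modes mentioned above:

```python
# Sketch: compare numerical eigenpairs of the finite N x N tridiagonal 0/1
# matrix against the closed forms lambda_k = 2 cos(k*pi/(N+1)) and
# v_k[n] = sin(n*k*pi/(N+1)).
import numpy as np

N = 8
A = np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)

evals, evecs = np.linalg.eigh(A)              # eigenvalues in ascending order

k = np.arange(1, N + 1)
predicted = 2 * np.cos(k * np.pi / (N + 1))   # descending in k
assert np.allclose(np.sort(predicted), evals)

# Compare the top eigenvector (up to sign/normalization) with the k = 1 sine mode.
n = np.arange(1, N + 1)
v1 = np.sin(n * np.pi / (N + 1))
v1 /= np.linalg.norm(v1)
numeric = evecs[:, -1]                        # eigenvector of the largest eigenvalue
assert np.allclose(np.abs(numeric), np.abs(v1))
print("largest eigenvalue:", evals[-1], "= 2cos(pi/(N+1)) =", 2 * np.cos(np.pi / (N + 1)))
```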

  • They are in fact sines, and you can use this to explicitly write down both the eigenvectors and the eigenvalues for all $N$. (2012-02-27)
  • @QiaochuYuan as I suspected, but the question is _how_ do you determine this when faced with the problem for the first time? I'd like to learn the method so that I can solve other, more difficult problems of this type. (2012-02-27)
  • I don't know a general method. This question is quite special. (2012-02-27)
  • Well, I know a method for the eigenvalues, which is to write down a recurrence for the characteristic polynomial and compute the corresponding generating function or recognize the recurrence (it turns out to be a modified form of the Chebyshev polynomials of some kind or other; see the sketch after these comments). But I don't know a good reason to expect the eigenvectors to be as nice as they are, other than the continuum limit or a direct computation. (2012-02-27)
  • @QiaochuYuan do you mind expanding that into an answer? I think it might help my understanding of the problem to see this "special" matrix worked out via recurrence relations. (2012-02-27)
  • See http://qchu.wordpress.com/2009/06/07/the-catalan-numbers-regular-languages-and-orthogonal-polynomials/ . (2012-02-27)
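
A numerical sketch of the recurrence approach from the comments (an illustration, not from the original thread; it assumes numpy). Expanding $\det(\lambda I - A_N)$ along the last row gives $p_N(\lambda)=\lambda\,p_{N-1}(\lambda)-p_{N-2}(\lambda)$ with $p_0=1$, $p_1=\lambda$, which is the recurrence for the Chebyshev polynomials of the second kind evaluated at $\lambda/2$:

```python
# Sketch: verify p_N = x*p_{N-1} - p_{N-2} reproduces det(x*I - A_N), and that
# its roots are 2*cos(k*pi/(N+1)) -- the zeros of U_N(x/2), Chebyshev 2nd kind.
import numpy as np

def char_poly_coeffs(N):
    """Coefficients of det(x*I - A_N), highest degree first, via numpy."""
    A = np.diag(np.ones(N - 1), 1) + np.diag(np.ones(N - 1), -1)
    return np.poly(A)

for N in range(1, 8):
    # Coefficient arrays, highest degree first: p_0 = 1, p_1 = x.
    p_prev, p = np.array([1.0]), np.array([1.0, 0.0])
    for _ in range(N - 1):
        p_prev, p = p, np.polysub(np.polymul([1.0, 0.0], p), p_prev)
    assert np.allclose(p, char_poly_coeffs(N))
    roots = np.sort(np.roots(p).real)
    assert np.allclose(roots, np.sort(2 * np.cos(np.arange(1, N + 1) * np.pi / (N + 1))))
print("recurrence matches det(x*I - A_N) for N = 1..7")
```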

1 Answer


If $H$ is a Hilbert space with orthonormal basis $\{e_n\}$ and $S$ is the shift operator given by $Se_n=e_{n+1}$, then your matrix is the operator $S+S^*$. This operator has no eigenvalues: if $x=\sum_n\alpha_ne_n$ were an eigenvector with eigenvalue $\lambda$, you would have $$ (S+S^*)x=\lambda x, $$ which translates into $$ \sum_{n=1}^\infty\lambda\alpha_ne_n=\lambda x=(S+S^*)x=\sum_{n=1}^\infty\alpha_ne_{n+1}\ + \ \sum_{n=2}^\infty\alpha_ne_{n-1}=\alpha_2e_1+\sum_{n=2}^\infty(\alpha_{n-1}+\alpha_{n+1})e_n. $$

So the coefficients $\alpha_n$ have to satisfy the recursion $$ \alpha_2=\lambda \alpha_1, \qquad \alpha_{n+1}=\lambda \alpha_n-\alpha_{n-1}. $$ We can always assume $\alpha_1=1$, and so $$ \alpha_1=1,\ \alpha_2=\lambda,\ \alpha_3=\lambda^2-1,\ \alpha_4=\lambda^3-2\lambda, \ \ldots $$ This is a linear recurrence with characteristic equation $r^2-\lambda r+1=0$; if $r_1,r_2$ are its roots (so $r_1r_2=1$), then $$ \alpha_n=\frac{r_1^n-r_2^n}{r_1-r_2} \qquad (r_1\neq r_2), $$ which for $\lambda=2\cos\theta$ reads $\alpha_n=\sin(n\theta)/\sin\theta$; up to the substitution $\lambda=2x$, these are the Chebyshev polynomials of the second kind.

Since $r_1r_2=1$, at least one of the roots has modulus $\geq1$, so no choice of $\lambda$ makes $\alpha_n\to0$ (in the degenerate case $r_1=r_2=\pm1$ we get $\alpha_n=nr_1^{n-1}$, which does not decay either). In particular, the sequence $\{\alpha_n\}$ never lies in $\ell^2(\mathbb{N})$.

(Since you are not putting your matrix in the context of operators on Hilbert spaces, one could argue that the above computation actually allows you to find eigenvectors without the $\ell^2$ restriction; but then any complex number would be an eigenvalue, and in particular you could not expect to diagonalize your matrix.)
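
As a quick check on the recursion in this answer (a sketch added for illustration, assuming numpy): the closed form $\alpha_n=\sin(n\theta)/\sin\theta$ for $\lambda=2\cos\theta$ is bounded but never decays, and for $|\lambda|>2$ the sequence blows up:

```python
# Sketch: iterate alpha_{n+1} = lambda*alpha_n - alpha_{n-1} with
# alpha_1 = 1, alpha_2 = lambda, and compare with sin(n*theta)/sin(theta).
import numpy as np

def alphas(lam, count):
    a = [1.0, lam]
    for _ in range(count - 2):
        a.append(lam * a[-1] - a[-2])
    return np.array(a)

theta = 0.7
lam = 2 * np.cos(theta)
n = np.arange(1, 51)
closed_form = np.sin(n * theta) / np.sin(theta)
assert np.allclose(alphas(lam, 50), closed_form)   # bounded, oscillating, never -> 0

print("tail for |lambda| <= 2:", np.abs(alphas(lam, 50))[-5:])
print("tail for lambda = 2.5 :", np.abs(alphas(2.5, 50))[-5:])  # exponential growth
```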

  • I'm not sure I completely understand the take-home message from this answer. Are you saying that, in the limit of infinite $N$, there are no possible eigenpairs (or that every $\lambda$ gives one)? For any finite matrix of this type they clearly exist; they approach a sine wave, as Qiaochu and I have mentioned, and the eigenvalues can be found explicitly. Does this have something to do with the convergence of the sequence of eigenvalues? (2012-02-27)
  • @Hooked: Martin is only claiming that the infinite matrix has no eigenvectors whose entries are square-summable. As his computation shows, there are plenty of eigenvectors if you relax this requirement, even plenty of eigenvectors whose entries are bounded (see the sketch after these comments). (2012-02-27)
  • @Hooked: in finite dimensions, you have the result (via the Spectral Theorem) that a Hermitian operator can be "diagonalized", i.e. written as a sum of scalar multiples of the projections onto its eigenspaces (or, if you prefer, that you can find an orthonormal basis made of eigenvectors). In infinite dimensions, that result is not true. (2012-02-27)
  • @Martin: well, in some sense that's just the wrong generalization; you need to allow "distributional eigenvectors." The correct generalization can be found at http://en.wikipedia.org/wiki/Spectral_theorem#Bounded_self-adjoint_operators . (2012-02-27)
  • @QiaochuYuan: I'm not sure what you mean by "distributional eigenvectors". As an operator algebraist, I'm well versed in the Spectral Theorem, spectral measures, etc. In infinite dimensions you can certainly talk about the spectrum (it is indeed a key concept), but there is no obvious or meaningful way to attach a vector (an "eigenvector") to an arbitrary element of the spectrum. The best you can say is that any normal operator $T$ on a Hilbert space can be written as $\int_{\sigma(T)} \lambda\, dE(\lambda)$, where $E$ is a projection-valued measure on the spectrum $\sigma(T)$ of $T$. (2012-02-27)
  • @Martin: for a multiplication operator $T_f(g) = fg$ acting on, say, $L^2[0, 1]$, you can think of the Dirac delta function supported at a point $x \in [0, 1]$ as an eigenvector with eigenvalue $f(x)$. (2012-02-27)
  • @QiaochuYuan: those are murky waters, in my opinion. For starters, $\delta_x$ is not in $L^2[0,1]$. Regarding the actual question here, the operator $S+S^*$ can actually be seen as the Toeplitz operator with symbol $z+\bar{z}$, so it is in some sense a multiplication operator. But it cannot be expressed as a diagonal matrix, and I would be surprised if there were an explicit expression for its spectral measure. (2012-02-27)
  • @Martin: I don't see the point of staying in $L^2[0, 1]$ if it doesn't provide the eigenvectors I need. Isn't this precisely what the machinery of rigged Hilbert spaces was invented to accomplish? (2012-02-27)
  • @QiaochuYuan: I suppose, but I know nothing about rigged Hilbert spaces. Can you use them to find eigenvectors and eigenfunctions in the example from the OP? (2012-02-27)
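
To illustrate the point about bounded, non-$\ell^2$ eigenvectors made in these comments (a sketch under the assumption $x_n=\sin(n\theta)$, verified on a finite truncation with numpy; this construction is not spelled out in the thread):

```python
# Sketch: for any theta, the bounded sequence x_n = sin(n*theta) satisfies
# (S + S*)x = 2*cos(theta)*x componentwise, even though x is not in l^2(N).
import numpy as np

theta = 1.1
lam = 2 * np.cos(theta)
n = np.arange(1, 1001)
x = np.sin(n * theta)

# Apply the infinite tridiagonal matrix componentwise:
# (Ax)_1 = x_2, and (Ax)_k = x_{k-1} + x_{k+1} for k >= 2.
Ax = np.empty_like(x)
Ax[0] = x[1]
Ax[1:-1] = x[:-2] + x[2:]
# The last component would need x_{1001}, so check only the first 999 entries.
assert np.allclose(Ax[:-1], lam * x[:-1])
print("sup |x_n| =", np.abs(x).max(), "(bounded), but sum |x_n|^2 diverges")
```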