
The principal eigenvector of the adjacency matrix of a graph gives us some notion of vertex centrality.

What do the second, third, etc. eigenvectors tell us?
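To make the centrality notion concrete, here is a minimal sketch (the star graph is a hypothetical toy example, not from the question) of eigenvector centrality computed as the principal eigenvector of an adjacency matrix:

```python
import numpy as np

# Hypothetical toy example: adjacency matrix of an undirected "star"
# graph where vertex 0 is connected to vertices 1, 2, and 3.
A = np.array([
    [0, 1, 1, 1],
    [1, 0, 0, 0],
    [1, 0, 0, 0],
    [1, 0, 0, 0],
], dtype=float)

# A is symmetric, so eigh gives real eigenvalues in ascending order.
vals, vecs = np.linalg.eigh(A)

# Principal eigenvector = eigenvector of the largest eigenvalue.
# By Perron-Frobenius it can be chosen entrywise nonnegative.
principal = np.abs(vecs[:, np.argmax(vals)])
```

As expected, the hub vertex 0 receives the largest centrality score, since its entry in the principal eigenvector dominates the leaves'.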

Motivation: A standard information retrieval technique (LSI) uses a truncated SVD as a low-rank approximation of a matrix. If we truncate to rank 1, then we essentially have a PageRank algorithm. I was wondering if there are ways of interpreting the corrections introduced by higher eigenvectors.

Something similar to what the moments of a distribution tell us (e.g. the first moment gives the mean, the second central moment the variance, the third the skewness, etc.).
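The truncation mentioned above can be sketched with a small example (the 4×3 matrix below is a hypothetical stand-in for a term-document matrix, not from the question). It also checks the Eckart–Young fact that the rank-1 truncation error, in spectral norm, equals the second singular value, which is one way to quantify the "correction" the higher components carry:

```python
import numpy as np

# Hypothetical toy term-document matrix (rows: terms, cols: documents).
A = np.array([
    [3.0, 0.0, 1.0],
    [2.0, 0.0, 0.0],
    [0.0, 4.0, 2.0],
    [0.0, 1.0, 3.0],
])

# Thin SVD: A = U @ diag(s) @ Vt, singular values in descending order.
U, s, Vt = np.linalg.svd(A, full_matrices=False)

# Rank-1 truncation: keep only the leading singular triplet.
A1 = s[0] * np.outer(U[:, 0], Vt[0, :])

# Eckart-Young: the best rank-1 approximation error in spectral norm
# is exactly the second singular value.
err = np.linalg.norm(A - A1, ord=2)
```

Here `err` coincides with `s[1]`, so the discarded singular triplets measure precisely what the rank-1 (PageRank-like) picture misses.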

  • 8
    The eigenvector corresponding to the second-smallest eigenvalue can be used to partition the graph into clusters, via [spectral clustering](http://en.wikipedia.org/wiki/Spectral_clustering) (see also [eigenvalues and eigenvectors](http://en.wikipedia.org/wiki/Eigenvalues_and_eigenvectors) on Wikipedia). It would be interesting to see the usage of all the different eigenvectors. – 2012-03-06
  • 6
    My note ---"Spectral Realizations of Graphs" ( http://daylateanddollarshort.com/math/pdfs/spectral.pdf )--- explains that *a matrix whose rows form an orthogonal basis of an eigenspace of a graph's adjacency matrix has columns that serve as the coordinate vectors of a geometric realization for which automorphisms are induced by rigid isometries*. That is, eigenvectors construct visual models of a graph's automorphic structure. The exact contribution a particular eigen-model makes to the understanding of that structure isn't necessarily obvious, but the pictures can be pretty cool. :) – 2012-03-09
  • 1
    Why do you ask about the spectrum of the adjacency matrix as opposed to that of the signed and/or unsigned graph Laplacian matrix, which are the typical objects of study in spectral graph theory? E.g., http://www.math.ucsd.edu/~fan/research/cb/ch1.pdf, http://buzzard.ups.edu/bookreview/eigenspaces-graphs-beezer-review.pdf – 2012-07-17
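The spectral-clustering use of the second eigenvector mentioned in the comments can be sketched as follows (the two-triangle graph is a hypothetical toy example; the partition comes from the sign pattern of the Fiedler vector of the unnormalized graph Laplacian):

```python
import numpy as np

# Hypothetical toy graph: two triangles {0,1,2} and {3,4,5}
# joined by the single bridge edge (2,3).
edges = [(0, 1), (0, 2), (1, 2), (3, 4), (3, 5), (4, 5), (2, 3)]
n = 6
A = np.zeros((n, n))
for i, j in edges:
    A[i, j] = A[j, i] = 1.0

# Unnormalized graph Laplacian L = D - A.
L = np.diag(A.sum(axis=1)) - A

# eigh returns eigenvalues in ascending order; the Fiedler vector is
# the eigenvector of the second-smallest eigenvalue.
vals, vecs = np.linalg.eigh(L)
fiedler = vecs[:, 1]

# Thresholding the Fiedler vector at zero bipartitions the graph,
# cutting the bridge edge between the two triangles.
labels = fiedler > 0
```

The sign of an eigenvector is arbitrary, so which triangle gets `True` is not determined, but the grouping itself is: the two triangles land on opposite sides of the cut.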

1 Answer