4

For a Markov chain

the Cheeger bound is a bound on the second-largest eigenvalue of the transition matrix of a finite-state, discrete-time, reversible Markov chain. It can be seen as a special case of the Cheeger inequalities in expander graphs.

Let $X$ be a finite set and let $K(x,y)$ be the transition probability for a reversible Markov chain on $X$. Assume this chain has stationary distribution $\pi$, and let $\lambda_2$ denote the second-largest eigenvalue of $K$. Then $$ 1 - 2 \Phi \leq \lambda_2 \leq 1 - \frac{\Phi^2}{2}, $$

where $\Phi$ is the edge expansion of the graph for the random walk corresponding to the MC. Note that a homogeneous DTMC can be seen as a random walk on a directed graph with its vertex set being the state space of the MC.
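This bound is easy to check numerically on a small chain. Below is a sketch (the 3-state weighted graph is a made-up example, not from the articles): it computes $\lambda_2$ via the symmetrization of $K$, brute-forces $\Phi$ over all proper subsets, and prints the sandwich.

```python
import itertools
import numpy as np

# A small reversible chain: random walk on a weighted graph with
# self-loops (weights are arbitrary, chosen for illustration).
W = np.array([[2.0, 2.0, 1.0],
              [2.0, 2.0, 3.0],
              [1.0, 3.0, 2.0]])
d = W.sum(axis=1)
K = W / d[:, None]              # transition matrix, reversible w.r.t. pi
pi = d / d.sum()                # stationary distribution
assert np.allclose(pi @ K, pi)  # stationarity check

# lambda_2 via the symmetrization D^{1/2} K D^{-1/2} (same spectrum as K).
S = np.diag(np.sqrt(pi)) @ K @ np.diag(1.0 / np.sqrt(pi))
lam2 = np.sort(np.linalg.eigvalsh(S))[-2]

# Edge expansion (conductance): Phi = min over proper non-empty subsets A
# of Q(A, A^c) / min(pi(A), pi(A^c)), where Q(x, y) = pi(x) K(x, y).
n = len(pi)
Q = pi[:, None] * K

def conductance(A):
    Ac = [v for v in range(n) if v not in A]
    vol = pi[list(A)].sum()
    return Q[np.ix_(list(A), Ac)].sum() / min(vol, 1.0 - vol)

Phi = min(conductance(A)
          for r in range(1, n)
          for A in itertools.combinations(range(n), r))

print(1 - 2 * Phi, "<=", lam2, "<=", 1 - Phi**2 / 2)
```

Brute-forcing all $2^{|X|}$ subsets is only feasible for tiny chains, but it removes any doubt about which quantity $\Phi$ denotes.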

For a graph

When $G$ is d-regular, there is a relationship between edge expansion $h(G)$ and the spectral gap $d - \lambda_2$ of $G$. An inequality due to Tanner and independently Alon and Milman[7] states that $$ \frac{1}{2}(d - \lambda_2) \le h(G) \le \sqrt{2d(d - \lambda_2)}\,.$$
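The $d$-regular version can likewise be verified on a toy graph. A minimal sketch on the 6-cycle $C_6$ (a 2-regular graph, chosen only for illustration), computing $\lambda_2$ of the adjacency matrix and $h(G)$ by brute force:

```python
import itertools
import numpy as np

n, deg = 6, 2
A = np.zeros((n, n))
for i in range(n):                         # 2-regular cycle C_6
    A[i, (i + 1) % n] = A[(i + 1) % n, i] = 1.0

lam2 = np.sort(np.linalg.eigvalsh(A))[-2]  # second largest eigenvalue

# h(G) = min over non-empty S with |S| <= n/2 of |boundary(S)| / |S|.
def expansion(S):
    Sc = [v for v in range(n) if v not in S]
    return A[np.ix_(list(S), Sc)].sum() / len(S)

h = min(expansion(S) for r in range(1, n // 2 + 1)
        for S in itertools.combinations(range(n), r))

print((deg - lam2) / 2, "<=", h, "<=", np.sqrt(2 * deg * (deg - lam2)))
```

For $C_6$ one gets $\lambda_2 = 2\cos(\pi/3) = 1$ and $h = 2/3$ (an arc of 3 vertices has boundary 2), so the inequality reads $1/2 \le 2/3 \le 2$.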

I cannot understand why the MC version is a special case of the graph version. Specifically, how can a reversible finite-state MC be seen as a regular graph? In particular, the stationary/reversible/limiting distribution of a reversible finite-state MC is not necessarily uniform.

I suspect the articles are missing something. Does anyone happen to know a more general version of Cheeger's inequality that has the two versions here as special cases?

Thanks and regards!

  • 0
Note that nowhere do they say that *a reversible finite-state MC can be seen as a regular graph*; this one is of your own invention. (2012-12-14)

1 Answer

3

Both versions of the Cheeger inequality follow from the version for weighted graphs proved by Chung (see e.g. [1]).

Let $G$ be an undirected graph in which edge $(u,v)$ has non-negative weight $\pi(u,v)$ (if there is no edge between $u$ and $v$, then $\pi(u,v) = 0$). Let the degree $d_v$ of vertex $v$ be the total weight of all edges incident to $v$. Define the normalized Laplacian matrix ${\cal L}$ entrywise: $${\cal L}(u,v) = \begin{cases} 1 - \frac{\pi(u,u)}{d_u}, &\text{if } u = v,\\ -\frac{\pi(u,v)}{\sqrt{d_u d_v}}, & \text{if } u\neq v. \end{cases}$$

Let $\lambda_1$ be the second smallest eigenvalue of $\cal L$ and $$h = \min_{X\subset V} \frac{\sum_{u\in X, v\notin X}\pi(u,v)}{\min\left(\sum_{u\in X} d_u,\ \sum_{u\notin X} d_u\right)},$$ where the minimum is over $X\neq \varnothing$ and $X\neq V$. Then $$\frac{h^2}{2} \leq \lambda_1 \leq 2h.$$
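This weighted statement can be sanity-checked numerically. A sketch on a small made-up weighted graph (the weights below are arbitrary, for illustration only): it builds the normalized Laplacian, takes its second smallest eigenvalue, and brute-forces $h$ over all proper subsets.

```python
import itertools
import numpy as np

# Symmetric non-negative weights pi(u,v); zero entries mean "no edge".
W = np.array([[0.0, 1.0, 0.5, 0.0],
              [1.0, 0.0, 2.0, 0.1],
              [0.5, 2.0, 0.0, 1.5],
              [0.0, 0.1, 1.5, 0.0]])
n = len(W)
d = W.sum(axis=1)                         # weighted degrees

# Normalized Laplacian L = I - D^{-1/2} W D^{-1/2} (no self-loops here,
# so the diagonal is exactly 1, matching the definition above).
Dh = np.diag(1.0 / np.sqrt(d))
L = np.eye(n) - Dh @ W @ Dh
lam1 = np.sort(np.linalg.eigvalsh(L))[1]  # second smallest eigenvalue

# Brute-force h over all proper non-empty subsets X of the vertices.
def ratio(X):
    Xc = [v for v in range(n) if v not in X]
    cut = W[np.ix_(list(X), Xc)].sum()
    return cut / min(d[list(X)].sum(), d[Xc].sum())

h = min(ratio(X) for r in range(1, n)
        for X in itertools.combinations(range(n), r))

print(h**2 / 2, "<=", lam1, "<=", 2 * h)
```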

The Cheeger inequality for $d$-regular graphs follows immediately if we let $\pi(u,v) = 1$ when $(u,v) \in E$ and $\pi(u,v) = 0$ otherwise.

The Cheeger inequality for reversible MCs follows if we let $\pi(u,v) = p(u)\cdot K(u,v)$, where $p(u)$ is the stationary distribution and $K(u,v)$ is the transition probability. Note that reversibility, $p(u)K(u,v) = p(v)K(v,u)$, is exactly what makes these weights symmetric, as the weighted-graph version requires.
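This substitution can also be checked numerically: with $\pi(u,v) = p(u)K(u,v)$ the weighted degree of $u$ is $p(u)$ (since each row of $K$ sums to 1), and Chung's normalized Laplacian becomes $I$ minus the symmetrized transition matrix, so $\lambda_1 = 1 - \lambda_2(K)$ and the two inequalities coincide. A sketch (the 4-state chain below is a made-up example):

```python
import numpy as np

# A made-up reversible 4-state chain: random walk on a weighted graph.
Wadj = np.array([[1.0, 2.0, 0.0, 1.0],
                 [2.0, 0.0, 3.0, 1.0],
                 [0.0, 3.0, 2.0, 1.0],
                 [1.0, 1.0, 1.0, 0.0]])
deg = Wadj.sum(axis=1)
K = Wadj / deg[:, None]
p = deg / deg.sum()                 # stationary distribution

# Chung's graph weights: pi(u,v) = p(u) K(u,v).
W = p[:, None] * K
assert np.allclose(W, W.T)          # reversibility <=> symmetric weights

d = W.sum(axis=1)                   # weighted degrees: d_u = p(u)
assert np.allclose(d, p)

# Chung's normalized Laplacian for these weights.
Dh = np.diag(1.0 / np.sqrt(d))
L = np.eye(len(p)) - Dh @ W @ Dh
lam1 = np.sort(np.linalg.eigvalsh(L))[1]    # second smallest

# Second largest eigenvalue of K (via symmetrization, same spectrum).
S = np.diag(np.sqrt(p)) @ K @ np.diag(1.0 / np.sqrt(p))
lam2 = np.sort(np.linalg.eigvalsh(S))[-2]

assert np.isclose(lam1, 1 - lam2)   # hence h^2/2 <= 1 - lam2 <= 2h
```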

[1] F. R. K. Chung. Laplacians of graphs and Cheeger inequalities. Available at: http://www.math.ucsd.edu/~fan/wp/cheeger.pdf

  • 0
+1 Thanks! The articles I referred to are the two linked Wikipedia articles. Is your reference "Chung, Fan R. K. (1997), Spectral Graph Theory"? (2012-12-14)
  • 0
I see :). I updated the answer. (2012-12-14)
  • 0
@Tim, Chung has several very nice articles and surveys on graph Laplacians and Cheeger inequalities. I added a reference to one of these articles that has this result. (2012-12-14)
  • 0
Thanks! Chung's general version is in terms of eigenvalues of the Laplacian matrix, while the two special versions in my post are in terms of eigenvalues of the adjacency matrix. So I wonder how the eigenvalues of the Laplacian matrix and of the adjacency matrix of the same graph are related? (2012-12-14)
  • 0
@Tim: We can define an “adjacency matrix” as $A = I -{\cal L}$ (where $\cal L$ is as in my post); then $\lambda_1 = 1 - \mu$, where $\mu$ is the second largest eigenvalue of $A$. That is, $\frac{h^2}{2} \leq 1 - \mu \leq 2h$. You use this matrix $A$ in your examples. It's not directly related to the “standard adjacency matrix”. (Note that it's not even clear what the *largest* eigenvalue of the standard adjacency matrix is.) (2012-12-14)
  • 0
(1) "You use this matrix A in your examples. It's not directly related to the 'standard adjacency matrix'". Are the adjacency matrix and the 'standard adjacency matrix' the same concept? What does "it's not directly related to the 'standard adjacency matrix'" mean? (2) Consider some special adjacency matrices: the largest eigenvalue for a stochastic matrix is 1, and the largest eigenvalue for a k-regular graph is k. By "the largest eigenvalue of the standard adjacency matrix is", do you mean the eigenvalues of a general adjacency matrix may not be comparable (i.e. complex numbers)? (2012-12-15)
  • 0
@Tim: (1) I defined a matrix $A$ s.t. it is possible to write Cheeger's inequality in terms of the spectral gap for $A$. In some special cases this matrix coincides with the (standard) adjacency matrix, but in general it is different. (2) If the graph $G$ is $d$-regular, the largest eigenvalue of the (standard) adjacency matrix is $d$; however, if $G$ is not regular, the largest eigenvalue of the adjacency matrix doesn't have a simple combinatorial meaning. Eigenvalues of the adjacency matrix are always real numbers since it is symmetric. (2012-12-15)