
I am very curious about the proof of the following fact about algebraic connectivity.

Algebraic connectivity:

The algebraic connectivity of a graph $G$ is the second-smallest eigenvalue of the Laplacian matrix of $G$. This eigenvalue is greater than $0$ if and only if $G$ is a connected graph. This is a corollary of the fact that the number of times $0$ appears as an eigenvalue of the Laplacian is the number of connected components of the graph.

For details: Algebraic connectivity on Wikipedia.

I find this claim very interesting: how exactly can the second-smallest eigenvalue be a sign of the connectivity of the graph?

The following fact is no less interesting:

Denote the eigenvalues by $\lambda_1 \leq \lambda_2 \leq \cdots \leq \lambda_n$; then $\lambda_1 = 0$. This can be proved by using the fact that the Laplacian matrix is positive semidefinite, which implies that it has nonnegative eigenvalues, and by showing that $0$ is an eigenvalue of this matrix corresponding to the vector $\langle 1, 1, \cdots, 1 \rangle$.
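To make these two facts concrete, here is a minimal numerical sketch (my own illustration, not part of the original statement); the path graph on four vertices and all variable names below are assumptions chosen just for the example.

```python
# Minimal numerical sketch (illustrative only): the Laplacian of a small
# connected graph, its sorted eigenvalues, and the all-ones eigenvector for 0.
import numpy as np

# Path graph on 4 vertices: edges (0,1), (1,2), (2,3) -- a connected graph.
edges = [(0, 1), (1, 2), (2, 3)]
n = 4

# Laplacian L = D - A (degree matrix minus adjacency matrix).
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1
    L[j, j] += 1
    L[i, j] -= 1
    L[j, i] -= 1

# Eigenvalues of the symmetric matrix L, sorted ascending:
# lambda_1 <= lambda_2 <= ... <= lambda_n.
eigvals = np.linalg.eigvalsh(L)
print(np.round(eigvals, 6))      # lambda_1 = 0 and lambda_2 > 0 (connected)

# The all-ones vector is an eigenvector for the eigenvalue 0: L @ 1 = 0.
ones = np.ones(n)
print(np.allclose(L @ ones, 0))  # True
```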

So far I have not worked out the proof of the algebraic connectivity fact. If you have a link or a proof, I would appreciate your posting it.

Thanks!

  • I have added another answer to this question. Any comment? 2016-09-10

2 Answers


Let $G = (V, E)$ be a finite graph and let $\Delta$ denote its Laplacian (here we take the convention that the Laplacian is positive semidefinite). For simplicity, name the vertices $1, 2, \ldots, n$. Then $\Delta$ is the matrix associated to the quadratic form $q(x_1, \ldots, x_n) = \sum_{(i, j) \in E} (x_i - x_j)^2,$

which one might call the Dirichlet energy. Recall that, after an appropriate change of variables, such a quadratic form can be written as $q(z_1, \ldots, z_n) = \sum_i \lambda_i z_i^2,$

where the $\lambda_i$ are the eigenvalues of $\Delta$. Thus, to determine the multiplicity of the zero eigenvalue, it suffices to determine when $q$ can be zero. But from the first expression it should be clear that $q = 0$ if and only if $x_i = x_j$ whenever $(i, j) \in E$; that is, whenever $(x_1, \ldots, x_n)$ determines a function $x : V \to \mathbb{R}$

which is constant on each connected component of $G$. The dimension of this space of functions is clearly the number of connected components of $G$, and the conclusion follows.

  • @QiaochuYuan Why "The dimension of this space of functions is clearly the number of connected components of G"? 2017-11-10

Second proof

The proof given above has the weakness that it does not make explicit which eigenvalue it is dealing with. That is, the quoted fact "from the first expression it should be clear that $q = 0$ if and only if $x_i = x_j$ whenever $(i, j) \in E$; that is, whenever $(x_1, \ldots, x_n)$ determines a function $x : V \to \mathbb{R}$" simply means that the eigenvector of the second eigenvalue is $c \left( \begin{array}{c} 1 \\ 1 \\ \vdots \\ 1 \end{array} \right)$, where $c = x_i$ for all $i$.

To take care of this problem, we suggest the following proof instead.

Suppose $\lambda_2 = 0$. Then, since $\lambda_1 = 0$, the multiplicity of $0$ as an eigenvalue of $L$ (the Laplacian matrix of $G$) is at least $2$. (We say "at least two" to ensure no loss of generality.)

By the inner-product-of-rows lemma, $L = DD^T$, where $D$ is the incidence matrix of an orientation of $G$.
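A quick numerical check of this factorization may help; the orientation chosen below is an arbitrary assumption (any orientation of $G$ gives the same product), and the small two-component graph is just an example.

```python
# Check of the lemma L = D D^T for one oriented incidence matrix D
# (illustrative graph and orientation; any orientation works).
import numpy as np

edges = [(0, 1), (1, 2), (0, 2), (3, 4)]  # small example graph
n = 5

# Oriented incidence matrix: column e has +1 at the tail and -1 at the head.
D = np.zeros((n, len(edges)))
for e, (i, j) in enumerate(edges):
    D[i, e] = 1.0
    D[j, e] = -1.0

# Laplacian built directly as degree matrix minus adjacency matrix.
L = np.zeros((n, n))
for i, j in edges:
    L[i, i] += 1; L[j, j] += 1
    L[i, j] -= 1; L[j, i] -= 1

print(np.allclose(D @ D.T, L))  # True: the Laplacian factors as D D^T
```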

If $0$ has multiplicity at least $2$, then $L = DD^T$ has rank at most $n-2$.

That is, $n-2 \geq \operatorname{rank}(DD^T) = \operatorname{rank}(D) = n - c_0$, where $c_0$ is the number of connected components of $G$.

This means that $n - c_0 \leq n - 2 \Leftrightarrow c_0 \geq 2$.

That is, $G$ has at least $2$ components, i.e., it is disconnected. The same approach also works for the converse and for the corollary.
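Here is a small sketch of the rank argument on a concrete disconnected graph (the graph, the orientation, and the names $n$, $c_0$ are assumptions chosen for the illustration): the rank of $D$ is $n - c_0$, so $0$ is an eigenvalue of $L = DD^T$ with multiplicity $c_0$, and $\lambda_2 = 0$ exactly when $G$ is disconnected.

```python
# Sketch of the rank argument on an illustrative two-component graph.
import numpy as np

edges = [(0, 1), (1, 2), (0, 2), (3, 4)]  # triangle plus a disjoint edge
n, c0 = 5, 2                              # c0 = number of components

# Oriented incidence matrix of the example graph.
D = np.zeros((n, len(edges)))
for e, (i, j) in enumerate(edges):
    D[i, e], D[j, e] = 1.0, -1.0

L = D @ D.T
eigvals = np.linalg.eigvalsh(L)           # ascending: lambda_1 <= ... <= lambda_n

print(np.linalg.matrix_rank(D))           # 3 = n - c0
print(np.isclose(eigvals[1], 0.0))        # True: lambda_2 = 0, so G is disconnected
```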