1

I am reading a proof of the eigenvalue bound in communication complexity (in the Arora–Barak book on computational complexity). The proof uses the fact that for all unit vectors $x, y$, where $M$ is a symmetric real matrix, the following inequality holds: $ x^T M y \leq \lambda_{\max}(M) \big| x \cdot y \big|, $ where $x\cdot y$ is the dot product of the two vectors.

I don't see why this fact holds. The closest related result I could find concerns matrix norms, but matrix norms involve $x^T M x$, not $x^T M y$. Another problem is that the vectors the proof actually uses are binary vectors (characteristic vectors of certain sets), not unit vectors as the claim's hypothesis requires.

Edit: Every entry in $M$ is either $1$ or $-1$, so it cannot be diagonal (as in paul garrett's comment below).

The statement is used in the proof of Lemma 12.13 (Eigenvalue bound) in this chapter draft of the book.

  • 2
    Apropos of @paul's counterexample, here is another with all entries $1$ or $-1$. Take $M = \begin{bmatrix}1 & 1 \\ 1 & 1\end{bmatrix}$, $x = (1, 0)$, and $y = (0, 1)$. Then $x^T M y = 1$ but $x \cdot y = 0$, so the stated inequality cannot hold. – 2011-08-27
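The comment's counterexample is easy to check numerically; here is a small pure-Python sketch (the variable names are mine):

```python
# Counterexample from the comment: M = [[1, 1], [1, 1]], x = (1, 0), y = (0, 1).
M = [[1, 1], [1, 1]]
x = [1, 0]
y = [0, 1]

# x^T M y = sum over i, j of x_i * M[i][j] * y_j
xMy = sum(x[i] * M[i][j] * y[j] for i in range(2) for j in range(2))
# x . y = sum over i of x_i * y_i
dot = sum(x[i] * y[i] for i in range(2))

print(xMy)  # 1
print(dot)  # 0 -- so lambda_max * |x . y| = 0 < 1 = x^T M y
```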

2 Answers

2

Ok. I took a look at the book via the link provided by the OP. As I suspected, the inequality they are using is actually the familiar one
$$X^T M Y \le \lambda_{\max}\,|X|\cdot|Y|,$$
but they have bungled the presentation by adding an erroneous intermediate step that may easily confuse the reader.

The ingredients are the following. $M$ is, indeed, a symmetric real matrix. The symmetry is not listed in the statement of this lemma, but it holds for all the matrices $M(f)$ defined earlier. I can only assume that this is an omission, for otherwise it would be easy to find counterexamples. Anyway, $\lambda_{\max}$ is the magnitude of the largest eigenvalue (i.e. the largest absolute value of an eigenvalue).

Given this (assuming it matches all the uses of this result), the proof is easy. Because $M$ is real and symmetric, its eigenvalues are all real, and we can find an orthonormal set of eigenvectors $\{\vec{x}_1,\ldots,\vec{x}_n\}$ such that $M\vec{x}_i=\lambda_i\vec{x}_i$. We can write $Y$ in this eigenbasis as $Y=\sum_i b_i\vec{x}_i$, so $MY=\sum_i b_iM\vec{x}_i=\sum_i \lambda_ib_i\vec{x}_i$. Then by Cauchy–Schwarz we get
$$X^T M Y=\langle X, MY\rangle\le |X|\,\Big|\sum_i \lambda_ib_i\vec{x}_i\Big|= |X|\sqrt{\sum_i \lambda_i^2b_i^2}\le|X|\,\lambda_{\max}\sqrt{\sum_i b_i^2}=\lambda_{\max}|X||Y|,$$
as claimed.
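As a sanity check of the inequality $X^T M Y \le \lambda_{\max}|X||Y|$, here is a quick numerical sketch with NumPy (the random setup is my own, not from the book):

```python
import numpy as np

rng = np.random.default_rng(0)

# Check X^T M Y <= lambda_max * |X| * |Y| on random real symmetric matrices,
# where lambda_max is the largest eigenvalue in absolute value.
for _ in range(1000):
    n = int(rng.integers(2, 8))
    A = rng.standard_normal((n, n))
    M = A + A.T                                  # real symmetric matrix
    X = rng.standard_normal(n)
    Y = rng.standard_normal(n)
    lam_max = np.max(np.abs(np.linalg.eigvalsh(M)))
    lhs = X @ M @ Y
    rhs = lam_max * np.linalg.norm(X) * np.linalg.norm(Y)
    assert lhs <= rhs + 1e-9, (lhs, rhs)
print("inequality held in all 1000 trials")
```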

In that proof we have $X=1_A$, the characteristic vector of a set $A$, so $|X|=\sqrt{|A|}$ as the authors claim. Similarly $Y=1_B$, so $|Y|=\sqrt{|B|}$. Putting these pieces together we get
$$X^T M Y=1_A^T M 1_B\le \lambda_{\max}\sqrt{|A|\cdot|B|},$$
recovering the claim of the authors. Their intermediate step does, indeed, involve the inner product $1_A^T1_B$, but the value of that inner product is $|A\cap B|$, and we don't see that anywhere in the final claim, so this is IMVHO a booboo. This is a draft version after all.
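The characteristic-vector specialization can be checked the same way; a sketch with NumPy (random subsets and a random symmetric $0/1$ matrix, my own construction):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 10

# Random symmetric matrix with entries in {0, 1}, as for the book's M(f)
T = rng.integers(0, 2, size=(n, n))
M = np.triu(T) + np.triu(T, 1).T

# Characteristic vectors 1_A, 1_B of random subsets of {0, ..., n-1}
oneA = rng.integers(0, 2, size=n).astype(float)
oneB = rng.integers(0, 2, size=n).astype(float)

lam_max = np.max(np.abs(np.linalg.eigvalsh(M)))
sizeA, sizeB = oneA.sum(), oneB.sum()

# |1_A| = sqrt(|A|), and 1_A^T M 1_B <= lambda_max * sqrt(|A| * |B|)
assert np.isclose(np.linalg.norm(oneA), np.sqrt(sizeA))
assert oneA @ M @ oneB <= lam_max * np.sqrt(sizeA * sizeB) + 1e-9
print("bound verified")
```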

I obviously didn't read all of the chapter, just skimmed through it to see the definition of $M(f)$, a symmetric matrix with entries in $\{0,1\}$. If they apply the lemma to non-symmetric matrices as well, then we need to take a closer look.

=================================================

Edit: To make it clear, IMHO the authors made a mistake in claiming the sequence of inequalities

$$1_A^T M 1_B\le\lambda_{\max}|1_A^T1_B|\le\lambda_{\max}\sqrt{|A|\cdot|B|},$$

when it should have read

$$1_A^T M 1_B\le\lambda_{\max}|1_A|\cdot|1_B|=\lambda_{\max}\sqrt{|A|\cdot|B|}.$$

In the end it makes no difference, because their proof only needs the inequality

$$1_A^T M 1_B\le\lambda_{\max}\sqrt{|A|\cdot|B|}.$$

In other words, my guess is that the authors knew where they were heading, and accidentally inserted an incorrect intermediate step.

  • 0
    Thank you very much, it is very clear now :)! – 2011-08-28
0

If $\lambda_{\max}>0$, I have the following proof. In addition, as mentioned by thei, it does not matter whether $x,y$ are unit vectors.

Since $M$ is symmetric, it can be decomposed as $M=U^T\Lambda U$, with $U$ an orthogonal matrix and $\Lambda$ a diagonal matrix whose diagonal entries are the eigenvalues of $M$.

Writing $\bar{x}=Ux$ and $\bar{y}=Uy$, we have
$$x^TMy=x^TU^T\Lambda Uy=\bar{x}^T\Lambda \bar{y}=\sum_i \bar{x}_i\bar{y}_i \lambda_i\le\sum_i|\bar{x}_i\bar{y}_i \lambda_i|\le\lambda_{\max}\sum_i |\bar{x}_i\bar{y}_i| =\lambda_{\max}|\bar{x}\cdot \bar{y}|=\lambda_{\max}|x\cdot y|.$$

Correction: Thanks Ben and paul. $\sum_i|\bar{x}_i\bar{y}_i|\ne|\bar{x}\cdot\bar{y}|$ in general, so the second-to-last equality above is wrong.
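Indeed, by the triangle inequality the relation goes the other way, $|\bar{x}\cdot\bar{y}|\le\sum_i|\bar{x}_i\bar{y}_i|$, and it can be strict. A tiny pure-Python check (my own example):

```python
# Triangle inequality: |sum_i x_i y_i| <= sum_i |x_i y_i|, and it can be strict.
x = [1.0, -1.0]
y = [1.0, 1.0]

sum_abs = sum(abs(xi * yi) for xi, yi in zip(x, y))   # |1*1| + |(-1)*1| = 2
abs_dot = abs(sum(xi * yi for xi, yi in zip(x, y)))   # |1 - 1| = 0

print(sum_abs, abs_dot)  # 2.0 0.0 -- they differ
```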

  • 0
    Thanks Ben and paul! Sorry, there is a mistake... – 2011-08-27