I am looking for a proof using the min-max principle. Wikipedia seems to provide just that: http://en.wikipedia.org/wiki/Min-max_theorem#Cauchy_interlacing_theorem

But this part seems to be wrong:

This can be proven using the min-max principle. Let $\beta_i$ have corresponding eigenvector $b_i$ and $S_j$ be the $j$ dimensional subspace $S_j=\operatorname{span}\{b_1,\dots, b_j\}$, then $ \beta_j = \max_{x\in S_j,\|x\|=1}(Bx,x) =\max_{x\in S_j,\|x\|=1}(PAPx,x) =\max_{x\in S_j,\|x\|=1}(Ax,x)$

How is the shift from $PAPx$ to $Ax$ legal? $PAP$ is an $m\times m$ matrix while $A$ is an $n\times n$ matrix. $x$ can't fit both. Can anyone correct the proof?

  • That's right. I believe this is another mistake in this Wikipedia article. $B=PAP$ and $A$ are of different dimensions, so $P$ cannot be a square matrix. I think they're missing a transpose on one of the $P$'s. – 2012-01-01

1 Answer


The proof on Wikipedia indeed has several flaws. Let $V$ be the $n\times m$ matrix whose columns are the orthonormal eigenvectors $v_1, v_2, \dots, v_m$ of $A$ associated with $\alpha_1, \dots, \alpha_m$; then the projection can be written $P = VV^T$. Define $B = V^T A V$, which is $m\times m$. Then
$$\max_{\|x\|=1}(Bx,x) \;=\; \max_{\|x\|=1} x^T V^T A V x \;=\; \max_{\|x\|=1}\big(A(Vx), Vx\big).$$
Since the columns of $V$ are orthonormal, $\|Vx\| = \|x\| = 1$, so maximizing over unit vectors $x \in \mathbb{R}^m$ is the same as maximizing $(Ay,y)$ over unit vectors $y$ in the range of $V$. This is how I got to the end of the proof.
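As a sanity check, here is a minimal numerical sketch of the identity above (the matrix sizes, the random seed, and the choice of eigenvectors for $V$ are my own assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 6, 3

# Random symmetric A (n x n)
M = rng.standard_normal((n, n))
A = (M + M.T) / 2

# Columns of V = first m orthonormal eigenvectors of A (eigh returns them sorted)
eigvals, eigvecs = np.linalg.eigh(A)
V = eigvecs[:, :m]                     # n x m, orthonormal columns

# Compression of A to the m-dimensional subspace
B = V.T @ A @ V                        # m x m

# Unit vector x in R^m
x = rng.standard_normal(m)
x /= np.linalg.norm(x)

# (Bx, x) = (A(Vx), Vx), and Vx is again a unit vector
assert np.isclose(x @ B @ x, (V @ x) @ A @ (V @ x))
assert np.isclose(np.linalg.norm(V @ x), 1.0)

# With this choice of V, the eigenvalues of B are exactly the first m
# eigenvalues of A, consistent with Cauchy interlacing
assert np.allclose(np.linalg.eigh(B)[0], eigvals[:m])
```

Note that $x$ lives in $\mathbb{R}^m$ while $Vx$ lives in $\mathbb{R}^n$, which is exactly the dimension mismatch the question points out in the Wikipedia version.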