4

Let $A$ be a real symmetric matrix of order $n$ with $A > 0$, and let the eigenvalues of $A$ be $a_1 \ge a_2 \ge \cdots \ge a_n > 0$. Let $B$ be a real matrix of order $n\times m$ ($m\le n$) with $\operatorname{rank}B=m$ and singular values $b_1 \ge \cdots \ge b_m > 0$. If \[X = \begin{pmatrix} A & B \\ B^t & 0\end{pmatrix}\] then show that the spectrum of $X$ is a subset of $P \cup Q$, where $P =\left [ \frac 12\left(a_n-\sqrt{a_n^2+4b_1^2}\right) , \frac 12\left(a_1-\sqrt{a_1^2+4b_m^2}\right)\right]$ and $ Q = \left[ a_n , \frac 12 \left(a_1+\sqrt{a_1^2+4b_1^2}\right)\right].$
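
A quick numerical sanity check of the claimed inclusion on a random instance (a sketch in Python with numpy; the sizes, seed and tolerance below are arbitrary choices, not part of the problem):

```python
import numpy as np

# Sketch: check that spec(X) is contained in P union Q for one random instance.
rng = np.random.default_rng(0)
n, m = 6, 4

M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)                  # symmetric, A > 0
B = rng.standard_normal((n, m))              # rank m with probability 1

a = np.sort(np.linalg.eigvalsh(A))[::-1]     # a_1 >= ... >= a_n > 0
b = np.sort(np.linalg.svd(B, compute_uv=False))[::-1]  # b_1 >= ... >= b_m > 0
a1, an, b1, bm = a[0], a[-1], b[0], b[-1]

X = np.block([[A, B], [B.T, np.zeros((m, m))]])
lam = np.linalg.eigvalsh(X)

P = (0.5 * (an - np.sqrt(an**2 + 4 * b1**2)),
     0.5 * (a1 - np.sqrt(a1**2 + 4 * bm**2)))
Q = (an, 0.5 * (a1 + np.sqrt(a1**2 + 4 * b1**2)))

tol = 1e-10
in_P = (lam >= P[0] - tol) & (lam <= P[1] + tol)
in_Q = (lam >= Q[0] - tol) & (lam <= Q[1] + tol)
print("all eigenvalues lie in P or Q:", bool(np.all(in_P | in_Q)))
```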

Thanks for any help.

  • 0
    @Arnold: $X(a\ b)^T = t(a\ b)^T$. From this we get two equations, from which we get a quadratic in $t$, but from that I am unable to prove this result. 2012-11-07

2 Answers

1

Writing ${u\choose v}$ for an eigenvector of $X$ with eigenvalue $\lambda$, we find the equations $Au+Bv=\lambda u,~~~B^Tu=\lambda v.$

Case 1. If $\lambda=-\mu\le 0$, we may use the first equation to eliminate $u$ (note that $A+\mu I\ge A>0$ is invertible since $\mu\ge 0$): we get $v\ne 0$ (as otherwise $u=0$, a contradiction) and $B^T(A+\mu I)^{-1}Bv=\mu v$. Multiplication by $v^T$ from the left gives $(Bv)^T(A+\mu I)^{-1}Bv =\mu v^Tv.$ Now in the positive semidefinite ordering, $a_n I\le A\le a_1 I$, hence $(a_1+\mu)^{-1}I\le (A+\mu I)^{-1}\le (a_n+\mu)^{-1}I.$ Therefore $(a_1+\mu)^{-1}(Bv)^TBv \le \mu v^Tv \le (a_n+\mu)^{-1}(Bv)^TBv$. By properties of the Rayleigh quotient of $B^TB$, $(a_1+\mu)^{-1} b_m^2 \le \mu \le (a_n+\mu)^{-1}b_1^2.$ These two inequalities imply $2\mu\in \left[-a_1+\sqrt{a_1^2+4b_m^2},\,-a_n+\sqrt{a_n^2+4b_1^2}\right]$, hence $\lambda\in P$.
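
For completeness, the last quadratic step can be spelled out by completing the square (using $\mu\ge 0$): \begin{align} \mu(a_n+\mu)\le b_1^2 &\iff \left(\mu+\tfrac{a_n}{2}\right)^2\le \tfrac{a_n^2+4b_1^2}{4} \implies 2\mu\le -a_n+\sqrt{a_n^2+4b_1^2},\\ \mu(a_1+\mu)\ge b_m^2 &\iff \left(\mu+\tfrac{a_1}{2}\right)^2\ge \tfrac{a_1^2+4b_m^2}{4} \implies 2\mu\ge -a_1+\sqrt{a_1^2+4b_m^2}, \end{align} and negating, with $\lambda=-\mu$, gives $2\lambda\in\left[a_n-\sqrt{a_n^2+4b_1^2},\,a_1-\sqrt{a_1^2+4b_m^2}\right]$, i.e. $\lambda\in P$.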

Case 2. If $\lambda > 0$, we may use the second equation to eliminate $v$: we get $u\ne 0$ (as otherwise $v=0$, a contradiction) and $\lambda^2u-\lambda Au-BB^Tu=0$. Multiplication by $u^T$ from the left gives $\lambda^2u^Tu-\lambda u^TAu-u^TBB^Tu=0$, and by properties of the Rayleigh quotients of $A$ and $BB^T$, $\lambda^2-\lambda a_1-b_1^2\le 0\le \lambda^2-\lambda a_n-b^2,$ where $b=b_m$ if $m=n$ and $b=0$ if $m<n$. These two inequalities imply $\lambda\in Q':=\left[\frac 12\left(a_n+\sqrt{a_n^2+4b^2}\right),\ \frac 12\left(a_1+\sqrt{a_1^2+4b_1^2}\right)\right].$ Since $Q'\subseteq Q$, this proves the assertion, and slightly strengthens it in the case $m=n$.
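
In detail: since $\lambda>0$, the first inequality places $\lambda$ at or below the larger root of $t^2-a_1t-b_1^2$, and the second places it at or above the larger root of $t^2-a_nt-b^2$, so \[ \frac 12\left(a_n+\sqrt{a_n^2+4b^2}\right)\le\lambda\le\frac 12\left(a_1+\sqrt{a_1^2+4b_1^2}\right). \] The inclusion $Q'\subseteq Q$ holds because $\sqrt{a_n^2+4b^2}\ge a_n$, so the left endpoint of $Q'$ is at least $a_n$, while the right endpoints of $Q'$ and $Q$ coincide.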

  • 0
    @Timothy: Intuitively, draw a graph of the corresponding quadratic functions and you'll see where the inequalities hold. Formally, complete the squares, take square roots, and discard the signs not compatible with the case. 2012-11-09
3

This smells very fishy. I hope you are not asking others to do your homework.

Anyway, we will first quickly check that $X$ has $n$ positive eigenvalues and $m$ negative eigenvalues. In fact, we have the following congruence relation: $ \begin{pmatrix}I_n &0\\-B^TA^{-1} &I_m\end{pmatrix} \begin{pmatrix}A&B\\ B^T&0\end{pmatrix} \begin{pmatrix}I_n&-A^{-1}B\\0&I_m\end{pmatrix} =\begin{pmatrix}A&0\\0&-B^TA^{-1}B\end{pmatrix} = Y\ \textrm{ (say)}. $ So, by Sylvester's law of inertia, $X$ and $Y$ have the same numbers of positive, zero and negative eigenvalues. As $A$ is positive definite and $-B^TA^{-1}B$ is negative definite, we have finished this check.
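
Here is a small numerical illustration of the congruence and the resulting inertia count (a sketch in Python with numpy; the sizes and seed are arbitrary):

```python
import numpy as np

# Sketch: verify S^T X S = Y for S = [[I, -A^{-1}B], [0, I]] and count
# the signs of the eigenvalues of X on a random instance.
rng = np.random.default_rng(1)
n, m = 5, 3

M = rng.standard_normal((n, n))
A = M @ M.T + n * np.eye(n)              # symmetric, A > 0
B = rng.standard_normal((n, m))          # rank m with probability 1
Ainv = np.linalg.inv(A)

X = np.block([[A, B], [B.T, np.zeros((m, m))]])
Y = np.block([[A, np.zeros((n, m))],
              [np.zeros((m, n)), -B.T @ Ainv @ B]])
S = np.block([[np.eye(n), -Ainv @ B],
              [np.zeros((m, n)), np.eye(m)]])

print("congruence S^T X S = Y holds:", np.allclose(S.T @ X @ S, Y))

lam = np.linalg.eigvalsh(X)
print("positive eigenvalues:", int(np.sum(lam > 1e-10)), "- expected", n)
print("negative eigenvalues:", int(np.sum(lam < -1e-10)), "- expected", m)
```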

Now we will show that the negative spectrum of $X$ lies inside $P$ and the positive spectrum lies inside $Q$. Put another way, the endpoints of $P$ and $Q$ give lower and upper bounds for the negative and positive eigenvalues of $X$. Among these four endpoints, three can be obtained easily and we will tackle them first. The right endpoint of $P$ is a bit problematic and we will deal with it later.

Suppose $\lambda$ (necessarily nonzero) is an eigenvalue of $X$ and $(u^T,v^T)^T$ is a corresponding eigenvector. Then $ \begin{cases} Au + Bv = \lambda u,\\ B^Tu = \lambda v. \end{cases} $ Clearly $u\not=0$, or else the above equations would give $v=0$, contradicting that $(u^T, v^T)^T$ is an eigenvector. So, WLOG, we may assume that $\|u\|=1$. Substituting $v = \frac{1}{\lambda}B^Tu$ into the first equation and multiplying both sides by $\lambda u^T$ on the left, we get $\lambda^2 - \lambda u^TAu - u^TBB^Tu = 0$. Thus $ (\ast):\ \lambda = \frac12\left(u^TAu \pm \sqrt{(u^TAu)^2 + 4\|B^Tu\|^2}\right). $

So, for each positive eigenvalue, we have \begin{align} \lambda &= \frac12\left(u^TAu + \sqrt{(u^TAu)^2 + 4\|B^Tu\|^2}\right)\\ &\le \frac12\left(\max_{\|u\|=1}u^TAu + \sqrt{\max_{\|u\|=1}(u^TAu)^2 + 4\max_{\|u\|=1}\|B^Tu\|^2}\right)\\ &\le \frac12\left(a_1 + \sqrt{a_1^2 + 4b_1^2}\right). \end{align} Also, $ \lambda = \frac12\left(u^TAu + \sqrt{(u^TAu)^2 + 4\|B^Tu\|^2}\right) \ge u^TAu \ge \min_{\|u\|=1}u^TAu = a_n. $ This shows that the positive spectrum of $X$ lies inside $Q$.

Next, we deal with the left endpoint of $P$, that is, the lower end of the negative spectrum of $X$. For $a>0$ and $b\ge0$, define $f(a,b) = \frac12\left(a - \sqrt{a^2+4b^2}\right)$. Then each negative eigenvalue of $X$ is of the form $\lambda = f\left(u^TAu, \|B^Tu\|\right)$. Since $\frac{\partial f}{\partial a} = \frac12 - \frac {a}{2\sqrt{a^2+4b^2}} \ge 0$ and $\frac{\partial f}{\partial b} = \frac {-2b}{\sqrt{a^2+4b^2}} \le 0$, $ \lambda \ge \min_{\|u\|=1}f\left(u^TAu, \|B^Tu\|\right) \ge f\left(\min_{\|u\|=1}u^TAu,\ \max_{\|u\|=1}\|B^Tu\|\right) = f(a_n, b_1). $ This gives the left endpoint of $P$.
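
(A small check that the sign in $(\ast)$ matches the sign of $\lambda$: since $u^TAu>0$ and $\|B^Tu\|^2\ge 0$, the root with the minus sign is $\le 0$ while the root with the plus sign is $\ge u^TAu>0$; so positive eigenvalues take the $+$ sign and negative eigenvalues the $-$ sign in $(\ast)$.)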

It remains to obtain the right endpoint of $P$. This is not as easy to tackle. Unless $m=n$, if you estimate $\lambda\le\max_{\|u\|=1} f\left(u^TAu, \|B^Tu\|\right) \le f\left(\max_{\|u\|=1} u^TAu,\ \min_{\|u\|=1}\|B^Tu\|\right)$, you will get $\lambda\le f(a_1, 0)=0$, which is completely useless (negative numbers are always bounded above by $0$; note that $\min_{\|u\|=1}\|B^Tu\|=0$ when $m<n$, because $B^T$ then has a nontrivial kernel). In other words, when maximizing $(\ast)$ we cannot optimize $u^TAu$ and $\|B^Tu\|$ separately. This looks too difficult, so I will turn to another approach.

In advanced matrix theory, we have the Courant-Fischer-Weyl min-max principle, which says that if the eigenvalues of an order-$N$ Hermitian matrix $X$ are ordered in descending order, say $\lambda_1\ge\lambda_2\ge\ldots\ge\lambda_N$, then $ \lambda_k = \min_{\dim(S)=N-k+1}\max_{\begin{matrix}w\in S\\ \|w\|=1\end{matrix}} w^\ast Xw. $ Now, for every unit vector $w\in\mathbb{R}^{n+m}$, write $w^T = (\theta u^T,\sqrt{1-\theta^2}v^T)$, where $\theta\in[0,1]$ and $u,v$ are unit vectors in $\mathbb{R}^n$ and $\mathbb{R}^m$ respectively. Then $w^TXw = \theta^2 u^TAu + 2\theta\sqrt{1-\theta^2}u^TBv.$

Since the eigenvalues of $X$ are invariant under orthogonal transformations, WLOG we may assume that $B$ is already a (perhaps non-square) diagonal matrix with its singular values lying on the diagonal (replacing $B$ by $U^TBV$ from the SVD replaces $A$ by $U^TAU$, which has the same eigenvalues, and that is all we use below). Let $\{u_1,\ldots,u_n\}$ be the canonical basis of $\mathbb{R}^n$ and $\{v_1,\ldots, v_m\}$ be the canonical basis of $\mathbb{R}^m$. Then $Bv_i=b_iu_i$ and $B^Tu_i=b_iv_i$ for $i=1,2,\ldots, m$. Now fix $\theta\in[0,1]$ and consider the $m$-dimensional subspace $W_\theta\subset\mathbb{R}^{n+m}$ spanned by the orthonormal vectors $\{(\theta u_i^T,\ -\sqrt{1-\theta^2}\,v_i^T)^T: i=1,2,\ldots,m\}$. Every unit vector $w\in W_\theta$ has the form $w^T = (\theta u^T,\ -\sqrt{1-\theta^2}\,v^T)$ with $u=\sum_i c_iu_i$, $v=\sum_i c_iv_i$ and $\sum_i c_i^2=1$, so that $u^TBv=\sum_i c_i^2b_i\ge b_m$ and \begin{align} w^TXw &= \theta^2 u^TAu - 2\theta\sqrt{1-\theta^2}\,u^TBv\\ &\le \theta^2 u^TAu - 2\theta\sqrt{1-\theta^2}\,b_m\\ &\le \theta^2 a_1 - 2\theta\sqrt{1-\theta^2}\,b_m. \end{align} Applying the min-max principle with $N=n+m$ and $k=n+1$, we get, for every $\theta\in[0,1]$, $ \lambda_{n+1} = \min_{\dim(S)=m}\max_{\begin{matrix}w\in S\\ \|w\|=1\end{matrix}} w^T Xw \le \max_{\begin{matrix}w\in W_\theta\\ \|w\|=1\end{matrix}} w^T Xw \le \theta^2 a_1 - 2\theta\sqrt{1-\theta^2}\,b_m, $ and hence $ \lambda_{n+1} \le \min_{0\le\theta\le 1}\left\{\theta^2 a_1 - 2\theta\sqrt{1-\theta^2}\,b_m\right\}. $ The final step is to show that $ \min_{0\le\theta\le 1}\left\{\theta^2 a_1 - 2\theta\sqrt{1-\theta^2}\,b_m\right\} = \frac 12\left(a_1-\sqrt{a_1^2+4b_m^2}\right). $ This can be proved by the completing-square method and first-year calculus/geometry and I will leave it to you.
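
One way to carry out that final step is the substitution $\theta=\cos\phi$ with $\phi\in[0,\frac\pi2]$: then $\theta^2=\frac{1+\cos 2\phi}{2}$ and $2\theta\sqrt{1-\theta^2}=\sin 2\phi$, so \[ \theta^2 a_1 - 2\theta\sqrt{1-\theta^2}\,b_m = \frac{a_1}{2} + \frac{a_1}{2}\cos 2\phi - b_m\sin 2\phi = \frac{a_1}{2} + \frac{\sqrt{a_1^2+4b_m^2}}{2}\,\cos(2\phi+\delta), \] where $\cos\delta = \frac{a_1}{\sqrt{a_1^2+4b_m^2}}$, $\sin\delta = \frac{2b_m}{\sqrt{a_1^2+4b_m^2}}$ and $\delta\in[0,\frac\pi2]$. As $2\phi$ runs over $[0,\pi]$, the angle $2\phi+\delta$ passes through $\pi$, so the cosine attains $-1$ and the minimum is $\frac 12\left(a_1-\sqrt{a_1^2+4b_m^2}\right)$, as required.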

  • 0
    @Timothy: For any Hermitian matrix $A$, its minimum and maximum eigenvalues are given by $\min\{u^\ast Au: \|u\|=1\}$ and $\max\{u^\ast Au: \|u\|=1\}$. This is a general property that can be derived from the min-max principle. Alternatively, you can diagonalize $A$ (by a unitary transformation) to get a diagonal matrix $D$ whose main diagonal is $(a_1,a_2,\ldots,a_n)$. Then for any unit vector $y=(y_1,\ldots,y_n)$, we have $y^TDy=\sum_i a_iy_i^2$. Hence the maximum is $a_1$ and the minimum is $a_n$. 2012-11-07