
In my previous problem, I made a typo. Now I restate it as a new problem.

Let $\begin{bmatrix} A & B \\ B^* & C \end{bmatrix}$ be positive semidefinite, where $A,C$ are of size $n\times n$. Is it true that $$\sum_{i=1}^k\lambda_i\left(\begin{bmatrix} A & B \\ B^* & C \end{bmatrix}\right)\le \sum_{i=1}^k\bigl(\lambda_i(A)+\lambda_i(C)\bigr)$$ for $1\le k\le n$? Here $\lambda_i(\cdot)$ denotes the $i$th largest eigenvalue.
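For what it's worth, the claimed inequality can be sanity-checked numerically. Here is a small NumPy sketch (the random setup and variable names are mine, not part of the question): it builds a random positive semidefinite block matrix $M = X^*X$ and compares the partial eigenvalue sums for every $k$.

```python
import numpy as np

rng = np.random.default_rng(0)
n = 4

# Build a random 2n x 2n positive semidefinite matrix M = X* X,
# then read off its n x n diagonal blocks A and C.
X = rng.standard_normal((2 * n, 2 * n)) + 1j * rng.standard_normal((2 * n, 2 * n))
M = X.conj().T @ X
A, C = M[:n, :n], M[n:, n:]

# eigvalsh returns eigenvalues in ascending order; reverse to descending.
lam_M = np.linalg.eigvalsh(M)[::-1]
lam_A = np.linalg.eigvalsh(A)[::-1]
lam_C = np.linalg.eigvalsh(C)[::-1]

for k in range(1, n + 1):
    lhs = lam_M[:k].sum()
    rhs = lam_A[:k].sum() + lam_C[:k].sum()
    assert lhs <= rhs + 1e-10, (k, lhs, rhs)
print("inequality holds for all k on this sample")
```

Of course this only checks one random sample; it is not a proof.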

  • Yes, $A,C$ are Hermitian. Using the min-max theorem I can prove the case $k=1$, but no more... 2011-07-13

2 Answers


This question was answered by T. Ito at https://mathoverflow.net/questions/70689/ask-some-matrix-eigenvalue-inequalities


It holds for $n = 1$. Solving the characteristic quadratic (via trace and determinant) for the largest eigenvalue of the left-hand side matrix, the desired inequality reads $$\frac{a+c+\sqrt{(a-c)^2 + 4|b|^2}}{2} \leq a + c.$$ Since $a+c \geq 0$, squaring shows this is equivalent to $(a-c)^2 + 4|b|^2 \leq (a+c)^2$, i.e. $$|b|^2 \leq ac,$$ which holds because the determinant of a $2\times 2$ positive semidefinite matrix is nonnegative (Jacobi's criterion).
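The $n=1$ algebra above can be checked numerically as well; here is a short sketch (the sampling setup is mine): for a random $2\times 2$ PSD matrix $\begin{bmatrix} a & b \\ \bar b & c\end{bmatrix}$, both $|b|^2 \le ac$ and the resulting eigenvalue bound hold.

```python
import numpy as np

rng = np.random.default_rng(1)

# Sample a random 2x2 Hermitian PSD matrix [[a, b], [conj(b), c]] as Y* Y.
Y = rng.standard_normal((2, 2)) + 1j * rng.standard_normal((2, 2))
P = Y.conj().T @ Y
a, c = P[0, 0].real, P[1, 1].real
b = P[0, 1]

# Largest eigenvalue via the quadratic formula (trace and determinant).
lam_max = (a + c + np.sqrt((a - c) ** 2 + 4 * abs(b) ** 2)) / 2

# Nonnegative determinant gives |b|^2 <= ac, which is exactly
# the condition equivalent to lam_max <= a + c.
assert abs(b) ** 2 <= a * c + 1e-10
assert lam_max <= a + c + 1e-10
print("n = 1 bound verified on this sample")
```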