
I'm currently reading about the conjugate gradient method for solving linear systems of equations, and the following inequality, used to bound the multiplicative decrease of the error of the solution, is stated without proof. I tried to prove it with elementary methods for a bit; I'd be curious to see some elementary proofs (these must exist) and some intuition behind them.

Let $z_1,\ldots,z_n\geq 0$ (not all zero) and $\lambda_1,\ldots,\lambda_n> 0$. Let $\kappa = \frac{\max_i\lambda_i}{\min_i\lambda_i}$. Then the following is true: $$ 1 -\frac{\left(\sum_{j=1}^{n} z_j\lambda_j^2\right)^2}{\sum_{j=1}^{n} z_j\lambda_j^3 \sum_{j=1}^{n} z_j\lambda_j}\leq \left(\frac{\kappa-1}{\kappa+1}\right)^2$$
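For what it's worth, here is a quick numerical sanity check of the claimed bound (random nonnegative $z$ and positive $\lambda$; this is of course not a proof, and the function name is mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def check(n=10, trials=10_000):
    """Return the worst observed value of LHS - RHS over random instances.

    LHS = 1 - (sum z λ^2)^2 / (sum z λ^3 * sum z λ)
    RHS = ((κ-1)/(κ+1))^2,  κ = max λ / min λ
    The bound predicts this is always <= 0 (up to rounding).
    """
    worst = -np.inf
    for _ in range(trials):
        z = rng.random(n)               # z_i >= 0
        lam = rng.uniform(0.1, 10.0, n)  # λ_i > 0, so κ is finite
        kappa = lam.max() / lam.min()
        lhs = 1 - (z @ lam**2) ** 2 / ((z @ lam**3) * (z @ lam))
        rhs = ((kappa - 1) / (kappa + 1)) ** 2
        worst = max(worst, lhs - rhs)
    return worst

print(check())
```

Over many random draws the gap stays nonpositive, consistent with the inequality.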

  • It smells like the Kantorovich inequality (see, e.g., p. 143 [here](http://www-users.cs.umn.edu/~saad/IterMethBook_2ndEd.pdf) with $B=\mathrm{diag}(\lambda_i)$ and $x=B\sqrt{z}$). But are you sure that the square of the whole LHS term is correct? (2017-02-13)
  • Apologies, you're absolutely right: the square should be on the other side. And yes, it seems that Kantorovich's inequality is the way to go. Thanks! (2017-02-14)

1 Answer


To answer my own question: as Algebraic Pavel said in the comments, this is indeed an instance of the Kantorovich inequality. One can substitute $t_i = \sqrt{z_i}\,\lambda_i$, and then the desired inequality is equivalent to $$\frac{\|t\|_2^4}{\left\langle Dt,t\right\rangle\left\langle D^{-1}t,t\right\rangle}\geq\frac{4\kappa}{(1+\kappa)^2}$$

where $D$ is the diagonal matrix with the $\lambda_i$ on the diagonal, which is, I guess, a known form of Kantorovich's inequality. Besides the proof pointed to by Algebraic Pavel, there is also a cute probabilistic proof that can be found here. It goes through the Cauchy–Schwarz inequality for covariance and a standard bound on the variance of a bounded real random variable.
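In a bit more detail, here is how I understand that probabilistic argument (a sketch; I write $m = \min_i\lambda_i$ and $M = \max_i\lambda_i$, so $\kappa = M/m$). Let $X$ be a random variable with $\Pr[X = \lambda_i] = t_i^2/\|t\|_2^2$, so that $\mathbb{E}[X] = \left\langle Dt,t\right\rangle/\|t\|_2^2$ and $\mathbb{E}[1/X] = \left\langle D^{-1}t,t\right\rangle/\|t\|_2^2$. Since $X\cdot(1/X)=1$ identically, $\mathrm{Cov}(X,1/X) = 1 - \mathbb{E}[X]\,\mathbb{E}[1/X]$, and hence
$$\mathbb{E}[X]\,\mathbb{E}[1/X] = 1 - \mathrm{Cov}(X,1/X) \leq 1 + \sqrt{\mathrm{Var}(X)\,\mathrm{Var}(1/X)} \leq 1 + \frac{M-m}{2}\cdot\frac{\frac{1}{m}-\frac{1}{M}}{2} = \frac{(m+M)^2}{4mM},$$
using Cauchy–Schwarz for the covariance and the bound $\mathrm{Var}(Y)\leq\frac{(b-a)^2}{4}$ for any random variable $Y\in[a,b]$, applied to $X\in[m,M]$ and $1/X\in[1/M,1/m]$. Taking reciprocals gives
$$\frac{\|t\|_2^4}{\left\langle Dt,t\right\rangle\left\langle D^{-1}t,t\right\rangle} = \frac{1}{\mathbb{E}[X]\,\mathbb{E}[1/X]} \geq \frac{4mM}{(m+M)^2} = \frac{4\kappa}{(1+\kappa)^2}.$$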

If you have other cool proofs, they're welcome!