
I'm learning multivariate analysis. The Cauchy-Schwarz inequality plays an important role in several multivariate techniques.

  1. Cauchy-Schwarz Inequality: Let $b$ and $d$ be any two $p \times 1$ vectors. Then $$(b'd)^2\leq(b'b)(d'd)$$

  2. Extended Cauchy-Schwarz Inequality: Let $b$ and $d$ be any two $p \times 1$ vectors and let $B$ be a $p \times p$ positive definite matrix. Then $$(b'd)^2\leq(b'Bb)(d'B^{-1}d)$$

Neither inequality is that difficult to prove, and I'm NOT asking how to prove them.
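For anyone who wants to see the two inequalities in action, here is a minimal numerical sanity check (not a proof) using NumPy; the vectors and the positive definite matrix $B$ are generated randomly, purely for illustration:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5
b = rng.standard_normal(p)
d = rng.standard_normal(p)

# Build a random symmetric positive definite B as A A' + I.
A = rng.standard_normal((p, p))
B = A @ A.T + np.eye(p)

lhs = (b @ d) ** 2
print(lhs <= (b @ b) * (d @ d))                          # 1. Cauchy-Schwarz
print(lhs <= (b @ B @ b) * (d @ np.linalg.solve(B, d)))  # 2. extended version
```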

My question:

Notice that the right-hand side of the Cauchy-Schwarz inequality contains a hidden $p \times p$ identity matrix: it can be written as $(b'Ib)(d'Id)$. Why can we replace $I$ by an arbitrary positive definite matrix $B$ (with $B^{-1}$ appearing in the second factor) and have the inequality still hold? How can one understand this fact intuitively?

  • Is the LHS of 2. correct? Shouldn't there be a $B$ there also?
  • @Berci No. It's just 1. applied to $B^{1/2}b$ and $B^{-1/2}d$.
  • @Berci I copied 1 and 2 from pp. 78-79 of Applied Multivariate Statistical Analysis by Richard A. Johnson. I'm sure there is no typo.

1 Answer


I would say it's about the different scalar products one can impose on a given vector space.

If $B$ is a positive definite (symmetric) matrix, then $(u,v)\mapsto u'Bv$ defines a scalar product, and the Cauchy-Schwarz inequality holds for any scalar product.
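To spell this out (a standard argument, filling in the step hinted at in the comments): write $\langle u,v\rangle_B = u'Bv$ for the scalar product determined by $B$. Cauchy-Schwarz for this scalar product says $(u'Bv)^2 \leq (u'Bu)(v'Bv)$, and applying it to $u = b$ and $v = B^{-1}d$ gives

$$(b'd)^2 = \left(b'B(B^{-1}d)\right)^2 \leq (b'Bb)\left((B^{-1}d)'B(B^{-1}d)\right) = (b'Bb)(d'B^{-1}d),$$

using $B' = B$ (so that $B^{-1}$ is symmetric as well). So the extended inequality is just ordinary Cauchy-Schwarz, measured in the geometry that $B$ defines.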

  • In fact any positive definite $B$ gives a scalar product; the symmetric $B$ is just the one canonically associated to it (in a given basis).
  • Well, the mapping above is symmetric only if $B=B'$. But we can take $\displaystyle\frac{B+B'}{2}$ for any given positive definite $B$.
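For completeness, the symmetrization in the last comment costs nothing because a quadratic form only sees the symmetric part of $B$: since $u'Bu$ is a scalar, it equals its own transpose $u'B'u$, so

$$u'Bu = \tfrac{1}{2}\left(u'Bu + u'B'u\right) = u'\left(\frac{B+B'}{2}\right)u.$$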