I'm learning multivariate analysis, where the Cauchy-Schwarz inequality plays an important role in several techniques.
Cauchy-Schwarz Inequality: Let $b$ and $d$ be any two $p \times 1$ vectors. Then $(b'd)^2 \leq (b'b)(d'd)$.
Extended Cauchy-Schwarz Inequality: Let $b$ and $d$ be any two $p \times 1$ vectors, and let $B$ be a $p \times p$ positive definite matrix. Then $(b'd)^2 \leq (b'Bb)(d'B^{-1}d)$.
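As a quick numerical sanity check (not a proof), here is a minimal numpy sketch that tests both inequalities on random data; the dimension, the seed, and the construction of $B$ as $A'A + I$ are arbitrary choices of mine, used only to guarantee a positive definite matrix:

```python
import numpy as np

rng = np.random.default_rng(0)
p = 5

# Arbitrary test vectors and a positive definite matrix B = A'A + I
# (A'A is positive semidefinite; adding I makes B strictly positive definite).
b = rng.standard_normal(p)
d = rng.standard_normal(p)
A = rng.standard_normal((p, p))
B = A.T @ A + np.eye(p)

lhs = (b @ d) ** 2

# Ordinary Cauchy-Schwarz: (b'd)^2 <= (b'b)(d'd)
assert lhs <= (b @ b) * (d @ d)

# Extended Cauchy-Schwarz: (b'd)^2 <= (b'Bb)(d'B^{-1}d)
# np.linalg.solve(B, d) computes B^{-1}d without forming the inverse.
assert lhs <= (b @ B @ b) * (d @ np.linalg.solve(B, d))

print("both inequalities hold for this sample")
```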
It is not that difficult to prove. I'm NOT asking how to prove it.
My question:
Notice that the right-hand side of the ordinary Cauchy-Schwarz inequality can be written with a $p \times p$ identity matrix inserted, that is, $(b'Ib)(d'Id)$. Why can we replace $I$ by an arbitrary positive definite matrix $B$ (with $B^{-1}$ in the second factor) and still have the inequality hold? How can this fact be understood intuitively?
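To spell out the special case I mean: since $I^{-1} = I$,

$$(b'd)^2 \leq (b'Ib)(d'I^{-1}d) = (b'b)(d'd),$$

so the extended inequality with $B = I$ is exactly the ordinary one; my question is why it survives when $I$ is replaced by a general positive definite $B$.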