
If $X$ and $Y$ are random scalars, then Cauchy-Schwarz says that $| \mathrm{Cov}(X,Y) | \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2}.$ If $X$ and $Y$ are random vectors, is there a way to bound the covariance matrix $\mathrm{Cov}(X,Y)$ in terms of the matrices $\mathrm{Var}(X)$ and $\mathrm{Var}(Y)$?

In particular, is it true that $\mathrm{Cov}(X,Y) \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2},$ where the square roots are Cholesky decompositions, and the inequality is read as meaning that the right hand side minus the left hand side is positive semidefinite?
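For concreteness, here is a minimal sketch of what is being asked, using the sample covariance of one arbitrary joint distribution (the setup and names are mine). Since the difference need not be symmetric, positive semidefiniteness of $x^*Mx$ is checked via the symmetric part of $M$:

```python
# Sketch of the question: is Var(X)^{1/2} Var(Y)^{1/2} - Cov(X,Y)
# positive semidefinite, with Cholesky square roots?
import numpy as np

rng = np.random.default_rng(1)

p, n = 3, 10_000                            # dim of X and Y, sample size
A = rng.standard_normal((2 * p, 2 * p))
Z = rng.standard_normal((n, 2 * p)) @ A.T   # one arbitrary joint distribution

S = np.cov(Z, rowvar=False)                 # joint sample covariance of (X, Y)
Vx, Vy, Cxy = S[:p, :p], S[p:, p:], S[:p, p:]

Lx = np.linalg.cholesky(Vx)                 # Var(X)^{1/2} in the Cholesky sense
Ly = np.linalg.cholesky(Vy)

M = Lx @ Ly - Cxy                           # RHS minus LHS; not symmetric in general
print(np.linalg.eigvalsh((M + M.T) / 2))    # PSD iff all eigenvalues >= 0
```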


2 Answers


There is a generalization of the Cauchy-Schwarz inequality due to Tripathi (http://web2.uconn.edu/tripathi/published-papers/cs.pdf), which says that \begin{equation} \mathrm{Var}(Y) \ge \mathrm{Cov}(Y,X)\mathrm{Var}(X)^{-1}\mathrm{Cov}(X,Y) \end{equation} in the sense that the difference is positive semidefinite. Tripathi mentions that a student asked him about this inequality and he could not find any other reference for it (as of 1998!).
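One way to see why this holds: the difference is exactly the Schur complement of $\mathrm{Var}(X)$ in the (positive semidefinite) joint covariance matrix of $(X,Y)$. A minimal numerical sanity check (the setup and names are my own, not from the paper):

```python
# Sanity check of Tripathi's inequality: the difference
# Var(Y) - Cov(Y,X) Var(X)^{-1} Cov(X,Y) is a Schur complement
# of the PSD joint covariance, so its eigenvalues are >= 0.
import numpy as np

rng = np.random.default_rng(0)

p, q, n = 3, 4, 10_000                      # dim(X), dim(Y), sample size
A = rng.standard_normal((p + q, p + q))
Z = rng.standard_normal((n, p + q)) @ A.T   # correlated joint samples

S = np.cov(Z, rowvar=False)                 # joint sample covariance
Sxx, Sxy = S[:p, :p], S[:p, p:]
Syx, Syy = S[p:, :p], S[p:, p:]

D = Syy - Syx @ np.linalg.solve(Sxx, Sxy)   # the difference above
print(np.linalg.eigvalsh(D))                # all nonnegative (up to rounding)
```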

  • When does equality hold? (2016-10-27)

Assume that $X=Y$ has variance matrix $LL^T$, where $L$ is the lower triangular Cholesky factor, so that $\mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2}=L^2$ and $\mathrm{Cov}(X,Y)=C$ with $C=LL^T$. One then asks whether $|x^*Cx|\leqslant x^*L^2x$ for every deterministic vector $x$. This cannot hold in general. A specific counterexample, valid in every dimension $n\geqslant2$, is $L_{ij}=1$ if $j=1$ and $L_{ij}=0$ otherwise. Equivalently, $C_{ij}=1$ for every $(i,j)$, that is, $X_i=X_0$ for every $i$, for some standard normal $X_0$. Then $L^2=L$, and taking $x$ to be the all-ones vector gives $x^*Cx=n^2$ while $x^*L^2x=n$.
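A quick numerical check of this counterexample (a sketch; the dimension $n=3$ is arbitrary):

```python
# The counterexample, checked numerically (n = 3 chosen arbitrarily).
import numpy as np

n = 3
L = np.zeros((n, n))
L[:, 0] = 1.0                 # L_{ij} = 1 if j = 1, else 0 (lower triangular)
C = L @ L.T                   # all-ones matrix: X_i = X_0 for every i

x = np.ones(n)                # the all-ones vector breaks the inequality
lhs = abs(x @ C @ x)          # |x^* C x| = n^2 = 9
rhs = x @ L @ L @ x           # x^* L^2 x = n = 3
print(lhs, rhs, lhs <= rhs)   # 9.0 3.0 False
```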