
If $X$ and $Y$ are random scalars, then Cauchy-Schwarz says that $$| \mathrm{Cov}(X,Y) | \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2}.$$ If $X$ and $Y$ are random vectors, is there a way to bound the covariance matrix $\mathrm{Cov}(X,Y)$ in terms of the matrices $\mathrm{Var}(X)$ and $\mathrm{Var}(Y)$?

In particular, is it true that $$\mathrm{Cov}(X,Y) \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2},$$ where the square roots are Cholesky decompositions, and the inequality is read as meaning that the right hand side minus the left hand side is positive semidefinite?
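For concreteness, here is a minimal numerical sketch of the conjecture (the sampling setup and variable names are illustrative assumptions, not part of the question): estimate the three matrices from samples, form the Cholesky factors, and inspect the eigenvalues of the symmetric part of the difference, since $x^T D x \ge 0$ for all $x$ exactly when the symmetric part of $D$ is positive semidefinite.

```python
import numpy as np

rng = np.random.default_rng(0)
n, m = 3, 5000
X = rng.standard_normal((m, n)) @ rng.standard_normal((n, n))  # correlated X
Y = X + 0.5 * rng.standard_normal((m, n))                      # Y correlated with X

full = np.cov(np.hstack([X, Y]).T)         # joint (2n x 2n) sample covariance
VarX, VarY = full[:n, :n], full[n:, n:]
CovXY = full[:n, n:]

LX = np.linalg.cholesky(VarX)              # lower-triangular Cholesky factors
LY = np.linalg.cholesky(VarY)
D = LX @ LY - CovXY                        # RHS minus LHS of the conjecture
print(np.linalg.eigvalsh((D + D.T) / 2))   # a negative eigenvalue refutes PSD
```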

  • The variance "matrix" is actually a vector (or a matrix consisting of a single row or a single column, if you prefer). So you probably want to state the product on the right in a slightly different fashion, using transposes etc. (2012-07-27)
  • Not sure what you mean: $\mathrm{Var}(X)$ is a square $n \times n$ matrix, where $n$ is the dimension of $X$. (2012-07-27)
  • There are $n$ random variables and so $n$ variances. Could you specify what the $n^2$ elements of $\mathrm{Var}(X)$ are? Maybe you meant the covariance matrix of the $n$ random variables? (2012-07-27)
  • Sorry if this notation was unclear; I did indeed mean the $n \times n$ matrix whose $(i,j)$-th element is $\mathrm{Cov}(X_i, X_j)$. (2012-07-27)
  • Got something from my answer below? (2012-08-20)

2 Answers


There is a generalization of the Cauchy–Schwarz inequality due to Tripathi (http://web2.uconn.edu/tripathi/published-papers/cs.pdf) which says that \begin{equation} \mathrm{Var}(Y) \ge \mathrm{Cov}(Y,X)\,\mathrm{Var}(X)^{-1}\,\mathrm{Cov}(X,Y), \end{equation} in the sense that the difference is positive semidefinite. He actually mentions that a student asked about it and that he could not find any other reference (1998!).
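A quick sanity check on simulated data (the simulation setup below is my own illustration, not from the paper): the difference is exactly the Schur complement of $\mathrm{Var}(X)$ in the joint covariance matrix of $(X,Y)$, so its eigenvalues should be nonnegative up to roundoff.

```python
import numpy as np

rng = np.random.default_rng(1)
n, m = 4, 10000
Z = rng.standard_normal((m, 2 * n)) @ rng.standard_normal((2 * n, 2 * n))
X, Y = Z[:, :n], Z[:, n:]

full = np.cov(np.hstack([X, Y]).T)   # joint (2n x 2n) sample covariance
VarX, VarY = full[:n, :n], full[n:, n:]
CovXY = full[:n, n:]                 # Cov(X, Y); Cov(Y, X) is its transpose

D = VarY - CovXY.T @ np.linalg.inv(VarX) @ CovXY
print(np.linalg.eigvalsh(D))         # all eigenvalues >= 0 (up to roundoff)
```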

  • @Did Would this inequality hold if you had $\mathrm{Cov}(X,Y) \le \mathrm{Var}(X)^{1/2} (\mathrm{Var}(Y)^{1/2})^t$? Please note the transpose after $\mathrm{Var}(Y)^{1/2}$. (2014-01-08)
  • http://math.stackexchange.com/questions/631994/cauchy-schwarz-inequality-for-random-vectors (2014-01-09)
  • When does equality hold? (2016-10-27)

Assume that $X=Y$ has variance matrix $LL^T$, with $L$ lower triangular. Since then $\mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2} = L\cdot L = L^2$, one asks whether $|x^*Cx|\leqslant x^*L^2x$ for every deterministic vector $x$, with $C=LL^T$. This cannot hold in general. A specific counterexample, valid in every dimension at least $2$, is $L_{ij}=1$ if $j=1$ and $L_{ij}=0$ otherwise. Equivalently, $C_{ij}=1$ for every $(i,j)$, that is, $X_i=X_0$ for every $i$, for some standard normal $X_0$. Here $L^2=L$, so the vector $x=e_2$ yields $x^*Cx=1>0=x^*L^2x$.
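A few lines make the counterexample concrete (the dimension and test vector are my own illustrative choices; any $n \ge 2$ works):

```python
import numpy as np

n = 3
L = np.zeros((n, n))
L[:, 0] = 1.0                  # L_{ij} = 1 if j = 1, else 0 (lower triangular)
C = L @ L.T                    # all-ones matrix: every X_i equals X_0
L2 = L @ L                     # equals L, since only the first column survives

x = np.array([0.0, 1.0, 0.0])  # x = e_2
print(x @ C @ x, x @ L2 @ x)   # 1.0 vs 0.0, so |x^T C x| > x^T L^2 x
```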