
If $X$ and $Y$ are random scalars, then Cauchy-Schwarz says that $$| \mathrm{Cov}(X,Y) | \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2}.$$ If $X$ and $Y$ are random vectors, is there a way to bound the covariance matrix $\mathrm{Cov}(X,Y)$ in terms of the matrices $\mathrm{Var}(X)$ and $\mathrm{Var}(Y)$?

In particular, is it true that $$\mathrm{Cov}(X,Y) \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2},$$ where the square roots are Cholesky decompositions, and the inequality is read as meaning that the right hand side minus the left hand side is positive semidefinite?
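For concreteness, here is a minimal numerical sketch of the conjecture (not part of the original question): it draws a correlated Gaussian sample, forms the empirical blocks $\mathrm{Var}(X)$, $\mathrm{Var}(Y)$, $\mathrm{Cov}(X,Y)$, takes Cholesky factors as one possible reading of the square roots, and checks whether the difference is positive semidefinite. It assumes NumPy; all variable names are mine.

```python
import numpy as np

rng = np.random.default_rng(0)

# Sample jointly Gaussian random vectors X and Y of equal dimension p,
# so that Cov(X, Y) is a square matrix comparable with the right-hand side.
p, n = 3, 100_000
A = rng.standard_normal((2 * p, 2 * p))
Z = rng.standard_normal((n, 2 * p)) @ A.T   # rows are draws of (X, Y)
X, Y = Z[:, :p], Z[:, p:]

# Empirical covariance blocks.
C = np.cov(Z, rowvar=False)
var_X, var_Y, cov_XY = C[:p, :p], C[p:, p:], C[:p, p:]

# One reading of Var(X)^{1/2} Var(Y)^{1/2}: the product of Cholesky factors
# L_X L_Y^T, where L L^T equals the corresponding variance matrix.
L_X = np.linalg.cholesky(var_X)
L_Y = np.linalg.cholesky(var_Y)
rhs = L_X @ L_Y.T

# Test whether rhs - Cov(X, Y) is positive semidefinite. The difference need
# not be symmetric, so we inspect the eigenvalues of its symmetric part.
D = rhs - cov_XY
eigvals = np.linalg.eigvalsh((D + D.T) / 2)
print("smallest eigenvalue of symmetric part:", eigvals.min())
```

Running such a check on a few randomly generated covariance structures is a quick way to probe the inequality (and hunt for counterexamples) before attempting a proof.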

  • The variance "matrix" is actually a vector (or a matrix consisting of a single row or a single column, if you prefer). So you probably want to state the product on the right in a slightly different fashion, using transposes etc. (2012-07-27)
  • Not sure what you mean: $\mathrm{Var}(X)$ is a square $n \times n$ matrix, where $n$ is the dimension of $X$. (2012-07-27)
  • There are $n$ random variables and so $n$ variances. Could you specify what the $n^2$ elements of $\mathrm{Var}(X)$ are? Maybe you meant the covariance matrix of the $n$ random variables? (2012-07-27)
  • Sorry if this notation was unclear; I did indeed mean the $n \times n$ matrix whose $(i,j)$-th element is $\mathrm{Cov}(X_i, X_j)$. (2012-07-27)
  • Got something from my answer below? (2012-08-20)

2 Answers