If $X$ and $Y$ are random scalars, then the Cauchy–Schwarz inequality says that $| \mathrm{Cov}(X,Y) | \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2}.$ If $X$ and $Y$ are random vectors, is there a way to bound the cross-covariance matrix $\mathrm{Cov}(X,Y)$ in terms of the matrices $\mathrm{Var}(X)$ and $\mathrm{Var}(Y)$?
In particular, assuming $X$ and $Y$ have the same dimension (so that the product is defined), is it true that $\mathrm{Cov}(X,Y) \le \mathrm{Var}(X)^{1/2}\mathrm{Var}(Y)^{1/2},$ where the square roots are Cholesky factors, and the inequality is read as meaning that the right-hand side minus the left-hand side is positive semidefinite?
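For what it's worth, the conjecture can be probed numerically. The sketch below (my own setup, not from any source: dimension $d=3$, $Y$ a random linear transform of $X$ plus noise) estimates the joint covariance from samples, takes lower-triangular Cholesky factors $L_X, L_Y$ as the square roots, and checks whether $L_X L_Y^\top - \mathrm{Cov}(X,Y)$ has a positive semidefinite symmetric part. Note that one ambiguity already surfaces here: $L_X L_Y^\top$ is only one possible reading of the product of the two square roots, and the difference need not be symmetric, so "positive semidefinite" has to be interpreted via the symmetric part.

```python
import numpy as np

rng = np.random.default_rng(0)
d, n = 3, 100_000  # dimension and sample size (arbitrary choices)

# Correlated random vectors: Y = A X + noise (illustrative construction)
X = rng.standard_normal((n, d))
A = rng.standard_normal((d, d))
Y = X @ A.T + 0.5 * rng.standard_normal((n, d))

# Joint (2d x 2d) sample covariance; extract the blocks
C = np.cov(np.hstack([X, Y]).T)
VX, VY, CXY = C[:d, :d], C[d:, d:], C[:d, d:]

# Lower-triangular Cholesky factors as the "square roots"
LX = np.linalg.cholesky(VX)
LY = np.linalg.cholesky(VY)

# Candidate RHS minus LHS; it need not be symmetric,
# so test the eigenvalues of its symmetric part
D = LX @ LY.T - CXY
eigs = np.linalg.eigvalsh((D + D.T) / 2)
print("smallest eigenvalue of symmetric part:", eigs.min())
```

If the smallest eigenvalue comes out negative for some draw, that particular reading of the inequality fails; a counterexample found this way would of course need to be verified against exact population covariances rather than sample estimates.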