
I know quite a few identities about quadratic forms of random vectors, but I'm having difficulty coaxing something out of this quadratic form of random matrices. Suppose I know $\mathbb{E}[W W^{T}]$ and $\mathbb{E}[W] = 0$. Can I then deduce a closed form for $\mathbb{E}[W^{T} P W]$, where $P$ is a non-random matrix?
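
To be concrete, the sort of vector identity I have in mind is the standard one: for a random vector $x$ with $\mathbb{E}[x] = 0$, $\mathbb{E}[x x^{T}] = \Sigma$, and a non-random matrix $P$,

$ \mathbb{E}[x^{T} P x] = \operatorname{tr}(P \Sigma) $

and I was hoping for an analogue in the matrix case.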

EDIT: Changed the given from $\mathbb{E}[W^{T} W]$ to $\mathbb{E}[W W^{T}]$, and added $\mathbb{E}[W] = 0$.


1 Answer


Quite simply, you can't say much -- at least not in the sense of expressing it in terms of $\mathbb{E}[W W^{T}]$. Consider the $(i,j)$ entry of $\mathbb{E}[W^{T} P W]$:

$ \mathbb{E}[W^{T} P W](i,j) = \sum_{k_1, k_2} P(k_1, k_2)\, \mathbb{E}[W(k_1, i)\, W(k_2, j)] $

While on the other hand,

$ \mathbb{E}[W W^{T}](i,j) = \sum_{k} \mathbb{E}[W(i, k)\, W(j, k)] $

To evaluate the former, what you really need are the individual second moments $\mathbb{E}[W(k_1, i)\, W(k_2, j)]$, i.e. the full covariance structure of the entries of $W$. Were $W$ a vector, the covariance matrix $\mathbb{E}[W W^{T}]$ would contain exactly those moments; but as it stands, $\mathbb{E}[W W^{T}]$ only gives you particular linear combinations of them (sums over the column index $k$, as above), with no way to decompose those combinations back into the individual terms.
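
To make this concrete, here is a small illustration (choosing, purely for the sake of example, $W$ to be $1 \times 2$ and $P = 1$). Write $W = (X, Y)$ with $\mathbb{E}[X] = \mathbb{E}[Y] = 0$. Then $\mathbb{E}[W W^{T}] = \mathbb{E}[X^2] + \mathbb{E}[Y^2]$ is a scalar, while

$ \mathbb{E}[W^{T} P W] = \begin{pmatrix} \mathbb{E}[X^2] & \mathbb{E}[X Y] \\ \mathbb{E}[X Y] & \mathbb{E}[Y^2] \end{pmatrix}. $

If $X$ and $Y$ are independent standard normals, then $\mathbb{E}[W W^{T}] = 2$ and $\mathbb{E}[W^{T} P W] = I$; if instead $Y = X$ with $X$ standard normal, then again $\mathbb{E}[W W^{T}] = 2$, but $\mathbb{E}[W^{T} P W] = \begin{pmatrix} 1 & 1 \\ 1 & 1 \end{pmatrix}$. Two distributions with the same $\mathbb{E}[W W^{T}]$ (and $\mathbb{E}[W] = 0$) give different values of $\mathbb{E}[W^{T} P W]$, so no formula in terms of $\mathbb{E}[W W^{T}]$ alone can exist.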