
I know quite a few identities about quadratic forms of random vectors, but I'm having difficulty coaxing something out of this quadratic form of random matrices. Suppose I know $\mathbb{E}[W W^{T}]$ and $\mathbb{E}[W] = 0$. Can I then deduce a closed form for $\mathbb{E}[W^{T} P W]$, where $P$ is a non-random matrix?

EDIT: Changed the given quantity from $\mathbb{E}[W^{T} W]$ to $\mathbb{E}[W W^{T}]$ and added $\mathbb{E}[W] = 0$.

  • Do you know the distribution of $W$ as well, or just the variance $\mathbf{E}[W^TW]$? (2011-11-24)
  • All I know is that $\mathbb{E}[W] = 0$ and $\mathbb{E}[W W^{T}] = \Sigma_w$, not the distribution itself. (2011-11-24)
  • If you're going to fundamentally change the question by some minute changes that are easily overlooked, it would make sense to point out the change. (2011-11-24)
  • @Sasha: Why did you delete your answer? It seemed to solve the problem. (2011-11-24)
  • I entered an edit comment stating my changes, but apparently they weren't picked up. (2011-11-24)
  • @Sasha Let me repeat joriki's comment above. (2012-09-25)

1 Answer


Quite simply, you can't say much -- at least not in the sense of expressing it in terms of $\mathbb{E}[W W^{T}]$ alone. Consider the $(i,j)$ entry of $\mathbb{E}[W^{T} P W]$:

$$ \mathbb{E}[W^{T} P W](i,j) = \sum_{k_1,k_2} P(k_1,k_2)\, \mathbb{E}[W(k_1, i)\, W(k_2, j)], $$

while, on the other hand,

$$ \mathbb{E}[W W^{T}](i,j) = \sum_{k} \mathbb{E}[W(i,k)\, W(j,k)]. $$

To say anything about the former, what you really need are the individual second moments $\mathbb{E}[W(k_1, i)\, W(k_2, j)]$. Were $W$ a vector, the covariance matrix $\mathbb{E}[W W^{T}]$ would contain exactly those moments; but for a matrix $W$, each entry of $\mathbb{E}[W W^{T}]$ is only a particular linear combination of them (a sum over the second index), with no way to decompose those sums back into their individual terms.
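For concreteness, here is a minimal numpy sketch (my own illustration, not part of the original answer) of two mean-zero $2 \times 2$ matrix distributions with the same $\mathbb{E}[W W^{T}]$ but different $\mathbb{E}[W^{T} P W]$; the test matrix `P` and both distributions are arbitrary choices made for this example.

```python
import numpy as np

# An arbitrary non-random test matrix P.
P = np.array([[1.0, 2.0],
              [3.0, 4.0]])

# Distribution A: W = diag(s1, s2) with independent random signs s1, s2.
# Its four equally likely outcomes:
dist_A = [np.diag([s1, s2]) for s1 in (-1.0, 1.0) for s2 in (-1.0, 1.0)]

# Distribution B: W = s * (swap matrix) with a single random sign s.
swap = np.array([[0.0, 1.0],
                 [1.0, 0.0]])
dist_B = [s * swap for s in (-1.0, 1.0)]

def expect(outcomes):
    """Exact expectation over a finite set of equally likely outcomes."""
    return sum(outcomes) / len(outcomes)

for name, dist in [("A", dist_A), ("B", dist_B)]:
    print(f"Distribution {name}:")
    print("  E[W]       =\n", expect(dist))                        # zero for both
    print("  E[W W^T]   =\n", expect([W @ W.T for W in dist]))     # identity for both
    print("  E[W^T P W] =\n", expect([W.T @ P @ W for W in dist])) # differs!
```

Distribution A yields $\mathbb{E}[W^{T} P W] = \operatorname{diag}(P(1,1), P(2,2))$, while distribution B yields $P$ with its rows and columns swapped, even though both share $\mathbb{E}[W] = 0$ and $\mathbb{E}[W W^{T}] = I$. One scalar does survive, though: $\mathbb{E}[\operatorname{tr}(W^{T} P W)] = \operatorname{tr}(P\, \mathbb{E}[W W^{T}])$, since $\operatorname{tr}(W^{T} P W) = \operatorname{tr}(P W W^{T})$, and indeed both distributions above give trace $\operatorname{tr}(P) = 5$.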