
I read in this answer that:

If the covariance matrix is $\Sigma$, the covariance after projecting onto $u$ is $u^T \Sigma u$.

I fail to see this. How do I get the covariance of a set of points after projecting them along the direction $u$, as a function of $u$ and $\Sigma$?

  • More generally, if $X\in\mathbb{R}^n$ and $Y\in\mathbb{R}^m$ are random vectors with $\operatorname{cov}(X,Y)=\Sigma\in\mathbb{R}^{n\times m}$, and $A\in\mathbb{R}^{k\times n}$ and $B\in\mathbb{R}^{\ell\times m}$ are constant (i.e. non-random) matrices, then $\operatorname{cov}(AX,BY)=A\Sigma B^T\in\mathbb{R}^{k\times \ell}$. More tersely, $\operatorname{cov}(AX,BY)=A(\operatorname{cov}(X,Y))B^T$. – 2012-07-23
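The identity $\operatorname{cov}(AX,BY)=A\Sigma B^T$ from the comment above can be checked numerically. The following NumPy sketch uses arbitrary illustrative dimensions and matrices; the equality is exact (it is an algebraic identity of the sample cross-covariance, not just an asymptotic one):

```python
import numpy as np

rng = np.random.default_rng(1)

# Joint samples of X in R^3 and Y in R^2, correlated through shared noise.
n = 100_000
X = rng.standard_normal((n, 3))
Y = X[:, :2] + 0.5 * rng.standard_normal((n, 2))

# Sample cross-covariance Sigma = cov(X, Y), an (n x m) = (3 x 2) matrix.
Xc = X - X.mean(axis=0)
Yc = Y - Y.mean(axis=0)
Sigma = Xc.T @ Yc / (n - 1)

A = rng.standard_normal((4, 3))  # constant (k x n) matrix
B = rng.standard_normal((5, 2))  # constant (l x m) matrix, so that BY is defined

# cov(AX, BY) computed directly from the transformed samples ...
lhs = (Xc @ A.T).T @ (Yc @ B.T) / (n - 1)
# ... coincides with A Sigma B^T, sample by sample.
rhs = A @ Sigma @ B.T

print(np.allclose(lhs, rhs))  # True
```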

1 Answer


The covariance matrix for a vector quantity $x$ is $\langle xx^\top\rangle-\langle x\rangle\langle x^\top\rangle$. The covariance for the projection $u^\top x$ is

$$\langle u^\top xx^\top u\rangle-\langle u^\top x\rangle\langle x^\top u\rangle=u^\top\langle xx^\top\rangle u-u^\top\langle x\rangle\langle x^\top\rangle u=u^\top\left(\langle xx^\top\rangle-\langle x\rangle\langle x^\top\rangle\right)u\;.$$

The point is basically that you can pull $u$ out of all the expectation values because it's a constant.
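To see this in action, here is a quick numerical sketch (the point cloud and the direction $u$ are arbitrary choices): the sample variance of the scalar projections $u^\top x$ equals $u^\top \Sigma u$ computed from the sample covariance matrix, exactly, since both are the same algebraic expression in the centered data.

```python
import numpy as np

rng = np.random.default_rng(0)

# 2-D points with some spread and correlation (illustrative values).
n = 100_000
x = rng.standard_normal((n, 2)) @ np.array([[2.0, 0.5], [0.0, 1.0]]) + [1.0, -2.0]

Sigma = np.cov(x, rowvar=False)  # sample covariance matrix of the points
u = np.array([0.6, 0.8])         # a unit direction

proj = x @ u                     # scalar projection u^T x of every point
var_proj = proj.var(ddof=1)      # variance of the projected points
quad = u @ Sigma @ u             # u^T Sigma u

print(var_proj, quad)            # the two numbers agree
```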

  • By $\langle \cdot \rangle$ do you mean expectation? I.e., when you say $\langle xx^\top\rangle$, do you mean $E\left[xx^\top\right]$? – 2012-07-23
  • @roseck: Yes. – 2012-07-23
  • @user815423426: The $<u,v>$ denotes the [inner product](http://en.wikipedia.org/wiki/Dot_product). In this case it is a normal dot-product between vectors. – 2013-05-08
  • @user2155919: No. a) The notation uses angled brackets $\langle\cdot\rangle$ (which you can produce using `\langle` and `\rangle`, respectively), not less/greater symbols $\lt\cdot\gt$. b) You rightly placed a comma between $u$ and $v$ in the notation for the inner product; note that there are no commas in my post. c) I had already replied to the OP's question that their interpretation of the angled brackets as denoting expectation was correct. The dot products are not explicitly reflected in the notation and arise through the matrix multiplication implied by juxtaposition. – 2013-05-08
  • @joriki Isn't $u^Tx$ a scalar? Shouldn't the projection of a vector $x$ onto a unit vector $u$ be $(u^Tx)u$? – 2016-01-06
  • @Shobhit: I'd inferred from the OP's formulation "the covariance after projecting in $u$ is $u^T \Sigma u$" that the term "projection" as used in the question refers to the scalar length of what you're referring to as the "projection" (since otherwise it wouldn't have a scalar covariance). As far as I'm aware, both of these uses of the term "projection" are in common use. – 2016-01-07
  • @joriki Makes sense. However, allow me to be nit-picky by pointing out that your answer seems to use the formula for the covariance of a vector $x$, but $u^\top x$ is a scalar. You should rather use $\operatorname{Var}(s) = \langle s^2 \rangle - \langle s \rangle\langle s \rangle$ for a scalar $s$, which again reduces to $\langle u^\top xx^\top u\rangle-\langle u^\top x\rangle\langle x^\top u\rangle$ for $u^\top x$, because $(u^\top x)^2=(u^\top x)(x^\top u)$. – 2016-01-07