
I'm trying to read about Principal Component Analysis and I'm stuck on the identity $Var[\alpha_{1}^{'}X]=\alpha_{1}^{'}\Sigma\alpha_{1}$, where $\Sigma$ is the variance-covariance matrix of $X$, $X=(X_{1},\cdots,X_{p})$, the $\{X_{i}\}$ are random variables, $\alpha_{1}=(a_{1},\cdots,a_{p})$ is a vector of coefficients, and $\alpha_{1}^{'}$ is its transpose. Expanding the LHS of the identity gives $a_{1}^2Var(X_{1})+\cdots+a_{p}^2Var(X_{p})$, which is not equal to the RHS in general. The identity seems to be satisfied only when the $\{X_{i}\}$ are all independent, so I'm confused, because everywhere I've read, including the book I'm following (Jolliffe, Principal Component Analysis), there is no such assumption. Thanks in advance for the clarification.

1 Answer


The expansion you wrote for the LHS holds only when the random variables $X_i$ are uncorrelated. In the general case, you have to compute the variance of $\sum_{j=1}^pa_jX_j$, which equals $\sum_{i,j=1}^pa_ia_jE[X_iX_j]-\sum_{i,j=1}^pa_iE[X_i]a_jE[X_j]=\sum_{i,j=1}^pa_ia_j\operatorname{cov}(X_i,X_j).$ This double sum is exactly the quadratic form $\alpha_1^{'}\Sigma\alpha_1$, since the $(i,j)$ entry of $\Sigma$ is $\operatorname{cov}(X_i,X_j)$.
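For concreteness, here is a minimal worked case with $p=2$ (my own illustration, just expanding the definitions above):
$$\operatorname{Var}(a_1X_1+a_2X_2)=a_1^2\operatorname{Var}(X_1)+a_2^2\operatorname{Var}(X_2)+2a_1a_2\operatorname{cov}(X_1,X_2)$$
$$=\begin{pmatrix}a_1 & a_2\end{pmatrix}\begin{pmatrix}\operatorname{Var}(X_1) & \operatorname{cov}(X_1,X_2)\\ \operatorname{cov}(X_1,X_2) & \operatorname{Var}(X_2)\end{pmatrix}\begin{pmatrix}a_1\\ a_2\end{pmatrix}=\alpha_1^{'}\Sigma\alpha_1.$$
The diagonal-only expansion in the question drops the cross term $2a_1a_2\operatorname{cov}(X_1,X_2)$, which vanishes precisely when $X_1$ and $X_2$ are uncorrelated.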

  • You're right, I forgot about the assumption behind my expansion of the LHS and thought it held in general. Thanks. (2012-10-04)