
Suppose $X_1,\ldots,X_n$ are independent random variables with $E(X_i)=\mu$ and $\operatorname{Var}(X_i)=\sigma^2$ for all $i=1,\ldots,n$. Let $S_k=X_1+\cdots+X_k$. Find $\rho(S_k,S_n)$.

So I know that $\rho(S_k,S_n)=\operatorname{Cov}(S_k,S_n)/\sigma_k \sigma_n$ and $\operatorname{Cov}(S_k,S_n)=E(S_kS_n)-E(S_k)E(S_n)$. Since $S_k=X_1+\cdots+X_k$, we have $S_n=X_1+\cdots+X_n$. I'm not sure how to start this.


2 Answers


There are significantly more efficient ways to go, but let's see how to complete your nearly complete calculation.

I assume you know how to find $\sigma_k$ and $\sigma_n$. Also $E(S_k)$ and $E(S_n)$ are no problem. So the only issue is computing $E(S_kS_n)$.

Note that $S_n=S_k+(X_{k+1}+X_{k+2}+\cdots +X_n)$. Let $Y=X_{k+1}+X_{k+2}+\cdots +X_n$. Then $S_kS_n=S_k^2+ S_kY$. Because $S_k$ and $Y$ are independent, we have $E(S_kS_n)=E(S_k^2)+E(S_k)E(Y).$ The term $E(S_k^2)$ is easy to compute: it equals $\operatorname{Var}(S_k)+[E(S_k)]^2=k\sigma^2+k^2\mu^2$. And there is no difficulty finding $E(Y)=(n-k)\mu$. Now put the pieces together.
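As a sanity check of the decomposition $E(S_kS_n)=E(S_k^2)+E(S_k)E(Y)$, here is a small Monte Carlo sketch. The values $\mu=2$, $\sigma=3$, $n=10$, $k=4$, and the choice of normal draws, are illustrative assumptions (the problem only fixes the mean and variance):

```python
import numpy as np

# Illustrative parameters (not from the problem): mu = 2, sigma = 3, n = 10, k = 4.
rng = np.random.default_rng(0)
mu, sigma, n, k = 2.0, 3.0, 10, 4

# Draw many independent samples of (X_1, ..., X_n) and form the partial sums.
X = rng.normal(mu, sigma, size=(200_000, n))
S_k = X[:, :k].sum(axis=1)
S_n = X.sum(axis=1)

# Left side: Monte Carlo estimate of E(S_k S_n).
lhs = np.mean(S_k * S_n)

# Right side: E(S_k^2) + E(S_k) E(Y)
#   = [k*sigma^2 + (k*mu)^2] + (k*mu) * ((n-k)*mu) = k*sigma^2 + k*n*mu^2.
rhs = k * sigma**2 + k * n * mu**2

print(lhs, rhs)  # the two should agree up to simulation noise
```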


Given independence, and using bilinearity of covariance: $ \operatorname{Cov}\left(S_k, S_n\right) = \sum_{p=1}^k \sum_{q=1}^n \operatorname{Cov}\left(X_p, X_q\right) = \sum_{p=1}^k \sum_{q=1}^n \delta_{p,q} \operatorname{Var}(X_p) = \sum_{p=1}^{\min(k,n)} \operatorname{Var}(X_p) = \sigma^2 \min(k,n). $ For $k \le n$, this gives $\rho(S_k,S_n) = \dfrac{k\sigma^2}{\sigma\sqrt{k}\,\sigma\sqrt{n}} = \sqrt{k/n}$.
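The resulting correlation $\sqrt{k/n}$ is easy to verify numerically. A minimal sketch, again with illustrative values ($\mu=1$, $\sigma=2$, $n=25$, $k=9$) and normal draws chosen only for convenience:

```python
import numpy as np

# Illustrative parameters (not from the problem): mu = 1, sigma = 2, n = 25, k = 9.
rng = np.random.default_rng(1)
mu, sigma, n, k = 1.0, 2.0, 25, 9

# Simulate the partial sums S_k and S_n from independent draws.
X = rng.normal(mu, sigma, size=(200_000, n))
S_k = X[:, :k].sum(axis=1)
S_n = X.sum(axis=1)

# Sample correlation vs. the closed form sqrt(k/n).
rho_hat = np.corrcoef(S_k, S_n)[0, 1]
rho_theory = np.sqrt(k / n)  # = k*sigma^2 / (sigma*sqrt(k) * sigma*sqrt(n))

print(rho_hat, rho_theory)  # should agree up to simulation noise
```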