
In my book, it is mentioned that if $X_1, ..., X_n: \Omega \rightarrow \Bbb R$ are random variables such that $P_{X_1} = P_{X_k}$ for $k = 1, ..., n$, then

$$Var(X_1) + ... + Var(X_n) = n Var(X_1),$$

which would mean that $Var(X_1) = Var(X_k)$ for $k = 1, ..., n$. It seems that having the same distribution makes the random variables have the same variance, but I cannot remember why this is the case. Can anyone help me out?

  • Hint (a strong one): How do you compute the variance of each $X_k$?
  • Well, as far as I remember, we have that $Var(X_k) = E(X^2_k) - (E(X_k))^2.$
  • Good, and now, how do you compute $E(X_k^2)$, say?
  • $E(X^2_k) = \sum_{\omega \in \Omega} X^2_k(\omega) P(\omega)$?
  • This would work for finite sample spaces $\Omega$ only. Here, you need a more serious definition of $E(X^2)$, I am afraid, one that involves $P_X$.
  • I also know something like $E(X) = \int_{\Bbb R} t \, dP_X(t)$.
  • Excellent, and $E(X^2)=\int_{\Bbb R}$ $__$ $dP_X(t)$, hence the variance of $X$ only depends on $___$ and you are done.
  • Perhaps write and post an answer?

0 Answers