
I've calculated the expected value using indicator variables, and I'd like to find the variance as well.

$V(X) = E[X^2] - E[X]^2$

I need to calculate $E[X^2]$.

Is there a way to get from the $X_i$'s to $E[X^2]$?

======

EDIT: The question from the textbook is: when rolling a die 20 times, what is the expected number of times you roll a 5 or a 6? So each indicator $X_i$ is for the $i$th roll, with expected value $\frac13$, which means $E[X] = 20 \cdot \frac13$. I know this is a binomial distribution and I can get the variance using $np(1-p)$, but I'd like to derive it using the variance formula.

  • So $X_i = 1$ w.p. $p$ and $X_i = 0$ w.p. $1-p$, right? Do you also assume independence? And I suppose that $X = \sum_i X_i$? (2017-02-28)
  • I don't believe there is a universal method. Indeed, it's often a lot easier to get the expected value than the variance. (2017-02-28)
  • That specific problem is just a binomial distribution with success probability $\frac13$. The variance is well known. (2017-02-28)
  • @lulu I understand that; I'm trying to wrap my head around calculating the variance in the general sense. (2017-02-28)
  • Please use [$\rm \LaTeX$](http://meta.math.stackexchange.com/q/5020/290189) for the edited part. (2017-02-28)

3 Answers

1

For a binomial distribution, we use the bilinearity of covariance together with the fact that the indicator random variables mark success in each of the $n$ independent Bernoulli trials, all with the same success probability $p$.

$$\begin{align}\mathsf {Var}(X) &= \sum_{k=1}^n\sum_{h=1}^n \mathsf {Cov}(X_k, X_h) \\ &= \sum_{k=1}^n\mathsf{Var}(X_k) +2\sum_{k=1}^{n-1}\sum_{h=k+1}^n \mathsf{Cov}(X_k,X_h) \\ &= n(\mathsf E(X_1^2)-\mathsf E(X_1)^2)+0\end{align}$$

Alternatively, the same result follows from the definition of variance, using independence ($\mathsf E(X_kX_h)=\mathsf E(X_k)\mathsf E(X_h)$ for $k\neq h$) to cancel the cross terms.

$$\begin{align}\mathsf {Var}(X) &= \mathsf E(X^2)-\mathsf E(X)^2 \\ &= \mathsf E((\sum_{k=1}^n X_k)(\sum_{h=1}^n X_h))-(\mathsf E(\sum_{k=1}^n X_k))^2\\ & =\sum_{k=1}^n\mathsf E(X_k^2)+2\sum_{k=1}^{n-1}\sum_{h=k+1}^n\mathsf E(X_kX_h)-\sum_{k=1}^n\mathsf E(X_k)^2-2\sum_{k=1}^{n-1}\sum_{h=k+1}^n\mathsf E(X_k)\mathsf E(X_h)\\ &= n\mathsf E(X_1^2)-n\mathsf E(X_1)^2\end{align}$$

Now from the definition of expectation:

$$\begin{align}\mathsf E(X_1) &= 1\cdot\mathsf P(X_1=1)+0\cdot\mathsf P(X_1=0) \\[1ex] &= p \\[2ex]\mathsf E(X_1^2) &= 1^2\cdot\mathsf P(X_1=1)+0^2\cdot\mathsf P(X_1=0) \\[1ex] &= p\end{align}$$

Putting it together: $\mathsf{Var}(X) = n(p - p^2) = np(1-p)$, which for $n=20$, $p=\tfrac13$ gives $\tfrac{40}{9}$.
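As a sanity check (my own sketch, not part of the original answer), the closed-form value $np(1-p)$ for the dice problem can be compared against a Monte Carlo estimate; the seed and sample size below are arbitrary choices:

```python
import random

# Problem parameters: 20 rolls, success = rolling a 5 or 6, so p = 1/3.
n, p = 20, 1 / 3

# Closed-form variance from the indicator derivation: n * (E[X1^2] - E[X1]^2).
var_formula = n * (p - p ** 2)  # = 40/9 ≈ 4.444

# Monte Carlo estimate of Var(X) over many simulated 20-roll experiments.
random.seed(0)                  # arbitrary seed for reproducibility
trials = 200_000                # arbitrary sample size
counts = [sum(1 for _ in range(n) if random.randint(1, 6) >= 5)
          for _ in range(trials)]
mean = sum(counts) / trials
var_sim = sum((c - mean) ** 2 for c in counts) / trials

print(var_formula)  # ≈ 4.444
print(var_sim)      # should be close to 4.444
```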

2

Hint:

Make use of: $$\text{Var}\Big(\sum_{i=1}^nX_i\Big)=\text{Cov}\Big(\sum_{i=1}^nX_i,\sum_{j=1}^nX_j\Big)=\sum_{i=1}^n\sum_{j=1}^n\text{Cov}(X_i,X_j)$$

If there is symmetry (the $X_i$ are exchangeable), then you find: $$\text{Var}\Big(\sum_{i=1}^nX_i\Big)=n\,\text{Cov}(X_1,X_1)+n(n-1)\,\text{Cov}(X_1,X_2)=n\,\text{Var}(X_1)+n(n-1)\,\text{Cov}(X_1,X_2)$$

If moreover the $X_i$ are uncorrelated, then $\text{Cov}(X_1,X_2)=0$ and you find: $$\text{Var}\Big(\sum_{i=1}^nX_i\Big)=n\,\text{Var}(X_1)$$
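To see why the $\text{Cov}(X_1,X_2)$ term matters when the indicators are correlated, here is a small exactly-computed example of my own construction (not from the answer): sampling without replacement, where the symmetric decomposition above still applies but the covariance is nonzero.

```python
from itertools import combinations
from fractions import Fraction

# Exchangeable but correlated indicators: draw n=3 items from a population of
# N=6 containing K=2 "successes", without replacement (hypothetical numbers).
N, K, n = 6, 2, 3
p = Fraction(K, N)

# Exact Var(sum of indicators) by enumerating all equally likely draws.
deck = [1] * K + [0] * (N - K)
draws = list(combinations(range(N), n))
totals = [sum(deck[i] for i in d) for d in draws]
mean = Fraction(sum(totals), len(draws))
var = Fraction(sum(t * t for t in totals), len(draws)) - mean ** 2

# The symmetric decomposition: n*Var(X1) + n*(n-1)*Cov(X1, X2).
var_x1 = p - p ** 2
# Without replacement, E[X1*X2] = (K/N) * ((K-1)/(N-1)), so:
cov_x1_x2 = Fraction(K, N) * Fraction(K - 1, N - 1) - p ** 2
var_decomp = n * var_x1 + n * (n - 1) * cov_x1_x2

print(var, var_decomp)  # both equal 2/5
```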

1

Since the variables are independent, you have: $$ \operatorname{Var}\left( \sum_{i=1}^{20} X_i \right) = \sum_{i=1}^{20} \operatorname{Var}(X_i) = \sum_{i=1}^{20} \left(E[X_i^2] - E[X_i]^2\right) = \sum_{i=1}^{20}\left(\frac13 - \frac19\right) = \frac{40}{9} $$
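The final sum is quickly checked in exact arithmetic (a small verification sketch using the numbers from this answer):

```python
from fractions import Fraction

# Each indicator has Var(X_i) = E[X_i^2] - E[X_i]^2 = 1/3 - 1/9.
per_roll = Fraction(1, 3) - Fraction(1, 9)

# Independence lets the 20 per-roll variances simply add up.
total_var = 20 * per_roll

print(total_var)  # 40/9, matching np(1-p) = 20 * (1/3) * (2/3)
```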