I have a set of $M$ normally distributed random variables, $r_i$, each with an associated mean $u_i$ but the same variance $\sigma^2$. What is the variance of the average of these $M$ random variables, $\frac{\sum_{i=1}^{M} r_i}{M}$? How does the variance change as $M$ increases? What if the $M$ variables have a uniform, rather than a normal, distribution over some interval $[A, B]$?
The variance in the average of a set of normally distributed random variables
probability
random
normal-distribution
-
If the means $u_i$ are _constants_, then your statement that "the mean of the sum is the sum of the means, and that the mean is going to be normally distributed by the central limit theorem" makes no sense. The mean of the sum is a constant, and not a random variable. – 2012-08-23
1 Answer
Assuming the $M$ variables are independent, the average has a normal distribution with mean equal to the average of the $u_i$'s, as you guessed, and variance $\sigma^2/M$. The same formulas for the mean and variance hold in the uniform case (for a uniform on $[A, B]$ the common variance is $(B-A)^2/12$, so the average has variance $(B-A)^2/(12M)$), but the average is no longer normally distributed; its distribution is supported on $[A, B]$. If you define all the uniforms over the same interval $[A, B]$, they are i.i.d., and the appropriately normalized average converges to a normal distribution by the central limit theorem.
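As a quick sketch of where the $\sigma^2/M$ comes from, assuming the $r_i$ are independent with common variance $\sigma^2$:
$$\operatorname{Var}\!\left(\frac{1}{M}\sum_{i=1}^{M} r_i\right)=\frac{1}{M^2}\sum_{i=1}^{M}\operatorname{Var}(r_i)=\frac{M\sigma^2}{M^2}=\frac{\sigma^2}{M}.$$
In particular, the variance of the average shrinks like $1/M$ and tends to $0$ as $M$ increases.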
-
@DilipSarwate The appropriate normalization involves multiplying by the square root of $n$. It is easy to find the statement of the central limit theorem in almost any book on statistics. I agreed that my statement needed qualification. My reference to you obviously knowing had to do with the way you made your comment, and not with whether or not my statement would be clear to the general audience on this site. – 2012-08-23
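For reference, one standard way to write the normalization in question (here using $M$ for the sample size to match the question, and assuming the $r_i$ are i.i.d. with mean $\mu$ and variance $\sigma^2$, with $\bar r_M = \frac{1}{M}\sum_{i=1}^{M} r_i$):
$$\sqrt{M}\,\frac{\bar r_M - \mu}{\sigma} \;\xrightarrow{\,d\,}\; N(0,1),$$
which is the multiplication by the square root of the sample size mentioned above.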