
Pretty basic question I'm sure:

I have 100 histograms; they were all created by randomly sampling the same data 100 times.

Now, I want to show the 'average histogram' with markers at +1 standard deviation.

I know how to compute the average - just add bin-wise and divide by the number of histograms.

But for the standard deviation: do I use all the bin-heights from all 100 histograms to calculate the standard deviation, or do I use the bin-heights from my 'average histogram' to calculate it?

thanks.

1 Answer


The (sample) standard deviation is defined as $\hat{\sigma} = \sqrt{\overline{x^2} - \overline{x}^2}$. Normally you would calculate the average and standard deviation separately for each bin, using that bin's heights across all 100 histograms. For each bin, $\overline{x}$ is the value of that bin in the average histogram, while $\overline{x^2}$ is the average of the squares of that bin's values over all the histograms. In other words: use the bin heights from all 100 histograms, not just the heights of the average histogram.
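Here is a minimal sketch of the per-bin calculation, assuming the 100 histograms share the same bin edges and are stored as rows of a NumPy array; the data, bin edges, and variable names (`counts`, `bin_mean`, `bin_std`) are illustrative, not from the question.

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=10_000)          # stand-in for the underlying data set
bins = np.linspace(-4, 4, 21)           # common bin edges for every histogram

# Build 100 histograms from 100 random resamples of the data.
counts = np.array([
    np.histogram(rng.choice(data, size=data.size, replace=True), bins=bins)[0]
    for _ in range(100)
])                                      # shape: (100, n_bins)

bin_mean = counts.mean(axis=0)          # "average histogram": mean height in each bin
bin_std = counts.std(axis=0)            # per-bin standard deviation across the 100 histograms

# Markers one standard deviation above (and below) the average histogram:
upper = bin_mean + bin_std
lower = bin_mean - bin_std
```

Note that the standard deviation is taken along `axis=0`, i.e. across histograms within each bin, which is exactly the "all 100 bin heights per bin" computation described above.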

  • I have taken the liberty of correcting a couple of typos in this answer which might have been confusing. (2011-08-22)