Pretty basic question I'm sure:
I have 100 histograms, all created by randomly sampling the same data 100 times.
Now, I want to show the 'average histogram' with markers at +1 standard deviation.
I know how to compute the average - just add bin-wise and divide by the number of histograms.
But for the standard deviation: do I compute it, for each bin, from the 100 bin heights of that bin across the individual histograms, or do I compute it from the bin heights of my 'average histogram'?
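For concreteness, here is a minimal sketch of my setup in NumPy (all names and the bin edges are just illustrative; I'm assuming every histogram shares the same bin edges and the histograms are stacked as rows of a 2D array). The two things I'm deciding between are the last two lines:

```python
import numpy as np

rng = np.random.default_rng(0)
data = rng.normal(size=10_000)       # the underlying data
n_hist, n_bins = 100, 20
edges = np.linspace(-4, 4, n_bins + 1)

# build 100 histograms, each from a random resample of the same data
hists = np.array([
    np.histogram(rng.choice(data, size=data.size, replace=True),
                 bins=edges)[0]
    for _ in range(n_hist)
])                                   # shape: (100, n_bins)

# the 'average histogram': add bin-wise, divide by the number of histograms
avg_hist = hists.mean(axis=0)        # shape: (n_bins,)

# option A: per-bin std across the 100 histograms -> one value per bin
std_per_bin = hists.std(axis=0, ddof=1)   # shape: (n_bins,)

# option B: std of the average histogram's own bin heights -> one number
std_of_avg = avg_hist.std(ddof=1)         # scalar
```

Option A gives a separate spread estimate for each bin (one error bar per bin), while option B collapses everything to a single number describing how the averaged heights vary across bins.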
thanks.