Let's say I take a ton of measurements of the diameter of a marble with a 0.001" resolution micrometer, and I calculate the standard deviation of the sample set. Would the standard deviation be a better measure of uncertainty than the resolution of the instrument?
Which is a better measure of uncertainty: standard deviation or resolution uncertainty?
Better for what purpose? – 2012-09-17
2 Answers
The resolution of the instrument only tells you that readings falling between tick marks cannot be clearly resolved; it says nothing about how consistently the results are recorded. Repeated measurements of an object of known length measure the recording-error variability directly, through the estimated standard deviation. That estimate will not be exact: it is subject to sampling variability (i.e. it is a sample estimate of a population parameter).
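A minimal simulation sketch (in Python, with hypothetical values for the true diameter, the recording-error SD, and the 0.001" resolution) of what that sampling variability looks like: each batch of 30 quantized readings gives a somewhat different sample SD.

```python
import numpy as np

rng = np.random.default_rng(0)

true_diameter = 0.5000   # hypothetical "known" diameter, inches
recording_sd  = 0.0004   # hypothetical true recording-error SD, inches
resolution    = 0.001    # micrometer resolution, inches

def measure(n):
    """Simulate n readings: true value + recording error, rounded to the resolution."""
    raw = true_diameter + rng.normal(0.0, recording_sd, size=n)
    return np.round(raw / resolution) * resolution

# The sample SD estimates the recording-error variability, but it is itself
# a random quantity: different samples of the same size give different estimates.
for _ in range(5):
    readings = measure(30)
    print(f"sample SD from 30 readings: {readings.std(ddof=1):.5f} in")
```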
The standard deviation is a measure of accuracy that reflects both measurement precision and bias in recording, whereas the instrument resolution only reflects precision, so the resolution alone can understate the total uncertainty. The sample variance is a statistically unbiased estimate of the population variance, which is what actually measures accuracy here.
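A quick sketch (with a hypothetical SD value) of the unbiasedness claim: averaged over many replications, the sample variance with Bessel's correction recovers the true variance, while the sample standard deviation is slightly biased low for small samples.

```python
import numpy as np

rng = np.random.default_rng(1)
true_sd = 0.0004              # hypothetical recording-error SD, inches
n, reps = 10, 200_000         # small samples, many replications

samples = rng.normal(0.0, true_sd, size=(reps, n))
s2 = samples.var(axis=1, ddof=1)   # sample variance with Bessel's correction
s  = samples.std(axis=1, ddof=1)   # sample standard deviation

print(f"true variance   : {true_sd**2:.3e}")
print(f"mean sample var : {s2.mean():.3e}   (unbiased)")
print(f"true SD         : {true_sd:.3e}")
print(f"mean sample SD  : {s.mean():.3e}   (slightly biased low)")
```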
The standard deviation tells you how precise the measurement is: it is essentially the root-mean-square size (the "energy", in signal-processing terms) of the random measurement error.
If instead you have actual bounds on the magnitude of the measurement error, you can use interval arithmetic. I love interval arithmetic, but it can produce overly pessimistic bounds.
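A toy sketch of what that pessimism looks like, using a minimal hand-rolled Interval class (not any particular library) and hypothetical readings bounded to ±0.0005" (half the 0.001" resolution): the worst-case bound on a sum grows linearly with the number of terms, while independent random errors would only combine in quadrature.

```python
from dataclasses import dataclass
import math

@dataclass
class Interval:
    lo: float
    hi: float
    def __add__(self, other):
        # Interval addition: worst-case bounds add directly.
        return Interval(self.lo + other.lo, self.hi + other.hi)
    @property
    def width(self):
        return self.hi - self.lo

# Ten readings of a nominal 0.500" marble, each known only to +/- 0.0005".
readings = [Interval(0.500 - 0.0005, 0.500 + 0.0005) for _ in range(10)]
total = readings[0]
for r in readings[1:]:
    total = total + r

# Worst-case half-width of the sum grows linearly: 10 * 0.0005 = 0.005 in.
print(f"interval half-width of the sum : {total.width / 2:.4f} in")

# Independent uniform errors on +/- 0.0005" have SD 0.0005/sqrt(3) and
# combine in quadrature, growing only like sqrt(10).
sd_one = 0.0005 / math.sqrt(3)
print(f"SD of the sum (quadrature)     : {math.sqrt(10) * sd_one:.4f} in")
```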