I am confused about the difference between standard deviation and standard error. When do I use which? From my lecture:

  • Standard deviation quantifies spread of data about mean
  • Standard error measures spread of all means
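The distinction in those two bullets can be sketched numerically. This is a minimal illustration with made-up timing values, not data from the question:

```python
import statistics

# Hypothetical repeated timings (seconds) for one distance -- placeholder values
times = [2.31, 2.28, 2.35]

n = len(times)
mean_t = statistics.mean(times)   # best estimate of the true time
sd = statistics.stdev(times)      # standard deviation: spread of the individual readings
se = sd / n ** 0.5                # standard error: expected spread of the mean itself

print(mean_t, sd, se)
```

The standard error is always smaller than the standard deviation by a factor of $\sqrt{n}$: averaging $n$ readings pins down the mean more tightly than any single reading pins down the true value.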

I am actually learning this in my physics data-analysis lesson. So suppose I have a set of measurements of distance ($D$) versus time ($t$) and I want to compute the average velocity:

$t_{avg}$ is calculated as $(t_1 + t_2 + t_3)/3$ (of course I used Excel's AVERAGE here). Then the average velocity was $D/t_{avg}$.

Then I was asked which of measurements #1-10 I think is the closest approximation to the actual velocity. My thinking was to find the standard error for each, and the one with the lowest standard error is the closest approximation. Is this a reasonable thing to do? I found the standard error as STDEV(t_1, t_2, t_3)/SQRT(3). Is this correct?
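For reference, the spreadsheet formula above corresponds to the following calculation. Note that Excel's STDEV is the *sample* standard deviation (with $n-1$ in the denominator); the timings here are placeholder values:

```python
import math

# Hypothetical timings t_1, t_2, t_3 for one measurement (placeholder values)
t = [4.10, 4.25, 4.18]
n = len(t)

# Sample standard deviation, matching Excel's STDEV: n - 1 in the denominator
mean_t = sum(t) / n
stdev = math.sqrt(sum((x - mean_t) ** 2 for x in t) / (n - 1))

# Standard error of the mean, matching STDEV(t_1, t_2, t_3)/SQRT(3)
se = stdev / math.sqrt(n)

print(stdev, se)
```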

Here's my confusion: the standard error is defined as the spread of all means, yet I am calculating it from the spread of the measured data, not from a set of means. How does that work?
