I'm revising for my exams and I want to check if I did this exercise correctly:
10 measurements were done using a certain tool. The average and standard deviation of the measurements using this tool are 0.4495 and 0.014 respectively. Test, at the 5% significance level, whether or not the average measurement value deviates from the true value 0.45, and interpret the result.
What I have is:
My null hypothesis is "the measurement doesn't deviate", i.e. $H_0: \mu = 0.45$.
Alternative hypothesis: "the measurement deviates", i.e. $H_1: \mu \neq 0.45$.
My test statistic, assuming the null hypothesis, is $ T = \frac{0.4495-0.45}{0.014/\sqrt{10}} \approx -0.113 $. Now, $ |T| = 0.113 < t_{9,\,0.025} = 2.262 $.
This is not enough evidence to reject the null hypothesis, so it just ends here?
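For what it's worth, here is a small Python sketch I used to double-check the numbers (assuming `scipy` is available; the variable names are my own, not from the exercise). It recomputes the test statistic from the summary statistics, looks up the two-sided critical value, and also gives the p-value:

```python
# Sanity check of the one-sample two-sided t-test from summary statistics.
from math import sqrt
from scipy.stats import t

n = 10
xbar, s = 0.4495, 0.014   # sample mean and sample standard deviation
mu0 = 0.45                # hypothesised true value
alpha = 0.05

T = (xbar - mu0) / (s / sqrt(n))          # test statistic, ~ t(n-1) under H0
t_crit = t.ppf(1 - alpha / 2, df=n - 1)   # two-sided critical value t_{9, 0.025}
p_value = 2 * t.sf(abs(T), df=n - 1)      # two-sided p-value

print(f"T = {T:.3f}, critical value = {t_crit:.3f}, p-value = {p_value:.3f}")
# Expected roughly: T ≈ -0.113, critical value ≈ 2.262, p ≈ 0.91 -> do not reject H0
```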
Also, I apologise if my English is sub-par; I'm not a native English speaker, so feel free to correct me.