
I'd like to learn how to evaluate the error I make when I compute algebraic expressions and round the intermediate steps. For example, I had data on couples' heights: (167,183), (165,165), (167,178), (163,173), (178,180), (165,173). I computed the Pearson correlation of those data points to be $\frac{\frac{85}{5}}{\sqrt{\frac{287}{10}}\sqrt{41}}.$ But suppose I compute the correlation with a table whose columns are, say, $x_i, y_i, x_i-\bar{x}, y_i-\bar{y}, (x_i-\bar{x})(y_i-\bar{y}), (x_i-\bar{x})^2, (y_i-\bar{y})^2$, and for every entry I round the result to, for example, four decimals. What can I then say about the error $\epsilon$ of the correlation coefficient? Conversely, if $\epsilon$ must be less than, say, $0.01$, how many digits should I keep in the intermediate results? I would like to see a tutorial, lecture notes, or a book on evaluating such errors.

Also, if a person's height is recorded as 167 in this data, the true height $h$ only satisfies $166.5 \le h < 167.5$, so I'd like to learn how much that affects the correlation.
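For concreteness, here is a small sketch (not part of the original question) of the experiment being asked about: it computes the correlation once with exact rational intermediates via Python's `fractions`, and once with every table entry rounded to a chosen number of decimals, so the two results can be compared directly.

```python
from fractions import Fraction

# Couples' heights from the question, as (x, y) pairs.
data = [(167, 183), (165, 165), (167, 178), (163, 173), (178, 180), (165, 173)]
xs = [Fraction(x) for x, _ in data]
ys = [Fraction(y) for _, y in data]

def pearson(xs, ys, nd=None):
    """Pearson correlation.  If nd is given, every intermediate table
    entry (means, deviations, products, squares) is rounded to nd
    decimal places, imitating a hand computation with a rounded table."""
    rnd = (lambda v: round(float(v), nd)) if nd is not None else (lambda v: v)
    n = len(xs)
    mx, my = rnd(sum(xs) / n), rnd(sum(ys) / n)
    dx = [rnd(x - mx) for x in xs]
    dy = [rnd(y - my) for y in ys]
    sxy = sum(rnd(a * b) for a, b in zip(dx, dy))
    sxx = sum(rnd(a * a) for a in dx)
    syy = sum(rnd(b * b) for b in dy)
    # The 1/(n-1) factors of the sample (co)variances cancel in r,
    # so the raw sums can be used directly.
    return float(sxy) / (float(sxx) ** 0.5 * float(syy) ** 0.5)

exact = pearson(xs, ys)  # all intermediates kept as exact fractions
for nd in (4, 2, 1):
    print(f"{nd} decimals: error {abs(pearson(xs, ys, nd=nd) - exact):.2e}")
```

With four decimals the error stays far below $0.01$ for this data; rerunning with fewer decimals shows how the error grows as the table is rounded more coarsely.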

1 Answer


You might look up "interval arithmetic".
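To illustrate the suggestion, here is a minimal hand-rolled sketch (a real library such as mpmath's interval context would also round endpoints outward, which this sketch omits): each recorded height $h$ is replaced by the interval $[h-0.5,\,h+0.5]$, and the whole correlation formula is evaluated in interval arithmetic, giving guaranteed bounds on $r$.

```python
import math

class Interval:
    """Minimal closed-interval arithmetic [lo, hi]."""
    def __init__(self, lo, hi=None):
        self.lo = lo
        self.hi = lo if hi is None else hi
    def __add__(self, o):
        o = _iv(o)
        return Interval(self.lo + o.lo, self.hi + o.hi)
    __radd__ = __add__
    def __sub__(self, o):
        o = _iv(o)
        return Interval(self.lo - o.hi, self.hi - o.lo)
    def __mul__(self, o):
        o = _iv(o)
        ps = (self.lo * o.lo, self.lo * o.hi, self.hi * o.lo, self.hi * o.hi)
        return Interval(min(ps), max(ps))
    def __truediv__(self, o):
        o = _iv(o)
        assert o.lo > 0, "only positive divisors are needed here"
        return self * Interval(1 / o.hi, 1 / o.lo)
    def sq(self):
        # Tight square: unlike self * self, this uses the fact that both
        # factors are the same number, so the result is never negative.
        lo, hi = sorted((self.lo * self.lo, self.hi * self.hi))
        return Interval(0.0 if self.lo <= 0 <= self.hi else lo, hi)
    def sqrt(self):
        assert self.lo >= 0
        return Interval(math.sqrt(self.lo), math.sqrt(self.hi))

def _iv(v):
    return v if isinstance(v, Interval) else Interval(v, v)

# A recorded height h means the true height lies in [h - 0.5, h + 0.5].
data = [(167, 183), (165, 165), (167, 178), (163, 173), (178, 180), (165, 173)]
xs = [Interval(x - 0.5, x + 0.5) for x, _ in data]
ys = [Interval(y - 0.5, y + 0.5) for _, y in data]

mx = sum(xs) / len(xs)
my = sum(ys) / len(ys)
dx = [x - mx for x in xs]
dy = [y - my for y in ys]
r = sum(a * b for a, b in zip(dx, dy)) / (
    sum(a.sq() for a in dx).sqrt() * sum(b.sq() for b in dy).sqrt())
print(f"r is guaranteed to lie in [{r.lo:.3f}, {r.hi:.3f}]")
```

The resulting enclosure is valid but quite loose, because naive interval arithmetic treats every occurrence of $x_i$ (in the data and inside $\bar{x}$) as an independent interval; this "dependency problem" is the standard caveat with the technique.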