
In my introductory physics / data-analysis lecture, there is some mention of "linear least-squares fits":

Gradient

$m = \frac{1}{\Delta} (N \sum x_i y_i - \sum x_i \sum y_i)$

$\sigma_m = \sqrt{\frac{N \sigma_y^2}{\Delta}}$

y-intercept

$c = \frac{1}{\Delta} \left(\sum x_{i}^2 \sum y_i - \sum x_i \sum x_i y_i\right)$

$\sigma_c = \sqrt{\frac{\sigma_y^2}{\Delta} \sum x_{i}^{2}}$

Uncertainty for measured y

$\sigma_y = \sqrt{\frac{1}{N-2} \sum (y_i - mx_i -c)^2}$

$\Delta = N \sum x_{i}^{2} - (\sum x_i)^2$

Coefficient of determination

$r^2 = \frac{(N \sum x_i y_i - \sum x_i \sum y_i)^2}{\left(N \sum x_{i}^{2} - (\sum x_i)^2\right) \left(N \sum y_i^2 - (\sum y_i)^2\right)}$

  • $r^2 = 0$: no correlation
  • $r^2 = 1$: perfect correlation

My main question is: what is $\Delta$?

And if it's simple, it would be good to have a little understanding of how these formulas are derived. Just something very rough, or a simple explanation.
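
For reference, here is a minimal numerical sketch (Python/NumPy, with made-up example data and variable names of my own choosing) that simply evaluates the formulas above and cross-checks the slope and intercept against `numpy.polyfit`:

```python
import numpy as np

# made-up example data: roughly y = 2x + 1 with some scatter
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([3.1, 4.9, 7.2, 8.8, 11.1, 12.9])
N = len(x)

Delta = N * np.sum(x**2) - np.sum(x)**2                               # Δ = NΣx² − (Σx)²
m = (N * np.sum(x * y) - np.sum(x) * np.sum(y)) / Delta               # gradient
c = (np.sum(x**2) * np.sum(y) - np.sum(x) * np.sum(x * y)) / Delta    # y-intercept

sigma_y = np.sqrt(np.sum((y - m * x - c)**2) / (N - 2))               # scatter about the line
sigma_m = sigma_y * np.sqrt(N / Delta)                                # uncertainty in gradient
sigma_c = sigma_y * np.sqrt(np.sum(x**2) / Delta)                     # uncertainty in intercept

r2 = (N * np.sum(x * y) - np.sum(x) * np.sum(y))**2 / (
    (N * np.sum(x**2) - np.sum(x)**2) * (N * np.sum(y**2) - np.sum(y)**2))

print(f"m = {m:.4f} ± {sigma_m:.4f}, c = {c:.4f} ± {sigma_c:.4f}, r² = {r2:.4f}")
print("numpy.polyfit check:", np.polyfit(x, y, 1))  # should match (m, c)
```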

1 Answer


$\Delta$ is essentially a scaled sample variance of the $x$ values, $\Delta = N \sum_i (x_i - \bar{x})^2$. Please have a look at:

http://en.wikipedia.org/wiki/Correlation_and_dependence

Least-squares fitting chooses the straight line so that the mean squared error between the measured values $y_i$ and the values $m x_i + c$ predicted by the line is minimized.

To determine the best-fit parameters $m$ and $c$, one takes the derivatives of this objective function with respect to $m$ and $c$ and sets them equal to zero; the fitted line is the solution of this minimization.

The formulas for the gradient $m$ and the intercept $c$ quoted above are exactly what comes out of this differentiation.
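
To spell this out (this is just the standard derivation, sketched in the notation of the question): minimizing $S(m, c) = \sum_i (y_i - m x_i - c)^2$ means requiring $\partial S / \partial m = 0$ and $\partial S / \partial c = 0$, which gives the two "normal equations"

$m \sum x_i^2 + c \sum x_i = \sum x_i y_i$

$m \sum x_i + c N = \sum y_i$

Solving this $2 \times 2$ linear system (for instance with Cramer's rule) yields exactly the formulas for $m$ and $c$ in the question, and $\Delta = N \sum x_i^2 - (\sum x_i)^2$ is nothing but the determinant of that system, which is why it appears in every denominator.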

The uncertainty for the measured $y$ indicates the quality of the fit, as it measures the scatter of the measured values $y_i$ about the linear fit $m x_i + c$.

The coefficient $r^2$ gives you another measure of the goodness of the fit; $r^2 = 1$ corresponds to a perfect fit!
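
As a quick sanity check (a sketch assuming Python/NumPy, with made-up example data), the $r^2$ formula quoted in the question is just the squared Pearson correlation coefficient of $x$ and $y$:

```python
import numpy as np

# made-up example data
x = np.array([1.0, 2.0, 3.0, 4.0, 5.0, 6.0])
y = np.array([3.1, 4.9, 7.2, 8.8, 11.1, 12.9])
N = len(x)

r2_sum_formula = (N * np.sum(x * y) - np.sum(x) * np.sum(y))**2 / (
    (N * np.sum(x**2) - np.sum(x)**2) * (N * np.sum(y**2) - np.sum(y)**2))
r2_corrcoef = np.corrcoef(x, y)[0, 1]**2   # squared Pearson correlation

print(r2_sum_formula, r2_corrcoef)  # the two values agree
```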

  • @JiewMeng When you look at the second $r_{xy}$, there are two terms under the square root in the denominator. Pick one of them, say the one which is $\sqrt{n\sum_i x_i^2-(\sum_i x_i)^2}=\sigma$; then what you have is exactly $\sigma^2=\Delta$. (2012-08-26)