
I need to prove that $\hat{β}_1 = \dfrac{\sum_{i=1}^nx_iy_i}{\sum_{i=1}^nx_i^2}$. I have not seen this as a definition for $\hat{β}_1$ before and am having trouble even starting this proof, but it must have something to do with the least-squares normal equations and the least-squares estimators.

Any thoughts?

  • Please note that it is not encouraged behaviour to [delete a question](http://math.stackexchange.com/questions/2118403/least-squares-hat%CE%B2-1-problem) that has been put on hold in order to post it back without having changed anything. (2017-01-30)
  • In this case the primary fault may lie with those who voted to close the first try without understanding the (pretty clearly stated) issue at hand. See the offset paragraph in my Answer. I try to resist voting to close questions on topics with which I am not really familiar. (2017-01-30)

1 Answer


Denote the data by $(X_i, Y_i),$ for $i = 1, 2, \dots, n.$ The least-squares line gives fitted values $\hat Y_i = \hat\beta_0 + \hat\beta_1 X_i.$ You need to minimize $Q = \sum_i(Y_i - \hat Y_i)^2.$

To do this, set the partial derivative of $Q$ with respect to $\hat \beta_1$ equal to $0$ and solve for $\hat \beta_1$ in terms of the $X_i$ and $Y_i.$ That is, solve $\frac{\partial Q}{\partial \hat \beta_1} = 0.$
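For the model with an intercept, the stationarity condition in $\hat\beta_1$ works out to (a sketch, using the definitions above):

$$Q = \sum_{i=1}^n \left(Y_i - \hat\beta_0 - \hat\beta_1 X_i\right)^2, \qquad \frac{\partial Q}{\partial \hat\beta_1} = -2\sum_{i=1}^n X_i\left(Y_i - \hat\beta_0 - \hat\beta_1 X_i\right) = 0.$$

This is one of the two least-squares normal equations (the other comes from $\partial Q/\partial \hat\beta_0 = 0$).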

Please check the context of this exercise carefully. It may be that your text has defined $x_i = (X_i - \bar X)$ and $y_i = (Y_i - \bar Y).$ This convention is especially common in the UK, Australia, and New Zealand.
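If that is the convention in play, then $\sum_i x_i = \sum_i y_i = 0,$ the fitted intercept for the centered variables is $0,$ and the minimization reduces to a one-parameter problem that yields exactly the formula you were asked to prove:

$$Q = \sum_{i=1}^n \left(y_i - \hat\beta_1 x_i\right)^2, \qquad \frac{\partial Q}{\partial \hat\beta_1} = -2\sum_{i=1}^n x_i\left(y_i - \hat\beta_1 x_i\right) = 0 \;\Longrightarrow\; \hat\beta_1 = \frac{\sum_{i=1}^n x_i y_i}{\sum_{i=1}^n x_i^2}.$$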

Notes: When finding $\frac{\partial Q}{\partial \hat \beta_1}\!:\,$ (1) The data $X_i$ and $Y_i$ are treated as constants. (2) $\hat \beta_0$ is also treated as a constant. These comments are obvious, but in my experience temporarily forgetting (1) or (2) accounts for the majority of errors in the minimization procedure.
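If you want to sanity-check the centered-data formula numerically, here is a minimal sketch (the data are made up purely for illustration): the closed-form slope $\sum_i x_i y_i / \sum_i x_i^2$ computed from centered data should agree with the slope of an ordinary least-squares fit with an intercept.

```python
import numpy as np

# Hypothetical data, just for illustration.
rng = np.random.default_rng(0)
X = rng.normal(size=50)
Y = 2.5 * X + 1.0 + rng.normal(scale=0.3, size=50)

# Center the data, matching the convention x_i = X_i - Xbar, y_i = Y_i - Ybar.
x = X - X.mean()
y = Y - Y.mean()

# Slope from the closed-form expression for centered data.
beta1_hat = np.sum(x * y) / np.sum(x ** 2)

# Slope from an ordinary least-squares fit with an intercept.
slope, intercept = np.polyfit(X, Y, 1)

print(beta1_hat, slope)  # the two slopes agree (up to floating-point error)
```

Centering removes the intercept from the problem without changing the slope, which is why the two computations coincide.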