The question is:
Assuming that $y_i = \mu + \epsilon_i$, $i = 1,\ldots,n$, with independent and identically distributed errors $\epsilon_i$ such that $E[\epsilon_i] = 0$ and $Var[\epsilon_i] = \sigma^2$, find the least squares estimator of $\mu$. Find its variance.
I'm not sure how to go about doing this.
I know that the least squares part means minimizing the sum of squared errors, so I would have to use the formula:
$\sum_i (y_i - \mu)^2$
and then differentiate (with respect to $\mu$?) and set the result equal to 0.
Is that correct?
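To check whether this setup makes sense, I put together a small numerical sketch: it just minimizes $\sum_i (y_i - \mu)^2$ over $\mu$ by a generic optimizer, with the values of $\mu$, $\sigma$ and $n$ made up purely for the experiment.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Simulate one data set y_i = mu + eps_i (mu_true, sigma, n are arbitrary choices for the check)
rng = np.random.default_rng(0)
mu_true, sigma, n = 5.0, 2.0, 50
y = mu_true + rng.normal(0.0, sigma, size=n)

# The sum of squares I plan to differentiate, as a function of mu
def sum_of_squares(mu):
    return np.sum((y - mu) ** 2)

# Minimize it numerically instead of by hand, just to see what value of mu comes out
result = minimize_scalar(sum_of_squares)
print("numerical minimizer of the sum of squares:", result.x)
```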
Once I've done this, how would I calculate its expectation $E[\hat{\mu}]$ and its variance, given that I don't have any definition for $\mu$? Or is $\mu = \beta_0 + \beta_1 \cdot x_i$? If it is, isn't the estimator then the same as in ordinary regression?
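For the variance part, the best I could do so far is look at it empirically by repeating the same least-squares fit on many simulated samples (again with made-up $\mu$, $\sigma$ and $n$) and taking the spread of the resulting estimates; I'm just not sure how to turn this into a theoretical expression.

```python
import numpy as np
from scipy.optimize import minimize_scalar

# Repeat the least-squares fit on many simulated data sets and look at the
# empirical variance of the resulting estimates (mu_true, sigma, n are arbitrary).
rng = np.random.default_rng(1)
mu_true, sigma, n, reps = 5.0, 2.0, 50, 2000

estimates = []
for _ in range(reps):
    y = mu_true + rng.normal(0.0, sigma, size=n)
    res = minimize_scalar(lambda mu: np.sum((y - mu) ** 2))
    estimates.append(res.x)

print("empirical variance of the estimator:", np.var(estimates))
```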