My question is about the minimization of an error function with two parameters. The function measures the error of a regressor over a set of $N$ points $(x^t, r^t)$, and the two parameters are the regressor's weights $w_1$ and $w_0$:
$\frac{1}{N}\sum_{t=1}^{N}[r^t-(w_1x^t+w_0)]^2$
The minimum should be found by taking the partial derivatives of the error function above with respect to $w_1$ and $w_0$, setting them equal to $0$, and solving for the unknowns. However, I didn't reach the solutions given. The solutions should be:
$w_1=\frac{\sum_tx^tr^t-\frac{\sum_tx^t}{N}\frac{\sum_tr^t}{N}N}{\sum_t(x^t)^2-N\left(\frac{\sum_tx^t}{N}\right)^2}$

$w_0=\frac{\sum_tr^t}{N}-w_1\frac{\sum_tx^t}{N}$
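For concreteness, this is as far as I get. Writing $E(w_1,w_0)$ for the error function above, and assuming I differentiated correctly, setting the two partial derivatives to zero gives:

$\frac{\partial E}{\partial w_0}=-\frac{2}{N}\sum_{t=1}^{N}\left[r^t-(w_1x^t+w_0)\right]=0$

$\frac{\partial E}{\partial w_1}=-\frac{2}{N}\sum_{t=1}^{N}\left[r^t-(w_1x^t+w_0)\right]x^t=0$

But I don't see how to rearrange these two conditions into the expressions for $w_1$ and $w_0$ above.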
The formulas perform well in practice. But my question stands: can I reach them by taking the partial derivatives and setting them equal to $0$? Any help, even with just one of the two, would be appreciated. Thank you.
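For reference, this is the kind of quick numerical check I used to convince myself that the formulas work. It is a minimal sketch assuming NumPy, with synthetic data and variable names of my own choosing:

```python
import numpy as np

# Synthetic data: N points (x^t, r^t) scattered around a known line.
rng = np.random.default_rng(0)
N = 50
x = rng.uniform(0.0, 10.0, size=N)
r = 2.5 * x + 1.0 + rng.normal(scale=0.5, size=N)

# The closed-form solution from the book, transcribed directly.
x_mean = x.sum() / N
r_mean = r.sum() / N
w1 = (np.sum(x * r) - x_mean * r_mean * N) / (np.sum(x ** 2) - N * x_mean ** 2)
w0 = r_mean - w1 * x_mean

# Cross-check against NumPy's own least-squares line fit.
w1_ref, w0_ref = np.polyfit(x, r, deg=1)
print(w1, w1_ref)  # the pairs agree to numerical precision
print(w0, w0_ref)
```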
UPDATE:
This is the regressor I get using the $w_1$ and $w_0$ listed above. As you can see, it models the data very well, so the formulas must be right.
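For reproducibility, this is roughly how such a plot can be generated; a sketch continuing the NumPy snippet above, with Matplotlib assumed:

```python
import matplotlib.pyplot as plt

# Reuses x, r, w1, w0 from the numerical check above.
plt.scatter(x, r, s=10, label="data $(x^t, r^t)$")
xs = np.linspace(x.min(), x.max(), 100)
plt.plot(xs, w1 * xs + w0, color="red", label="$w_1 x + w_0$")
plt.legend()
plt.show()
```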
UPDATE 2:
I will post the passage from the book that lists $w_1$ and $w_0$ as the solution; maybe that will make the idea clearer.