Here's my Homework Problem:
We can generalize the least squares method to other polynomial curves. To find the quadratic equation $y=a x^2+b x+c$ that best fits the points $(-1, -3)$, $(0, 0)$, $(1, -1)$, and $(2, 1)$, we first write the matrix equation $AX=B$ that would result if a quadratic equation satisfied by all four points did indeed exist. (The third equation in this system would correspond to $x=1$ and $y=-1$: $a+b+c = -1$.) We proceed by writing the normal system $A^T A X=A^T B$.
Use elementary row operations to find the equation of the quadratic that best fits the given four points. Enter the (exact) value of $y(1)$ on the quadratic regression curve.
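If I've set this up correctly, writing one equation $a x_i^2 + b x_i + c = y_i$ per data point gives the overdetermined system $AX=B$:

$ \left( \begin{array}{rrr} 1 & -1 & 1 \\ 0 & 0 & 1 \\ 1 & 1 & 1 \\ 4 & 2 & 1 \end{array} \right) \left( \begin{matrix} a \\ b \\ c \end{matrix} \right) = \left( \begin{array}{r} -3 \\ 0 \\ -1 \\ 1 \end{array} \right) \> , $

where each row of $A$ holds $x_i^2$, $x_i$, $1$ for one point.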
So far I have a solvable system from the normal equations $A^T A X = A^T B$:
$ \left( \begin{array}{rrr} 18 & 8 & 6 \\ 8 & 6 & 2 \\ 6 & 2 & 4 \end{array} \right) \left( \begin{matrix} a \\ b \\ c \end{matrix} \right) = \left( \begin{array}{r} 0 \\ 4 \\ -3 \end{array} \right) $

(the first entry of $A^T B$ is $\sum x_i^2 y_i = -3 + 0 - 1 + 4 = 0$).
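As a quick numerical sanity check (assuming Python with numpy; this isn't the exact row-reduction the problem asks for), the setup above can be verified directly:

```python
import numpy as np

# Data points and the design matrix for y = a*x^2 + b*x + c
# (one row per point; columns correspond to a, b, c)
x = np.array([-1, 0, 1, 2], dtype=float)
y = np.array([-3, 0, -1, 1], dtype=float)
A = np.column_stack([x**2, x, np.ones_like(x)])

AtA = A.T @ A  # normal-equation matrix: [[18, 8, 6], [8, 6, 2], [6, 2, 4]]
Atb = A.T @ y  # right-hand side: [0, 4, -3]

# Numerical solution, to cross-check the exact answer from row reduction
print(np.linalg.solve(AtA, Atb))
```

This reproduces the $A^T A$ and $A^T B$ above, so the setup seems right; I still need the exact solution by hand.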
Normally, for a generic $2 \times 2$ matrix,
$ \left( \begin{matrix} a & b \\ c & d \end{matrix} \right)^{-1} = \frac{1}{a d - b c} \left( \begin{array}{rr} d & -b \\ -c & a \end{array} \right) \> . $
How does this formula translate from $2 \times 2$ matrices to $3 \times 3$?
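My (unverified) guess is that the $2 \times 2$ rule is a special case of dividing the adjugate by the determinant, something like

$ A^{-1} = \frac{1}{\det A} \, \operatorname{adj}(A) \> , $

where $\operatorname{adj}(A)$ is the transpose of the matrix of cofactors, but I don't see whether working that out for a $3 \times 3$ matrix is any easier than just row-reducing the augmented matrix $\left( A^T A \mid A^T B \right)$ as the problem suggests.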