Find, using the least-squares method, the linear function that approximates the given table of points. Then calculate the sum of the squared deviations of this linear function at the given points.
xi=[1.25 2.34 4.15 5.12 6.05 6.45]; yi=[-1.3 -1.6 -2.3 -3.1 -3.8 -4.1];
I assume that the required polynomial is first-degree (linear), and my answer is: P(x) = -0.5467x - 0.3894
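For reference, a minimal sketch (not necessarily the intended by-hand method) that fits a first-degree polynomial with polyfit and should reproduce roughly those coefficients:

xi = [1.25 2.34 4.15 5.12 6.05 6.45];
yi = [-1.3 -1.6 -2.3 -3.1 -3.8 -4.1];
% Least-squares fit of a degree-1 polynomial: p(1) is the slope, p(2) the intercept.
p = polyfit(xi, yi, 1);   % expected approximately [-0.5467 -0.3894]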
How do I compute the following sum in MATLAB?

$S = \sum_{i=1}^{n}\left[P(x_{i}) - y_{i}\right]^2$
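One way to evaluate that expression, assuming the coefficient vector p from the polyfit sketch above (and using the name S instead of sum so the built-in sum function is not shadowed):

% Evaluate the fitted polynomial at the data points and sum the squared residuals.
S = sum((polyval(p, xi) - yi).^2);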