I am looking for a solution to the following curve-fitting problem.
Given a sample of data points $\{(x_i, y_i)\}_{i=1}^n$, solve:
$$\underset{f}{\arg\min} \sum_{i=1}^n (y_i - f(x_i))^2 + \lambda \int_{x_{(1)}}^{x_{(n)}} f''(x)^2 dx,$$
where $f$ is constrained to be a polynomial of fixed degree $p \ge 2$, that is, $f(x) = c_0 + c_1 x + \cdots + c_p x^p$, and the penalty weight $\lambda \ge 0$ is also fixed. Here $x_{(1)}$ and $x_{(n)}$ denote the smallest and largest sample points.
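For concreteness: since $f$ is a degree-$p$ polynomial, both terms of the objective are quadratic in the coefficient vector $c = (c_0, \ldots, c_p)$, which suggests a closed-form solution via the normal equations $(X^\top X + \lambda \Omega)\,c = X^\top y$, where $X$ is the Vandermonde design matrix and $\Omega_{jk} = \int_{x_{(1)}}^{x_{(n)}} (x^j)''(x^k)''\,dx$. A sketch of this approach (the function name and parametrization are mine, not part of the problem statement):

```python
import numpy as np

def penalized_poly_fit(x, y, p, lam):
    """Minimize sum (y_i - f(x_i))^2 + lam * int f''(x)^2 dx over
    degree-p polynomials f. Returns coefficients c_0..c_p (ascending powers)."""
    x = np.asarray(x, dtype=float)
    y = np.asarray(y, dtype=float)
    a, b = x.min(), x.max()  # integration limits x_(1), x_(n)

    # Design matrix: X[i, j] = x_i^j
    X = np.vander(x, p + 1, increasing=True)

    # Penalty matrix: Omega[j, k] = int_a^b (x^j)'' (x^k)'' dx
    #   = j(j-1) k(k-1) (b^{j+k-3} - a^{j+k-3}) / (j+k-3), for j, k >= 2
    Omega = np.zeros((p + 1, p + 1))
    for j in range(2, p + 1):
        for k in range(2, p + 1):
            m = j + k - 3  # always >= 1 here, so no division by zero
            Omega[j, k] = j * (j - 1) * k * (k - 1) * (b**m - a**m) / m

    # Normal equations of the quadratic objective
    return np.linalg.solve(X.T @ X + lam * Omega, X.T @ y)
```

With $\lambda = 0$ this reduces to ordinary polynomial least squares; for $\lambda > 0$ the curvature-carrying coefficients $c_2, \ldots, c_p$ are shrunk toward zero, similar to a generalized ridge penalty.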