I have the model $\ s(x,y)=x^2+y^2, 0 \leq x \leq 1, 0 \leq y \leq 1 $.
Instead of observing the model directly, I observe the partial derivatives of the model plus some noise $e$:
$\ p(x,y)=s_x+e, q(x,y)=s_y+e$
From measurements of $p(x,y)$ and $q(x,y)$ I want to estimate $s(x,y)$. Say I know that $s(0,0)=0$.
According to the gradient theorem: $\ s(x,y) = \int_{(0,0)}^{(x,y)} [s_x, s_y] \cdot d\vec{r}$
regardless of which path we integrate along.
Unfortunately, the gradient theorem does not apply here: because of the noise, the measured field $(p,q)$ is in general not conservative, so the line integral becomes path dependent, as a MATLAB experiment shows.
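For concreteness, here is a minimal sketch of the kind of MATLAB experiment I mean (the grid size `n`, the noise level `sigma`, and the two particular paths are arbitrary choices, not part of the problem):

```matlab
% Synthetic data: s = x^2 + y^2, so s_x = 2x and s_y = 2y, plus noise.
n = 101;                           % grid size (arbitrary choice)
h = 1/(n-1);                       % grid spacing
[x, y] = meshgrid(linspace(0,1,n));
sigma = 0.5;                       % noise level (arbitrary choice)
p = 2*x + sigma*randn(n);          % p = s_x + e
q = 2*y + sigma*randn(n);          % q = s_y + e

% Integrate the measured gradient along two different paths to (1,1).
% Path 1: along the bottom edge (y = 0), then up the right edge (x = 1).
s1 = sum(p(1,:))*h + sum(q(:,end))*h;
% Path 2: up the left edge (x = 0), then along the top edge (y = 1).
s2 = sum(q(:,1))*h + sum(p(end,:))*h;
fprintf('path 1: %.3f, path 2: %.3f, exact: 2\n', s1, s2);
```

Once the noise is added, the two path integrals no longer agree with each other (or with the exact value $s(1,1)=2$).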
Instead, one is supposed to use a global integration method, where one chooses the estimate $s(x,y)$ that minimizes:
$\ \int_0^1\int_0^1 [|s_x - p|^2 + |s_y - q|^2] dxdy$
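If I apply the calculus of variations correctly, setting the first variation of this functional to zero gives a Poisson equation for the minimizer (which may be relevant background for the Fourier approach mentioned in the edit below):

$$\frac{\partial}{\partial x}\left(s_x - p\right) + \frac{\partial}{\partial y}\left(s_y - q\right) = 0 \quad\Longleftrightarrow\quad \nabla^2 s = \frac{\partial p}{\partial x} + \frac{\partial q}{\partial y},$$

with the natural boundary condition $(s_x - p,\ s_y - q)\cdot\vec{n} = 0$ on the boundary of the unit square.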
How do I do this in practice (with my example equation)?
EDIT:
One approach that I have used is this:
First we introduce linear differentiation operators: $\ s_x = D_x*s,\ s_y = D_y*s$.
The result is the following system of linear equations:
$\ D_x*s=p, D_y*s=q$
Next, find a least-squares solution to these equations. Such a least-squares solution is supposed to be equivalent to minimizing the integral above. How can this be shown?
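To make this concrete, here is roughly what my implementation looks like, reduced to a sketch: I use forward differences here instead of the central differences I actually use, and `p`, `q` are assumed to be the $n \times n$ measurement matrices (e.g. from the snippet above).

```matlab
% Sparse least-squares sketch (forward differences, column-major flattening).
n  = size(p, 1);
h  = 1/(n-1);
e  = ones(n, 1);
D1 = spdiags([-e e], [0 1], n-1, n) / h;   % 1-D forward-difference operator
I  = speye(n);
Dx = kron(D1, I);                          % d/dx acting on s(:) (x along columns)
Dy = kron(I, D1);                          % d/dy acting on s(:) (y along rows)

% Match the data to the points where the differences are defined.
bx = reshape(p(:, 1:end-1), [], 1);
by = reshape(q(1:end-1, :), [], 1);

% Extra row pins s(0,0) = 0, removing the otherwise undetermined constant.
A = [Dx; Dy; sparse(1, 1, 1, 1, n*n)];
b = [bx; by; 0];

s_est = reshape(A \ b, n, n);              % sparse least-squares solve
```

The backslash solves the rectangular sparse system in the least-squares sense; `lsqr` would be an alternative for larger grids.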
The results are good but there are some drawbacks to this approach:
It is difficult to implement; one must use central differences, "flatten" the 2D $s$ matrix into a vector, and so on.
The differentiation matrices are very large; even stored as sparse matrices they may consume a lot of RAM.
Another approach, which seems potentially simpler to code, less RAM-consuming, and faster, is to use the FFT: in Fourier space these PDEs become an algebraic equation. This is known as the Frankot-Chellappa algorithm, but unfortunately I have not gotten it to work on my example data.
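For reference, this is roughly what my attempt looks like (again only a sketch, with `p`, `q` as above; the error may well be in my frequency scaling, or in the fact that the FFT implicitly treats the data as periodic, which my example is not):

```matlab
% Sketch of my Frankot-Chellappa attempt (p, q as above).
[n, m] = size(p);
h  = 1/(n-1);                                         % grid spacing
wx = 2*pi/(m*h) * [0:floor(m/2), -ceil(m/2)+1:-1];    % frequencies along x (columns)
wy = 2*pi/(n*h) * [0:floor(n/2), -ceil(n/2)+1:-1];    % frequencies along y (rows)
[WX, WY] = meshgrid(wx, wy);

P = fft2(p);
Q = fft2(q);
denom = WX.^2 + WY.^2;
denom(1,1) = 1;                        % avoid 0/0 at the zero frequency
S = -1i*(WX.*P + WY.*Q) ./ denom;      % minimizer of the functional, frequency by frequency
S(1,1) = 0;                            % the mean of s is not determined by p and q
s_est = real(ifft2(S));
s_est = s_est - s_est(1,1);            % enforce s(0,0) = 0
```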