
Suppose a set of $n-1$ points is given in 2D space, $x_1, x_2, \dots, x_{n-1}$, and an additional point $x_n$ is to be assigned a 2D coordinate such that the prescribed Euclidean distances $d_1, d_2, \dots, d_{n-1}$ from this point to the points $x_1, x_2, \dots, x_{n-1}$ are matched as well as possible in the least-squares sense.

In other words, the function to be minimized is $$f(x_n)=\sum_{i=1}^{n-1} \left(d_i-\|x_i-x_n\|\right)^2.$$ The usual way to reach the minimum (at least in the literature I have at my disposal) is to adopt some form of iterative minimizer. However, another way to minimize the function would be to take its derivative with respect to $x_n$, set it to zero, $$\frac{\partial}{\partial x_n}f(x_n)=0,$$ and solve for $x_n$. Why is such an approach not applicable here? Is it because the derivative is hard (or impossible) to find, or because of the computational effort involved?
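For concreteness, here is a minimal sketch of the iterative route, assuming a few made-up anchor points and distances (the arrays `points` and `dists` are hypothetical example data, not from the question). It writes down $f$ and its analytic gradient and hands them to a generic minimizer; setting that gradient to zero by hand yields a nonlinear system with no closed-form solution in general, which is what the iteration replaces.

```python
import numpy as np
from scipy.optimize import minimize

# Hypothetical example data: anchor points x_1..x_{n-1} and prescribed distances d_1..d_{n-1}.
points = np.array([[0.0, 0.0], [4.0, 0.0], [0.0, 3.0]])
dists  = np.array([2.0, 3.0, 2.5])

def f(x):
    """Objective f(x_n) = sum_i (d_i - ||x_i - x_n||)^2."""
    r = np.linalg.norm(points - x, axis=1)
    return np.sum((dists - r) ** 2)

def grad_f(x):
    """Analytic gradient: -2 * sum_i (d_i - ||x_i - x_n||) * (x_n - x_i) / ||x_i - x_n||."""
    diff = x - points                      # rows are x_n - x_i
    r = np.linalg.norm(diff, axis=1)
    return -2.0 * ((dists - r) / r) @ diff

# The stationarity condition grad_f(x) = 0 is nonlinear in x, so an
# iterative method is started from an initial guess instead.
res = minimize(f, x0=np.array([1.0, 1.0]), jac=grad_f, method="BFGS")
print(res.x, res.fun)
```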

  • How are $d_i$ defined exactly, or are they just constants? (2012-10-13)
  • @HaukeStrasdat Values of $d_i$ are arbitrary positive constants, not necessarily equal. (2012-10-13)
  • In the general case, there will usually be many *local* minima. If you simply start at some point and "follow" the gradient, chances are high that you will end up at some local minimum, not the global one. (2012-10-13)
  • Hm, the first thing that comes to my mind is that $g_i(x_n) := (d_i-\|x_i-x_n\|)^2$ does not approach zero at the minimum $(x_n,f(x_n))$ since the $d_i$ are arbitrary. So it is, strictly speaking, not a least-squares problem and thus the Gauss-Newton method is not applicable, AFAIK. Did you try another iterative method such as gradient descent or the Newton method? (2012-10-13)
  • @HaukeStrasdat Well, the point is that iterative methods such as gradient descent are often used. But why can't one just set the derivative of $f(x_n)$ with respect to $x_n$ to zero and solve for $x_n$? (2012-10-13)
  • A good way to see that the residuals $g_i^\top g_i$ have to be small (at least around the minimum) is to derive Gauss-Newton from the more general Newton method: http://en.wikipedia.org/wiki/Gauss%E2%80%93Newton_algorithm#Derivation_from_Newton.27s_method In the Newton method the Hessian of $f$ is $H_f = J_g^\top J_g + H_g \cdot g$ (with $H_g$ being the second derivative of $g$). In the Gauss-Newton method (= least squares), the Hessian is approximated as $H_f = J_g^\top J_g$, which is only valid if $g := (g_1,\dots,g_{n-1})$ is small. I don't know whether this is true in your case... (2012-10-13)
  • Note that the derivative is undefined at $x_n = x_i$. So these are critical points and candidates for minima. (2012-10-13)
  • @user506901: I was implicitly assuming you are using Gauss-Newton, but maybe that is not the case. Could you write down what your linear system looks like? (2012-10-13)
  • @hardmath So, the argument is that the derivative cannot be found, and therefore one has to adopt, e.g., gradient descent? (2012-10-13)

1 Answer