
Given a set of coordinates $(x_i, y_i)$ with weights $w_i$ and the following function:

cost = $\sum_i w_i \sqrt{(x_i - X)^2 + (y_i - Y)^2}$

I would like to find the point $(X, Y)$ for which this function is minimal.

A simple example shows that the weighted average of the coordinates does not deliver the desired output: $P_1 = (0, 0)$, $P_2 = (3, 0)$, $w_1 = 1$, $w_2 = 3$ gives the weighted average $(2.25, 0)$ with a cost of $4.5$, while the optimal solution is $(3, 0)$ with a cost of $3$.
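To make the comparison concrete, here is a quick numeric check of the cost function (the weighted average of these points works out to $(2.25, 0)$):

```python
import math

def cost(X, Y, points, weights):
    # Weighted sum of Euclidean distances from (X, Y) to each point.
    return sum(w * math.hypot(px - X, py - Y)
               for (px, py), w in zip(points, weights))

points, weights = [(0, 0), (3, 0)], [1, 3]
print(cost(2.25, 0, points, weights))  # weighted average -> 4.5
print(cost(3, 0, points, weights))     # optimum -> 3.0
```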

I have a good (although a bit rusty) knowledge of math, but my attempt to differentiate this function resulted in something too complex for me. I suspect there is some kind of trick that would simplify everything, but I cannot seem to find it.


2 Answers


As Henry points out, the point you're asking for is the (weighted) Fermat-Weber point, also known as the geometric median. There is an iterative method that converges to the optimal solution, known as Weiszfeld's algorithm; however, its running time is not precisely understood, and there are technical difficulties when an iterate lands exactly on one of the data points. This link describes the basic method and some modifications.
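In case it helps, here is a minimal sketch of the basic Weiszfeld iteration. The function name, the starting point (weighted centroid), the tolerance, and the shortcut of returning a data point when the iterate lands on one are my own choices, not something prescribed by the linked material:

```python
import math

def weiszfeld(points, weights, iters=1000, eps=1e-9):
    # Start from the weighted centroid.
    W = sum(weights)
    x = sum(w * px for (px, _), w in zip(points, weights)) / W
    y = sum(w * py for (_, py), w in zip(points, weights)) / W
    for _ in range(iters):
        num_x = num_y = denom = 0.0
        for (px, py), w in zip(points, weights):
            d = math.hypot(px - x, py - y)
            if d < eps:
                # Iterate landed on a data point: the update is undefined
                # there (this is the technical difficulty mentioned above).
                return (px, py)
            num_x += w * px / d
            num_y += w * py / d
            denom += w / d
        nx, ny = num_x / denom, num_y / denom  # weighted-inverse-distance average
        if math.hypot(nx - x, ny - y) < eps:
            break  # converged
        x, y = nx, ny
    return (x, y)
```

On the two-point example from the question, `weiszfeld([(0, 0), (3, 0)], [1, 3])` converges toward $(3, 0)$.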

It is also known that there is no closed-form solution in radicals for the problem, so that line of attack is DOA :).

If you're willing to tolerate a (provably) approximate solution with an error guarantee as small as you wish, then there's a PTAS by Bose, Maheshwari and Morin that is relatively straightforward.


As I said in a comment, this is the two-dimensional equivalent of a (weighted) median, and it is not obvious that there is any suitable trick. Indeed, there are good reasons to think there may not be: while the medians of a solid triangle pass through a single point, other area bisectors do not, so there will not be a solution as simple as finding where two easily calculated lines intersect, as there is for the mean.

Steve Witham has a nice page looking at two-dimensional extensions of medians, with an applet which lets you see various examples. He observes that the point which minimises the sum of the absolute distances is also the point where the unit vectors pointing at the original points sum to zero.

In your weighted case, each of the directional vectors to the original points should be as long as the respective weight, and again you want to find the point(s) where these sum to zero (or, if the optimum is one of the original points, where the weighted directional vectors pointing to the other points have a vector sum with magnitude less than the weight of that point: for example, in a triangle with equal weights where one of the angles is more than $2\pi/3$, the Fermat point is that vertex).

So one approach might be to add up these weighted directional vectors and seek the optimum by moving in the direction of their vector sum, by an amount related to its magnitude, in a form of gradient descent.
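A minimal sketch of that idea. The function name and parameters are illustrative; the diminishing step size is an assumption of mine (a standard way to keep such steps from oscillating forever when the optimum sits at a data point, where the vector sum does not vanish):

```python
import math

def fermat_weber_descent(points, weights, step0=0.1, iters=5000, tol=1e-9):
    # Start from the weighted centroid (any starting point would do).
    W = sum(weights)
    x = sum(w * px for (px, _), w in zip(points, weights)) / W
    y = sum(w * py for (_, py), w in zip(points, weights)) / W
    for k in range(iters):
        gx = gy = 0.0
        for (px, py), w in zip(points, weights):
            d = math.hypot(px - x, py - y)
            if d == 0.0:
                continue  # sitting on a data point: it contributes no direction
            gx += w * (px - x) / d  # vector of length w pointing at the point
            gy += w * (py - y) / d
        if math.hypot(gx, gy) < tol:
            break  # vector sum is (near) zero: at the optimum
        step = step0 / (k + 1)  # diminishing steps tame oscillation
        x += step * gx
        y += step * gy
    return (x, y)
```

On the question's example, `fermat_weber_descent([(0, 0), (3, 0)], [1, 3])` drifts toward $(3, 0)$, where the vector sum has magnitude $1 < w_2 = 3$ as described above.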