
Let $P_0,\ldots, P_k\in \mathbb{R}^2$ be a set of points. Furthermore, let $\epsilon\geq 0$.

Now I am trying to find non-trivial lower and upper bounds for

$ \sum_{i=1}^k ||Q_i-Q_{i-1}||^2\quad \text{over all}\quad Q_0,\ldots,Q_k\in\mathbb{R}^2 \quad\text{with}\quad ||P_i-Q_i||\leq \epsilon\quad\text{for all} \quad i=0,\ldots,k. $


I know there is the trivial bound $ \sum_{i=1}^k \max\left(||P_i-P_{i-1}||-2\epsilon,\,0\right)^2\leq\sum_{i=1}^k ||Q_i-Q_{i-1}||^2\leq \sum_{i=1}^k \left(||P_i-P_{i-1}||+2\epsilon\right)^2 $ (using the triangle inequality; the clamping to $0$ is needed because a term with $||P_i-P_{i-1}||<2\epsilon$ can actually vanish).
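For completeness, the derivation is just two applications of the triangle inequality per edge,

$ ||Q_i-Q_{i-1}|| \leq ||Q_i-P_i|| + ||P_i-P_{i-1}|| + ||P_{i-1}-Q_{i-1}|| \leq ||P_i-P_{i-1}|| + 2\epsilon, $

and symmetrically $||Q_i-Q_{i-1}|| \geq ||P_i-P_{i-1}|| - 2\epsilon$; squaring the (nonnegative parts of the) per-edge bounds and summing gives the inequality above.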

However, for most configurations of the $P_i$, these bounds are rather weak. Do you know a better approach for obtaining a tight bound on this sum? Of course I could always fall back on numerical optimization, but then I have to trouble myself with bracketing the minimum and the maximum, which I would rather avoid. I would much prefer a clean geometric or analytical solution.

Thanks for your thoughts.

P.S.: I am having quite some trouble tagging this question to the right field, as it seems so elementary. You are welcome to set better tags than I did...


Edit 1

Two thoughts:

  • You can show that either $P_i=Q_i$ or $||P_i-Q_i||=\epsilon$ holds at the optimum when minimizing or maximizing the sum of squares
  • You can write the problem as a quadratically constrained quadratic program (see the sketch below). However, this seems like a lot of trouble for such a small problem...
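For reference, here is a minimal numerical sketch of that QCQP formulation using SciPy's SLSQP solver (not the clean solution I am after; the `bounds_via_qcqp` helper and the sample data are purely illustrative, and since maximizing a convex objective is non-convex, the reported upper value is only a local estimate):

```python
import numpy as np
from scipy.optimize import minimize


def edge_length_sq_sum(q, k):
    """Objective: sum_i ||Q_i - Q_{i-1}||^2 for Q_0, ..., Q_k in R^2."""
    Q = q.reshape(k + 1, 2)
    return float(np.sum(np.sum(np.diff(Q, axis=0) ** 2, axis=1)))


def bounds_via_qcqp(P, eps):
    """Heuristic lower/upper estimates of the sum under ||P_i - Q_i|| <= eps.

    Each disc constraint is passed as ||P_i - Q_i||^2 <= eps^2.  SLSQP only
    finds local optima, so the 'upper' value may underestimate the true
    maximum; the 'lower' value is reliable because that sub-problem is convex.
    """
    P = np.asarray(P, dtype=float)
    k = len(P) - 1
    cons = [{"type": "ineq",
             "fun": lambda q, i=i: eps ** 2 - np.sum((q.reshape(k + 1, 2)[i] - P[i]) ** 2)}
            for i in range(k + 1)]
    x0 = P.ravel()  # start from the unperturbed points (feasible by construction)
    lo = minimize(edge_length_sq_sum, x0, args=(k,), constraints=cons, method="SLSQP")
    hi = minimize(lambda q, k: -edge_length_sq_sum(q, k), x0, args=(k,),
                  constraints=cons, method="SLSQP")
    return lo.fun, -hi.fun


# Example: three nearly collinear points, eps = 0.1
print(bounds_via_qcqp([(-1.0, 0.0), (0.0, 0.1), (1.0, 0.0)], 0.1))
```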
  • @Rahul: I did. However, as far as I can see, I get so many critical points that eliminating them would be worth another question...

1 Answer


The first thing you could do is bound pairs of adjacent edges, instead of just the edges one by one.

That is, let $d_i = \lVert Q_{i-1} - Q_i\rVert^2 + \lVert Q_i - Q_{i+1}\rVert^2$, so that your objective function is precisely $\frac{1}{2}\big( \lVert Q_0 - Q_1\rVert^2 + \sum_{i=1}^{k-1} d_i + \lVert Q_{k-1} - Q_k\rVert^2 \big)$: every interior edge appears in exactly two of the $d_i$, while the two end edges appear in only one each, hence the extra end terms and the factor $\frac{1}{2}$. It should be possible to find good bounds on $d_i$ in terms of the positions $P_{i-1}$, $P_i$, and $P_{i+1}$. I'll come back and fill this out when I have time.
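A quick numerical sanity check of this rewriting on random points (nothing beyond NumPy assumed):

```python
import numpy as np

# Verify: sum_i ||Q_i - Q_{i-1}||^2
#         = ( ||Q_0 - Q_1||^2 + sum_{i=1}^{k-1} d_i + ||Q_{k-1} - Q_k||^2 ) / 2
rng = np.random.default_rng(0)
k = 6
Q = rng.standard_normal((k + 1, 2))

edges = np.sum(np.diff(Q, axis=0) ** 2, axis=1)  # ||Q_i - Q_{i-1}||^2 for i = 1..k
d = edges[:-1] + edges[1:]                       # d_i = edge_i + edge_{i+1}, i = 1..k-1
assert np.isclose(edges.sum(), 0.5 * (edges[0] + d.sum() + edges[-1]))
```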

By the way, it's not true that either $P_i = Q_i$ or $\lVert P_i - Q_i\rVert = \epsilon$ at the minimizer. For $k = 2$, let $P_0 = (-1, 0)$, $P_2 = (1,0)$, and choose any $P_1$ at a distance strictly between $0$ and $\epsilon$ from the origin. Assuming $\epsilon < 1$, the minimum is always attained at $Q_0 = (-1+\epsilon,0)$, $Q_1 = (0,0)$, $Q_2 = (1-\epsilon,0)$, so $0 < \lVert P_1 - Q_1\rVert < \epsilon$. For the maximum, on the other hand, I think it's true that $\lVert P_i - Q_i\rVert = \epsilon$, because you're maximizing a function that's convex with respect to any $Q_i$, and a convex function attains its maximum on the boundary of the disc.
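For a quick numerical confirmation of this example (with the illustrative choices $\epsilon = 0.3$ and $P_1 = (0, 0.1)$; since the constrained minimization is convex, a local solver does find the global minimum):

```python
import numpy as np
from scipy.optimize import minimize

# Counterexample check for k = 2: at the constrained minimum, Q_1 ends up at
# the origin, strictly inside the eps-disc around P_1, so neither Q_1 = P_1
# nor ||P_1 - Q_1|| = eps holds.
eps = 0.3
P = np.array([[-1.0, 0.0], [0.0, 0.1], [1.0, 0.0]])

def objective(q):
    Q = q.reshape(3, 2)
    return float(np.sum(np.sum(np.diff(Q, axis=0) ** 2, axis=1)))

cons = [{"type": "ineq",
         "fun": lambda q, i=i: eps ** 2 - np.sum((q.reshape(3, 2)[i] - P[i]) ** 2)}
        for i in range(3)]

res = minimize(objective, P.ravel(), constraints=cons, method="SLSQP")
Q1 = res.x.reshape(3, 2)[1]
print(Q1, np.linalg.norm(P[1] - Q1))  # ~(0, 0), distance ~0.1, strictly inside the disc
```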