
$\min_{x\geq 0}\sum_{i=1}^n (a_i-x b_i)^2 [a_i-x b_i\leq 0],\quad a_i,b_i\in\mathbb R,n\in\mathbb N$

where $[p]$ is an Iverson bracket.

The objective function seems easy to handle (it is convex).
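Since each term can be rewritten as $\max(x b_i - a_i, 0)^2$, the objective is indeed convex in $x$. A minimal NumPy sketch of the objective (the function name is mine):

```python
import numpy as np

def objective(x, a, b):
    """f(x) = sum_i (a_i - x*b_i)^2 * [a_i - x*b_i <= 0]
            = sum_i max(x*b_i - a_i, 0)^2,  so f is convex in x."""
    r = np.maximum(x * np.asarray(b, float) - np.asarray(a, float), 0.0)
    return float(np.sum(r**2))
```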

1. Is there a concrete name for this kind of optimization, or a specific method to solve it?

2. Any reference material?

3. Is there a suitable package in R for finding the solution?

  • The derivative of your function is piecewise linear, with corners at $x_j = \frac{a_j}{b_j}$. At each such corner, the slope of $f'(x)$ increases by $2b_j^2$. A few applications of the cumsum function plus a linear interpolation should allow you to find the answer. (2011-07-24)
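One way to turn the comment's cumsum idea into code (a sketch; `minimize_one_sided` is my own name). One wrinkle the comment glosses over: a term with $b_i > 0$ is active to the right of its corner, while a term with $b_i < 0$ is active to the left, so the running sums below track activations and deactivations separately:

```python
import numpy as np

def minimize_one_sided(a, b):
    """Minimize f(x) = sum_i (a_i - x*b_i)^2 * [a_i - x*b_i <= 0]
                     = sum_i max(x*b_i - a_i, 0)^2   over x >= 0.
    f' is piecewise linear and nondecreasing, with corners at a_i/b_i."""
    a = np.asarray(a, float)
    b = np.asarray(b, float)
    keep = b != 0                     # terms with b_i = 0 do not depend on x
    a, b = a[keep], b[keep]
    order = np.argsort(a / b)
    a, b = a[order], b[order]
    t = a / b                         # sorted corner locations
    # moving right past t_i: a term with b_i > 0 activates, one with
    # b_i < 0 deactivates; at x = -inf exactly the b_i < 0 terms are active
    s = np.sign(b)
    S2 = np.sum(b[b < 0] ** 2) + np.concatenate(([0.0], np.cumsum(s * b**2)))
    S1 = np.sum((a * b)[b < 0]) + np.concatenate(([0.0], np.cumsum(s * a * b)))
    lo = np.concatenate(([-np.inf], t))
    hi = np.concatenate((t, [np.inf]))
    # on the segment [lo_k, hi_k]:  f'(x)/2 = S2[k]*x - S1[k]
    for k in range(len(lo)):
        if S2[k] > 0:
            x = S1[k] / S2[k]
            if lo[k] <= x <= hi[k]:   # stationary point of f found
                return max(x, 0.0)    # clamp for the x >= 0 constraint
    return 0.0                        # f' >= 0 on [0, inf): minimum at x = 0
```

By convexity, clamping a negative stationary point to $0$ gives the constrained minimizer.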

1 Answer


For a convex function $F$ of one variable you can always find the minimum by simple bisection: if you know that the minimum is attained on $[a,b]$ (in your case $a=0$, and an initial $b$ can be found by going to the right geometrically until the current value is greater than the previous one), then you can split $[a,b]$ into 4 equal parts by the points $a<x<y<z<b$ and look at the values at those points to choose a smaller interval.

If $F(a)<F(x)$, switch to $[a,x]$, else

if $F(x)<F(y)$, switch to $[a,y]$, else

if $F(y)<F(z)$, switch to $[x,z]$, else

if $F(z)<F(b)$, switch to $[y,b]$, else

switch to $[z,b]$.

This approach is somewhat crude but quick, and once you are close enough to the minimum, you can find it exactly, because you will know which brackets are involved.
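A sketch of this scheme in Python (the function names, and the doubling loop used for the initial bracket, are my own reading of the parenthetical above):

```python
def bracket_right(F, step=1.0):
    """Find b with the minimum of a convex F on [0, inf) inside [0, b]:
    move right geometrically until the value stops decreasing."""
    b = step
    while F(2 * b) < F(b):
        b *= 2
    return 2 * b

def quadrisect(F, a, b, tol=1e-10):
    """Shrink [a, b] around the minimum of a convex F by splitting it into
    4 equal parts and keeping a sub-interval that must contain the minimum."""
    while b - a > tol:
        x = a + (b - a) / 4
        y = a + (b - a) / 2
        z = a + 3 * (b - a) / 4
        if F(a) < F(x):
            b = x          # minimum in [a, x]
        elif F(x) < F(y):
            b = y          # minimum in [a, y]
        elif F(y) < F(z):
            a, b = x, z    # minimum in [x, z]
        elif F(z) < F(b):
            a = y          # minimum in [y, b]
        else:
            a = z          # minimum in [z, b]
    return (a + b) / 2
```

This re-evaluates $F$ five times per pass for the sake of matching the answer's logic verbatim; a production version would reuse values across iterations (as golden-section search does).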

  • How do you compute the gradient? Is there a way to get the derivative vector for each coordinate? Please tell me the details; I really appreciate it and am curious about it. (2011-07-25)