
Suppose I have a one-parameter family of basis functions $b(s,x)$, and a function $f(x)$ that I know can be represented (up to some low-amplitude noise) as a finite linear combination of these functions. If the number of functions $j$ is known, finding this combination is a trivial optimization problem, for example:

$$\min_{s_i, \lambda_i} \left\| f(x) - \sum_{i=1}^{j} \lambda_i\, b(s_i, x) \right\|_{L^2}$$

The problem is what to do if $j$ isn't known; I want to find the smallest possible $j$ such that the minimum of the above problem is less than some tolerance. I can of course try $j=1, 2, \ldots$ and stop as soon as my minimum is within tolerance, but surely there's a better way?
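For fixed $j$, the inner problem can be attacked directly with nonlinear least squares, and the incremental $j = 1, 2, \ldots$ strategy is easy to script. A minimal sketch (the Gaussian family $b(s,x) = e^{-(x-s)^2}$, the synthetic target, and names like `fit_fixed_j` are illustrative assumptions of mine, not part of the question):

```python
import numpy as np
from scipy.optimize import least_squares

def b(s, x):
    # Illustrative basis family (my assumption): Gaussians centered at s.
    return np.exp(-(x - s) ** 2)

def fit_fixed_j(x, f, j, s0=None, lam0=None, rng=None):
    """Fit f(x) ~= sum_i lam_i * b(s_i, x) for a fixed number of terms j."""
    rng = np.random.default_rng() if rng is None else rng
    s0 = rng.uniform(x.min(), x.max(), j) if s0 is None else np.asarray(s0, float)
    lam0 = np.ones(j) if lam0 is None else np.asarray(lam0, float)

    def residual(p):
        s, lam = p[:j], p[j:]
        # b(s[None, :], x[:, None]) is the (len(x), j) design matrix.
        return b(s[None, :], x[:, None]) @ lam - f

    res = least_squares(residual, np.concatenate([s0, lam0]))
    return res.x[:j], res.x[j:], np.linalg.norm(res.fun)

# Try j = 1, 2, ... and stop once the residual is within tolerance.
rng = np.random.default_rng(0)
x = np.linspace(-3.0, 3.0, 200)
f = 2.0 * b(-1.0, x) + 0.5 * b(1.5, x)   # synthetic target with j = 2
for j in range(1, 6):
    s_fit, lam_fit, err = fit_fixed_j(x, f, j, rng=rng)
    if err < 1e-6:
        break
```

Each call optimizes both the centers $s_i$ and the weights $\lambda_i$ jointly; the random initialization of $s$ is exactly what makes this vulnerable to local minima.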

EDIT: It sounds like even when $j$ is known, this problem is a lot harder than I expected, due to the presence of local minima. What techniques can be used to solve it?
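One standard technique for problems of this shape (my addition, not from the thread) is variable projection: for fixed nonlinear parameters $s$, the optimal $\lambda$ solves a *linear* least-squares problem, so the search runs over $s$ alone, which pairs naturally with random restarts. A sketch under the same illustrative Gaussian assumption:

```python
import numpy as np
from scipy.optimize import minimize

def b(s, x):
    # Illustrative basis family (my assumption): Gaussians centered at s.
    return np.exp(-(x - s) ** 2)

def varpro_residual(s, x, f):
    """Residual norm with the linear parameters eliminated: for fixed s,
    the optimal lambda solves a linear least-squares problem."""
    A = b(np.asarray(s)[None, :], x[:, None])   # (len(x), j) design matrix
    lam, *_ = np.linalg.lstsq(A, f, rcond=None)
    return np.linalg.norm(A @ lam - f)

def multistart_fit(x, f, j, n_starts=20, seed=0):
    """Minimize over the nonlinear parameters s only, from several
    random starting points, keeping the best local minimum found."""
    rng = np.random.default_rng(seed)
    best = None
    for _ in range(n_starts):
        s0 = rng.uniform(x.min(), x.max(), j)
        res = minimize(varpro_residual, s0, args=(x, f), method="Nelder-Mead")
        if best is None or res.fun < best.fun:
            best = res
    return best.x, best.fun

x = np.linspace(-3.0, 3.0, 200)
f = 2.0 * b(-1.0, x) + 0.5 * b(1.5, x)          # true centers -1.0 and 1.5
s_fit, err = multistart_fit(x, f, 2)
```

Eliminating $\lambda$ halves the dimension of the search space, and the restarts are what address the local minima; good initial estimates for $s$ (as the comment below suggests) reduce how many restarts are needed.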

  • For your edit: get good estimates for your nonlinear parameters, and then you don't have to fret so much about the local minima. (2011-05-01)

1 Answer


This is known as the basis-pursuit problem. See the tutorials that I pointed to here: https://cstheory.stackexchange.com/questions/5608/survey-on-optimization-algorithms/8612#8612
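To make the basis-pursuit connection concrete: discretize $s$ onto a grid, form the dictionary $A_{ki} = b(s_i, x_k)$, and solve an $\ell_1$-regularized least-squares (lasso) problem, whose minimizer is sparse in the grid. A minimal sketch using ISTA (iterative soft-thresholding); the Gaussian $b$, the grid, and the parameter values are my illustrative assumptions:

```python
import numpy as np

def b(s, x):
    # Illustrative basis family (my assumption): Gaussians centered at s.
    return np.exp(-(x - s) ** 2)

def basis_pursuit_ista(x, f, s_grid, alpha=1e-3, iters=10000):
    """Lasso over the fixed dictionary {b(s, .) : s in s_grid} via ISTA:
    minimize 0.5 * ||A c - f||^2 + alpha * ||c||_1."""
    A = b(s_grid[None, :], x[:, None])        # (len(x), len(s_grid)) dictionary
    step = 1.0 / np.linalg.norm(A, 2) ** 2    # 1/L, L = Lipschitz const. of gradient
    c = np.zeros(s_grid.size)
    for _ in range(iters):
        z = c - step * (A.T @ (A @ c - f))    # gradient step on the smooth part
        c = np.sign(z) * np.maximum(np.abs(z) - step * alpha, 0.0)  # soft-threshold
    return c

x = np.linspace(-3.0, 3.0, 200)
f = 2.0 * b(-1.0, x) + 0.5 * b(1.5, x)        # true centers -1.0 and 1.5
s_grid = np.linspace(-3.0, 3.0, 61)           # candidate s values, spacing 0.1
c = basis_pursuit_ista(x, f, s_grid)
support = s_grid[np.abs(c) > 1e-2]            # estimated centers; its size estimates j
```

Because neighboring atoms on the grid are highly correlated, the support can smear over adjacent grid points, so the recovered locations are best treated as an estimate of $j$ plus starting guesses for the continuous nonlinear fit.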