
Suppose I have a one-parameter family of basis functions $b(s,x)$, and a function $f(x)$ that I know can be represented (up to some low-amplitude noise) as a finite linear combination of these functions. If the number of terms $j$ is known, finding this combination is a trivial optimization problem, for example:

$$\min_{s_i, \lambda_i} \left\| f(x) - \sum_{i=1}^j \lambda_i b(s_i, x) \right\|_{L^2}$$

The problem is what to do if $j$ isn't known; I want to find the smallest possible $j$ such that the minimum of the above problem is less than some tolerance. I can of course try $j=1, 2, \ldots$ and stop as soon as my minimum is within tolerance, but surely there's a better way?
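
For concreteness, here is a minimal sketch of that try-$j = 1, 2, \ldots$ loop in SciPy, assuming (as the comments below settle on) Gaussian bumps of fixed width. The width `sigma`, the initialization, and the tolerance check are all illustrative placeholders, not recommendations:

```python
import numpy as np
from scipy.optimize import least_squares

sigma = 0.1  # assumed fixed, known width of each Gaussian bump

def b(s, x):
    """One basis function: a Gaussian bump centered at s."""
    return np.exp(-(x - s) ** 2 / (2 * sigma ** 2))

def residual(params, x, f):
    """Misfit of a j-term model; params = [s_1..s_j, lambda_1..lambda_j]."""
    j = len(params) // 2
    s, lam = params[:j], params[j:]
    model = sum(li * b(si, x) for si, li in zip(s, lam))
    return model - f

def fit_smallest_j(x, f, tol, j_max=20):
    """Try j = 1, 2, ... and stop once the discrete L2 misfit drops below tol."""
    for j in range(1, j_max + 1):
        # crude initialization: centers spread over the domain, unit weights
        p0 = np.concatenate([np.linspace(x.min(), x.max(), j), np.ones(j)])
        sol = least_squares(residual, p0, args=(x, f))
        misfit = np.sqrt(x[1] - x[0]) * np.linalg.norm(sol.fun)  # uniform grid assumed
        if misfit < tol:
            return j, sol
    return None  # tolerance not reached within j_max terms
```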

EDIT: It sounds like even when $j$ is known, this problem is a lot harder than I expected, due to the presence of local minima. What techniques can be used to solve it?

  • The magic words are "sparsity" and "regression". If you really want sparsity, it helps to have a richer, over-complete dictionary.
  • It's not quite clear to me that this question is well-posed. Is the set of basis functions finite? How do you know that the $\min$ is attained?
  • What are your $b(s_i,x)$ like? A nonlinear least-squares problem can display a crapload of multiple minima for, say, trigonometric or exponential fitting problems. Do you even have good initial estimates for your $s_i$? There are algorithms like "variable projection" that only need estimates for the *nonlinear* parameters $s_i$ and derive estimates of the *linear* parameters $\lambda_i$ (see the sketch after this list).
  • @J.M. Good point. Let's say that the $b$s are Gaussians centered at $s_i$ with some fixed standard deviation.
  • *crapload*: That must be a technical term.
  • @cardinal The set is infinite. I may need smoothness conditions on $f$ and the $b$s, but won't the minimum converge to $0$ as $j \to \infty$ under appropriate such conditions?
  • "$s_i$ with some fixed standard deviation" - so you know the $s_i$ in advance?
  • @J.M. No, I don't. $s_i$ is a typo and should have just been $s$ (the software will no longer let me edit the comment).
  • Then before you start worrying about an algorithm, you have to worry first about getting good initial estimates for the $s$ of your Gaussian basis functions.
  • For your edit: get good estimates for your nonlinear parameters, and then you don't have to fret so much about the local minima.
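
Since the comments keep coming back to variable projection, here is a naive sketch of that idea under the same Gaussian-bump assumption: optimize only over the nonlinear centers $s_i$, eliminating the linear weights $\lambda_i$ by an ordinary least-squares solve inside the residual. (The classical Golub-Pereyra method additionally supplies an analytic Jacobian; the function names and the peak-picking initialization below are illustrative, not prescriptive.)

```python
import numpy as np
from scipy.optimize import least_squares
from scipy.signal import find_peaks

sigma = 0.1  # assumed fixed bump width, as in the comments

def design_matrix(s, x):
    """Columns are the Gaussian bumps b(s_i, x) sampled on the grid x."""
    return np.exp(-(x[:, None] - s[None, :]) ** 2 / (2 * sigma ** 2))

def projected_residual(s, x, f):
    """Residual after eliminating the linear parameters for given centers s."""
    B = design_matrix(s, x)
    lam, *_ = np.linalg.lstsq(B, f, rcond=None)
    return B @ lam - f

def varpro_fit(x, f):
    """Fit the centers starting from detected peaks of f, then recover weights."""
    s0 = x[find_peaks(f)[0]]  # initial estimates: well-separated bumps show as peaks
    sol = least_squares(projected_residual, s0, args=(x, f))
    lam, *_ = np.linalg.lstsq(design_matrix(sol.x, x), f, rcond=None)
    return sol.x, lam
```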

1 Answer


This is known as the basis-pursuit problem. See the tutorials that I pointed to here: https://cstheory.stackexchange.com/questions/5608/survey-on-optimization-algorithms/8612#8612
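
To make the connection concrete, here is a sketch of the basis-pursuit approach for this setting: discretize the parameter $s$ into a grid, build an over-complete dictionary of bumps, and solve an $\ell_1$-regularized least-squares problem (basis pursuit denoising, a.k.a. the lasso) so that only a few coefficients survive; their count estimates $j$. The grid size and penalty `alpha` below are illustrative choices, not prescriptions:

```python
import numpy as np
from sklearn.linear_model import Lasso

sigma = 0.1  # assumed fixed bump width, matching the comments above

def dictionary(x, s_grid):
    """Over-complete dictionary: one Gaussian bump per candidate center."""
    return np.exp(-(x[:, None] - s_grid[None, :]) ** 2 / (2 * sigma ** 2))

def basis_pursuit(x, f, n_atoms=200, alpha=1e-3):
    s_grid = np.linspace(x.min(), x.max(), n_atoms)
    D = dictionary(x, s_grid)
    model = Lasso(alpha=alpha, fit_intercept=False, max_iter=10_000)
    model.fit(D, f)
    support = np.flatnonzero(model.coef_)          # surviving atoms
    return s_grid[support], model.coef_[support]   # estimated centers and weights
```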