
Consider a set of real-valued continuous functions $f_i(t;a)$ for $1\leq i\leq n$ with some shared non-negative parameters $a=(a_1,\ldots,a_d)\in\mathbb{R}^d$. Furthermore, assume that for each $1\leq i\leq n$ there exists a set of non-negative data $y^{(i)}:=\{y_1^{(i)},\ldots,y_m^{(i)}\}\in\mathbb{R}^m$. Now I want to estimate the parameters $a$. Usually, it should suffice to numerically compute $$\min_{a>0}\sum_{i=1}^n\sum_{j=1}^m\left(y^{(i)}_j-f_i(t_j;a)\right)^2,$$ where $t_j$ in $f_i(t_j;a)$ is the time point at which $y_j^{(i)}$ was recorded.
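For concreteness, here is a minimal sketch of how such a fit could be computed numerically, e.g. with `scipy.optimize.least_squares`. The model `f` below is only a hypothetical exponential decay with a shared rate (my actual $f_i$ are different), and the data are synthetic:

```python
import numpy as np
from scipy.optimize import least_squares

def f(i, t, a):
    # Hypothetical model f_i(t; a): per-curve amplitude a[i], decay rate a[-1]
    # shared by all curves; replace with the actual model functions.
    return a[i] * np.exp(-a[-1] * t)

def residuals(a, t, y):
    # Stack the residuals y_j^(i) - f_i(t_j; a) of all n data sets into one vector;
    # least_squares minimises the sum of their squares.
    return np.concatenate([y[i] - f(i, t, a) for i in range(y.shape[0])])

# Synthetic example: n = 2 data sets with very different ranges, m = 5 time points.
t = np.linspace(0.0, 4.0, 5)
y = np.vstack([50.0 * np.exp(-0.8 * t), 1500.0 * np.exp(-0.3 * t)])

a0 = np.ones(3)                                           # initial guess for a
fit = least_squares(residuals, a0, bounds=(0.0, np.inf), args=(t, y))
print(fit.x)
```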

The problem is that for different $i$ the ranges in which the elements of $y^{(i)}$ lie vary by orders of magnitude; e.g., all elements of $y^{(1)}$ lie in the interval $[0,50]$, while the elements of $y^{(2)}$ lie in $[0,1500]$. As a result, $f_2$ fits $y^{(2)}$ very well, while data sets with a much smaller range are fit very poorly.

My first idea was to consider the relative distance between data and function at each point and fit the data with respect to

$$\min_{a>0}\sum_{i=1}^n\sum_{j=1}^m\frac{\left(y^{(i)}_j-f_i(t_j;a)\right)^2}{y^{(i)}_j+1},$$ where the $+1$ keeps the denominator from vanishing. But since a lot of the data points $y^{(i)}_j$ are zero, this puts a big penalty on the distance between function and data point in that case, but only a small one when $y^{(i)}_j$ is large.
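In code, this weighted variant only changes the residual function (reusing `f`, `t` and `y` from the sketch above; the square root appears because `least_squares` squares the residuals itself):

```python
def weighted_residuals(a, t, y):
    # Dividing each residual by sqrt(y + 1) makes the summed squares equal to
    # sum_{i,j} (y_j^(i) - f_i(t_j; a))^2 / (y_j^(i) + 1).
    return np.concatenate([(y[i] - f(i, t, a)) / np.sqrt(y[i] + 1.0)
                           for i in range(y.shape[0])])

fit_w = least_squares(weighted_residuals, np.ones(3), bounds=(0.0, np.inf), args=(t, y))
print(fit_w.x)
```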

Does anybody have a good suggestion for this issue, or can maybe refer me to literature on this problem?

  • @fgp: Yes, but the sample size behind each $y_k^{(i)}$ is very small (only 5). So it is safe to assume that exponential decay is a good approximation of reality in this case.
