I need to minimize the following function:
$f(x)= \sum_{i=1}^{n} \sqrt{(x-a_i)^2+b_i},$
where $a_i>0$ and $b_i>0$ for every $i \in \{1,\ldots, n\}$.
Thank you for your help.
If all $b_i=0$, the minimizer is a median of $a_1,\dots,a_n$ (the minimizer of the sum of distances). If all $b_i$ are equal and tend to infinity, the minimizer converges to the mean of $a_1,\dots,a_n$ (the minimizer of the sum of squared distances).
However, I don't think there is a closed-form formula in the general case. You'll have to solve the equation $0=f'(x)=\sum_{i=1}^n \frac{x-a_i}{\sqrt{(x-a_i)^2+b_i}}$ numerically (when all $b_i\ne 0$, $f$ is differentiable everywhere). Note that each summand $\sqrt{(x-a_i)^2+b_i}$ is convex in $x$, so $f$ is convex and $f'$ is non-decreasing; any root of $f'$ is therefore a global minimizer.
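Since $f'$ is non-decreasing, bisection on $f'$ is enough. Here is a minimal sketch in Python (the data `a` and `b` are illustrative, not from the question); the bracket $[\min_i a_i, \max_i a_i]$ works because $f'(\min_i a_i)\le 0$ and $f'(\max_i a_i)\ge 0$:

```python
def fprime(x, a, b):
    """Derivative f'(x) = sum_i (x - a_i) / sqrt((x - a_i)^2 + b_i)."""
    return sum((x - ai) / ((x - ai) ** 2 + bi) ** 0.5 for ai, bi in zip(a, b))

def minimize(a, b, tol=1e-12):
    """Find the root of f' by bisection; f is convex, so this is the minimizer."""
    lo, hi = min(a), max(a)  # f'(lo) <= 0 <= f'(hi)
    while hi - lo > tol:
        mid = (lo + hi) / 2
        if fprime(mid, a, b) < 0:
            lo = mid
        else:
            hi = mid
    return (lo + hi) / 2

# Example data (assumed for illustration only)
a = [1.0, 2.0, 5.0]
b = [0.5, 1.0, 2.0]
x_star = minimize(a, b)
```

Newton's method would converge faster, but bisection needs no second derivative and cannot overshoot on this monotone $f'$.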