$$F(x,y)=|x-a_0|+|x+y-a_1|+\dots+|x+(n-1)\,y-a_{n-1}|$$
I've already tried gradient descent/differentiation and eventually came up with an $\mathcal{O}(n^2)$ solution, but that's too slow: $n \approx 10^6$, so I need a better asymptotic complexity.
It is easy to see that in the final solution, $x$ is a median of the series $(b_i)_{i=0}^{n-1}$, with $b_i = a_i - i y$, and $F(x,y)$ is the sum of the distances between $x$ and $b_i$. So, given $y$, you can determine the median and compute the objective in $\mathcal{O}(n)$.
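A minimal sketch of this inner step in Python (the function name `eval_F` is my own; for simplicity it sorts, which is $\mathcal{O}(n\log n)$, whereas a selection algorithm such as `std::nth_element`/quickselect would give the stated $\mathcal{O}(n)$):

```python
def eval_F(a, y):
    """Given y, minimize F(x, y) over x and return (F, x)."""
    n = len(a)
    b = sorted(a[i] - i * y for i in range(n))  # b_i = a_i - i*y
    x = b[n // 2]  # any median of b minimizes the sum of |x - b_i|
    return sum(abs(x - bi) for bi in b), x
```

For example, with $a_i = 1 + i$ and $y = 1$, all $b_i = 1$, so the optimal $x$ is $1$ and $F = 0$.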
Now, $y$ should be selected so that all $b_i$ are as close to their median as possible. I do not see a closed-form answer, but since $g(y) = \min_x F(x,y)$ is a univariate convex function, one solution method is bisection search for a zero crossing of its subgradient. The number of search steps is logarithmic (in the width of the initial search interval over the desired precision), and each step requires $\mathcal{O}(n)$ work, so the overall cost is $\mathcal{O}(n\log(1/\varepsilon))$ for precision $\varepsilon$, i.e. $\mathcal{O}(n)$ per bisection step.