I am looking for a method to solve (if possible) the following problem:
Find the best increasing function $f: \mathbb{R} \rightarrow \mathbb{R}$ fitting the conditions
$$f(x_i)^2 - f(x_j)^2 = c_{ij} \tag{1}$$ in the $\ell^2$ sense. Assume that $x_i$, $x_j$, and $c_{ij}$, with $i,j \in \Omega \subset \mathbb{N}$, are known.
A candidate tool to tackle this is isotonic regression, but it requires the values $f(x_i)$ for all $i \in \Omega$, which are not available here. Moreover, the method would have to be adapted to take the conditions $(1)$ into account.
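To make the setup concrete, here is a minimal numerical sketch (Python with NumPy; the sample data are made up, and I assume a consistent, noise-free case with $f \ge 0$) of one reformulation: treating $u_i = f(x_i)^2$ as unknowns turns the conditions $(1)$ into a linear system that ordinary least squares can handle.

```python
import numpy as np

# Illustrative data (not from a real problem): sample points x_i,
# index pairs (i, j), and values c_ij = f(x_i)^2 - f(x_j)^2.
# These c_ij are consistent with f(x) = x, i.e. u_i = x_i^2.
x = np.array([0.0, 1.0, 2.0, 3.0])
pairs = [(1, 0), (2, 1), (3, 2), (3, 0)]
c = np.array([1.0, 3.0, 5.0, 9.0])

# Unknowns u_i = f(x_i)^2; each condition (1) is linear in u:
# u_i - u_j = c_ij.
A = np.zeros((len(pairs), len(x)))
for row, (i, j) in enumerate(pairs):
    A[row, i] = 1.0
    A[row, j] = -1.0

# The conditions fix u only up to an additive constant,
# so anchor u_0 = 0 with one extra equation.
A = np.vstack([A, np.eye(len(x))[0]])
b = np.append(c, 0.0)

u, *_ = np.linalg.lstsq(A, b, rcond=None)

# If f >= 0, then f(x_i) = sqrt(u_i), and monotonicity of f is
# equivalent to u being increasing in x -- which is where an
# isotonic-regression step on u could, in principle, come in.
f_vals = np.sqrt(np.clip(u, 0.0, None))
print(f_vals)  # ≈ [0, 1, 2, 3] for this consistent data
```

Of course, this only recovers values at the sample points and sidesteps the sign ambiguity of $f$, so it is a sketch of the difficulty rather than a solution.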
I have also tried to rewrite the problem as a functional optimization problem by assuming a differentiable function $f$ with $f'(x) \geq 0$, hoping to apply a tool such as the Euler–Lagrange equation, but I got stuck at the formulation stage.
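Concretely, the formulation I tried to write down is (a sketch only)
$$\min_{f \,:\, f'(x) \geq 0} \ \sum_{i,j \in \Omega} \left( f(x_i)^2 - f(x_j)^2 - c_{ij} \right)^2,$$
but the objective involves $f$ only through its values at the points $x_i$, not through an integral of $f$ or $f'$, so I do not see how to bring the Euler–Lagrange equation to bear.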
Instead of $(1)$, the problem can be generalized to arbitrary conditions $$g_n\left(f(x_i), f(x_j), f(x_k), \ldots\right) = 0, \qquad n = 1, \ldots, N,$$ and I am curious to know what kind of mathematical tools can be employed for such problems.
I appreciate any help. Thanks in advance!