If the function is expected to be smooth everywhere except for a jump point, you could treat this as an inverse problem and use a smoothing regularizer.
Say there is some smooth background function $f$, to which a jump at point $p$ with height $j$ has been added, along with some noise $\xi$, yielding the observed function $g$. The forward model is,
$g=A(f,p,j) + \xi,$
where $A:L^2(0,1) \times \mathbb{R} \times \mathbb{R} \to L^2(0,1)$ is defined by $A(f,p,j)(x):=\begin{cases} f(x), & x < p, \\ f(x)+j, & x \ge p. \end{cases}$ We seek the minimizer $(f^*,p^*,j^*)$ of the following regularized inverse problem, $\inf_{f \in L^2,\, p\in (0,1),\, j \in [0,\infty)} ||A(f,p,j)-g||^2 + \alpha||\Delta f||^2 + \gamma |j-j_0|^2.$
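As a concrete illustration, the forward model is straightforward to simulate on a grid. This is just a sketch; the background $\sin(2\pi x)$, the jump location and height, and the noise level below are all hypothetical choices:

```python
import numpy as np

def forward(f_vals, p, j, x):
    """Evaluate A(f, p, j) on the grid x: add a jump of height j where x >= p."""
    return f_vals + j * (x >= p)

# hypothetical synthetic data: smooth background + jump + noise
rng = np.random.default_rng(0)
x = np.linspace(0, 1, 201)
f_true = np.sin(2 * np.pi * x)                  # smooth background f
g = forward(f_true, p=0.4, j=1.0, x=x) + 0.05 * rng.standard_normal(x.size)
```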
The first term penalizes the misfit $A(f,p,j)-g$ between the model output and the observed data.
The second term enforces smoothness of $f$: $\Delta$ is the Laplacian and $\alpha$ is a regularization parameter you choose. The larger $\alpha$, the smoother $f$ is forced to be.
The third term encourages the jump to have a certain size: $j_0$ is roughly how large you expect the jump to be, and $\gamma$ controls how strongly deviations from that size are penalized.
Choose some reasonable-ish parameters $\alpha, j_0, \gamma$, replace $L^2(0,1)$ by a space of continuous piecewise linear functions on a sufficiently fine grid and the Laplacian by a second finite difference, and solve the resulting minimization problem with your favorite optimization method/software.