I'm trying to work out an upper bound for the following problem, but I'm making very little progress. Hopefully, someone will be able to make a suggestion.
The integral I'm attempting to bound is:
$I = \int_{0}^{\infty} f(x) \left( g(x) - \hat{g}(x) \right) dx$
Here, $f(x)$ is a cumulative distribution function, so it is monotonically increasing on $[0, \infty)$ with $0 \leq f(x) \leq 1$; $g(x)$ is a probability density function, so $\int_{0}^{\infty} g(x) \, dx = 1$; and $\hat{g}(x)$ is an approximation to $g(x)$, so $I \rightarrow 0$ as $\hat{g}(x) \rightarrow g(x)$.
I would like to derive a bound $I^{*}$ with $I^{*} \geq I$ (or $I^{*} > I$), i.e. an error bound on the effect of the mismatch between $g(x)$ and $\hat{g}(x)$. I've looked at some general integral inequalities (Cauchy–Schwarz, Hölder, Minkowski, etc.), but with no luck so far. So, my question is this: based on the properties of $f(x)$, $g(x)$ and $\hat{g}(x)$ outlined above, are there any further techniques I can use to upper bound the integral? To be really demanding, I'd love a form along the lines of $I^{*} = c - k \int_{0}^{\infty} (g(x) - \hat{g}(x)) \, dx$, where $c$ and $k$ are constants with respect to $x$, but any tips on how to tackle this would be great.
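For concreteness, here is a small numerical sketch of the quantity I'm trying to bound, using hypothetical stand-ins (exponential distributions, not my actual functions). It also checks the crude bound $|I| \leq \int_{0}^{\infty} |g(x) - \hat{g}(x)| \, dx$ that follows from $0 \leq f(x) \leq 1$; I'm hoping for something tighter than this.

```python
import numpy as np

def trapz(y, x):
    """Composite trapezoidal rule for samples y on grid x."""
    return float(np.sum((y[1:] + y[:-1]) * np.diff(x) / 2.0))

# Hypothetical stand-ins for the general setup:
#   f:     Exp(1) CDF  -> monotonically increasing, 0 <= f <= 1
#   g:     Exp(1) pdf  -> integrates to 1
#   g_hat: Exp(1.1) pdf, playing the role of an approximation to g
x = np.linspace(0.0, 50.0, 2_000_001)
f = 1.0 - np.exp(-x)
g = np.exp(-x)
g_hat = 1.1 * np.exp(-1.1 * x)

# The integral of interest: I = ∫ f(x) (g(x) - g_hat(x)) dx
I = trapz(f * (g - g_hat), x)

# Crude L1 bound implied by 0 <= f <= 1: |I| <= ∫ |g - g_hat| dx
L1 = trapz(np.abs(g - g_hat), x)

print(f"I  = {I:.6f}")
print(f"L1 = {L1:.6f}")
assert abs(I) <= L1
```

In this toy instance $I = 1.1/2.1 - 1/2 = 1/42 \approx 0.0238$, while the $L^1$ bound is about $0.070$, so there is clearly room for a sharper bound that exploits more of the structure.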
I can go into more detail on the exact functions I'm using, if necessary, but I thought it best to keep it general for now.
Thanks,
Donagh