According to Wikipedia's article on Young's inequality, "let $f$ denote a real-valued, continuous and strictly increasing function on $[0, c]$ with $c > 0$ and $f(0) = 0$. Let $f^{-1}$ denote the inverse function of $f$. Then, for all $a \in [0, c]$ and $b \in [0, f(c)]$,"
$ab \le \int_0^a f(x)\,dx + \int_0^b f^{-1}(x)\,dx$
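For concreteness, here is one illustrative choice (my own, not part of the quoted statement): take $f(x) = x^2$ on $[0, 1]$ and $a = b = 1$. Then
$\int_0^1 x^2\,dx + \int_0^1 \sqrt{x}\,dx = \tfrac{1}{3} + \tfrac{2}{3} = 1 \ge ab = 1,$
so the inequality holds, with equality since $b = f(a)$.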
I am wondering: does the inequality reverse if $f$ is instead decreasing, with $f(a) = 0$? (And, specifically, with $a = b$, if that restriction is necessary.)
EDIT: I am pretty sure it neither holds nor reverses in general, as I can find examples of both. However, I don't understand why it fails to hold. Consider $a = b = 1$.
If I take an increasing function $f$ with $f(0) = 0$ and $f(1) = 1$, everything is fine. But if I redefine $f$ as $f_{\text{new}}(x) = f(1 - x)$, then the function is decreasing and the inequality falls apart, even though the integral of $f_{\text{new}}$ over $[0, 1]$ equals that of the old $f$, and I would think the same must be true for the inverses.
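To make this concrete with the same illustrative choice $f(x) = x^2$: here $f_{\text{new}}(x) = (1 - x)^2$ is decreasing with $f_{\text{new}}(1) = 0$, and its inverse on $[0, 1]$ is $f_{\text{new}}^{-1}(y) = 1 - \sqrt{y}$. With $a = b = 1$,
$\int_0^1 (1 - x)^2\,dx + \int_0^1 \left(1 - \sqrt{y}\right)dy = \tfrac{1}{3} + \tfrac{1}{3} = \tfrac{2}{3} < 1 = ab,$
so the original inequality fails in this instance.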