Let $f: \mathbb{R} \rightarrow \mathbb{R}$ be a continuous, monotone function. For $a>0$ I must prove that the following inequality holds: $\int_{-a}^{a}xf(f(x))\,dx \geq 0$.
I wonder if there is a simple proof for it.
We have \begin{align} \int_{-a}^axf(f(x))\,dx&=\int_0^axf(f(x))\,dx+\int_{-a}^0xf(f(x))\,dx\\ &=\int_0^axf(f(x))\,dx+\int_{a}^0(-s)f(f(-s))(-ds)\\ &=\int_0^as\left(f(f(s))-f(f(-s))\right)ds \end{align} and we deal with two cases: if $f$ is non-decreasing, then $f\circ f$ is non-decreasing; if $f$ is non-increasing, then $f\circ f$ is again non-decreasing, being the composition of two non-increasing functions. So $f\circ f$ is non-decreasing in any case, hence $f(f(s))\geq f(f(-s))$ for every $s\in[0,a]$ and the last integrand is non-negative.
Moreover, equality holds if and only if $f\circ f$ is even on $[-a,a]$.
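As a numerical sanity check (not part of the proof), one can approximate the integral for a few monotone choices of $f$; the helper name below is illustrative, and the trapezoid rule is implemented directly to keep the sketch self-contained.

```python
import numpy as np

def integral_xff(f, a, n=200001):
    """Trapezoid-rule approximation of the integral of x*f(f(x)) over [-a, a]."""
    x = np.linspace(-a, a, n)
    y = x * f(f(x))
    return float(np.sum((y[1:] + y[:-1]) / 2 * np.diff(x)))

# Non-decreasing f: f∘f is non-decreasing, so the integral is >= 0.
inc = integral_xff(lambda x: x**3 + x, 2.0)

# Non-increasing f: f∘f is still non-decreasing, so again >= 0.
dec = integral_xff(lambda x: -np.exp(x), 2.0)

# Constant f: f∘f is even, so the integral should vanish.
const = integral_xff(lambda x: np.full_like(x, 3.0), 2.0)

print(inc, dec, const)
```

The constant case illustrates the equality condition: $f\circ f$ is even, and the approximation comes out at zero up to floating-point error.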