Let $X_i \sim F_i$, where $F_i = G(F_{i-1})$ and the sequence begins with a known $F_0$ (all $F_i$ are valid cdfs, i.e. monotone non-decreasing, defined on $(0,1)$).
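For concreteness, here is a minimal numerical sketch of the setup. The particular operator $G(F) = 1-(1-F)^2$ (the cdf of the minimum of two iid copies) is just a hypothetical example of a $G$, and the grid size and choice of $F_0$ are likewise my own assumptions:

```python
import numpy as np

# Discretize (0,1) and represent each cdf by its values on the grid.
x = np.linspace(0.0, 1.0, 1001)

def G(F):
    # Hypothetical example operator: cdf of the min of two iid copies of X.
    return 1.0 - (1.0 - F) ** 2

def mean_from_cdf(F, x):
    # For X supported on (0,1), E[X] = integral of (1 - F(t)) dt,
    # approximated here by a simple Riemann sum on the grid.
    return np.sum(1.0 - F[:-1]) * (x[1] - x[0])

F = x.copy()                       # F_0: Uniform(0,1) as the starting cdf
means = [mean_from_cdf(F, x)]
for i in range(20):
    F = G(F)                       # F_{i+1} = G(F_i)
    means.append(mean_from_cdf(F, x))

print(means[:5])   # for this example G the means decrease toward a point mass at 0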
I believe, from other posts, that it is not enough to show that $\mathbb{E}X_0 > \mathbb{E}X_{\infty}$ (where $X_{\infty}$ is the random variable distributed according to the limit of $F_i$ as $i \to \infty$, which we can assume is a point mass / Dirac measure) in order to prove that $\mathbb{E}X_i$ decreases with $i$.
I believe the reason is that there could be a converging zigzag: the iterates $F_i$ could make the corresponding expectations go up a little, then down a little, while still converging to a fixed point.
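To make the zigzag concern concrete, one way to probe it (empirically, for a particular $F_0$ and $G$; this is a check, not a proof) is to look for sign changes in the successive increments $\mathbb{E}X_{i+1} - \mathbb{E}X_i$. Continuing the sketch above:

```python
# A zigzag would show up as a sign change in the increments E[X_{i+1}] - E[X_i].
increments = np.diff(means)
print("any sign change (zigzag)?", np.any(increments[:-1] * increments[1:] < 0))
print("monotone decreasing?", np.all(increments <= 0))
```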
My question is: are there properties of the operator $G$ that would rule out such zigzags, so that the condition above is sufficient to prove that the mean decreases with $i$?
Perhaps this is naive, but it seems to me that most operators a beginner like myself would encounter would not create zigzag behavior. Since $G$ does not depend on $i$, it cannot be raising a negative number to alternating powers or anything like that. And we can assume there are no trigonometric operations.
Is it possible to say something like "$\mathbb{E}X_0 > \mathbb{E}X_{\infty}$, and the operator $G$ only involves such-and-such types of operations, therefore $\mathbb{E}X_i$ decreases with $i$"? And if so, what is "such-and-such"?