[Spoiler: it's not true (see comments)]
I believe the following to be true, but am not sure how to prove it:
$ \mathbb{E}_1[x] > \mathbb{E}_2[x] \implies \mathbb{E}_1 [g(x)] > \mathbb{E}_2 [g(x)] $ when $g$ is an increasing, convex function. (We can also restrict ourselves to $x \geq 0$ for my problem.) The subscripts $1$ and $2$ refer to expectations under two different probability distributions of $x$.
In the continuous case, we'd have
$ \int x f_1(x) dx > \int x f_2(x) dx \implies \int g(x) f_1(x) dx > \int g(x) f_2(x) dx $
for two pdfs $f_1$ and $f_2$.
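As the spoiler above notes, the conjecture turns out to be false. A quick numerical sketch (with a pair of discrete distributions chosen for illustration, and $g(x) = x^2$, which is increasing and convex on $x \geq 0$) shows a counterexample:

```python
# Counterexample sketch: g(x) = x^2 is increasing and convex on [0, inf),
# yet E1[x] > E2[x] does not force E1[g(x)] > E2[g(x)].

g = lambda x: x**2

# Each distribution is a list of (value, probability) pairs.
# Distribution 1: point mass at x = 1.
dist1 = [(1.0, 1.0)]
# Distribution 2: mostly 0, with a small chance of a large value.
dist2 = [(0.0, 0.9), (9.0, 0.1)]

def expect(h, dist):
    """Expectation of h(x) under a discrete distribution."""
    return sum(p * h(x) for x, p in dist)

E1_x, E2_x = expect(lambda x: x, dist1), expect(lambda x: x, dist2)
E1_g, E2_g = expect(g, dist1), expect(g, dist2)

print(E1_x, E2_x)  # E1[x] = 1.0 > 0.9 = E2[x]
print(E1_g, E2_g)  # but E1[g(x)] = 1.0 < 8.1 = E2[g(x)]
```

Intuitively, distribution 2 has a lower mean but a fat tail, and the convexity of $g$ rewards that tail more than it rewards the higher mean.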
Question: Is my conjecture right, and if so, what's the proof?
What I've tried: We might like to write it this way:
$ \int x (f_1(x) - f_2(x)) dx > 0 \implies \int g(x) (f_1(x) - f_2(x)) dx > 0 $
I've tried using integration by parts to rewrite the difference in terms of $g^{\prime}(x)$ and the cdfs, which seems promising, but I wasn't able to carry it through.
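For what it's worth, here is one way the integration-by-parts step might go (a sketch, assuming $x \geq 0$, cdfs $F_1, F_2$ with $f_i = F_i^{\prime}$, and boundary terms $g(x)\,(F_1(x) - F_2(x))$ that vanish at both endpoints):

```latex
\int_0^\infty g(x)\,\bigl(f_1(x) - f_2(x)\bigr)\,dx
  = \Bigl[\, g(x)\bigl(F_1(x) - F_2(x)\bigr) \Bigr]_0^\infty
    - \int_0^\infty g^{\prime}(x)\,\bigl(F_1(x) - F_2(x)\bigr)\,dx
  = \int_0^\infty g^{\prime}(x)\,\bigl(F_2(x) - F_1(x)\bigr)\,dx .
```

Since $g^{\prime} \geq 0$, the right-hand side is nonnegative whenever $F_1(x) \leq F_2(x)$ for all $x$ (first-order stochastic dominance), but that is a strictly stronger hypothesis than a mere inequality between the means, which hints at why the mean condition alone is not enough.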
I also thought about assuming that $g(x)$ is analytic and using its Taylor series, but I'm not sure where to take that either.
Sidenote: It would also be cool to show whether $g$ has to be strictly increasing everywhere for this to be true.
Thanks for any advice!!