The problem:
Recall that if $\mu$ is a probability measure on $X$ and $f$ is integrable with respect to $\mu$, then $ \exp\left(\int_X f(x)d\mu(x)\right) \leq \int_X e^{f(x)}d\mu(x).$ What can you conclude if equality holds?
I know the answer is that equality holds exactly when $f$ is constant $\mu$-almost everywhere, i.e., constant on all of $X$ except a set of measure $0$, but I'm not sure how to prove this. So far I've shown that a function that is constant on a set of measure $1$ in $X$ gives equality, but I haven't found a way to show that a function taking at least two distinct values on subsets of $X$ of positive measure cannot give equality. Namely:
If we have an integrable function (with respect to $\mu$) that is constant on all but a set of measure $0$, say $f(x) = c$ for all $x\in Y\subset X$ with $\mu(Y) = 1$, then, since $\mu(X\setminus Y) = 0$, the left hand side is \begin{align*} \exp\left(\int_X f(x)d\mu(x)\right) &= \exp\left(\int_Y f(x)d\mu(x)\right)\newline &= \exp\left(c\int_Y d\mu(x)\right)\newline &= \exp(c\mu(Y))\newline &= e^c. \end{align*} The right hand side is \begin{align*} \int_X e^{f(x)}d\mu(x) &= \int_Y e^c d\mu(x)\newline &= e^c \int_Y d\mu(x)\newline &= e^c \mu(Y)\newline &= e^c, \end{align*} and so we have $ \exp\left(\int_X f(x)d\mu(x)\right) = \int_X e^{f(x)}d\mu(x).$

I claim these are the only $f$ for which equality occurs. To see this, suppose $f$ were not constant outside a set of measure $0$. Start with the simplest case: suppose $X = N\cup Y\cup Z$ with $N, Y$ and $Z$ pairwise disjoint, $\mu(Y) > 0$, $\mu(Z) > 0$, and $\mu(N) = 0$, and suppose $f(y) = a$ for all $y\in Y$ and $f(z) = b$ for all $z\in Z$, with $a\neq b$.
Then, again using $\mu(N) = 0$, the left hand side is \begin{align*} \exp\left(\int_X f(x)d\mu(x)\right) &= \exp\left(\int_Y a d\mu(x) + \int_Z b d\mu(x)\right)\newline &= \exp\left(a\int_Y d\mu(x) + b\int_Z d\mu(x)\right)\newline &= \exp(a\mu(Y) + b\mu(Z)), \end{align*} whereas the right hand side is \begin{align*} \int_X e^{f(x)}d\mu(x) &= \int_Y e^a d\mu(x) + \int_Z e^b d\mu(x)\newline &= e^a \int_Y d\mu(x) + e^b \int_Z d\mu(x)\newline &= e^a \mu(Y) + e^b \mu(Z). \end{align*}
So...? I'm not sure how to show $a\neq b$ implies $ \exp(a\mu(Y) + b\mu(Z)) < e^a \mu(Y) + e^b \mu(Z).$
My guess is it's something simple I'm missing, but I don't really see how to proceed.
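As a sanity check (not a proof), here is a quick numerical comparison of the two sides for a few illustrative choices of $a \neq b$ and $\mu(Y)$; the strict inequality appears to hold every time, sometimes only barely:

```python
import math

# Quick numerical check (not a proof) for the two-valued case:
# compare exp(a*muY + b*muZ) against exp(a)*muY + exp(b)*muZ,
# where muZ = 1 - muY.  The triples below are arbitrary examples.
for a, b, muY in [(2.0, -1.0, 0.3), (0.1, 0.0, 0.9), (-0.5, 0.5, 0.25)]:
    muZ = 1.0 - muY
    lhs = math.exp(a * muY + b * muZ)
    rhs = math.exp(a) * muY + math.exp(b) * muZ
    print(f"a={a}, b={b}, mu(Y)={muY}: {lhs:.6f} < {rhs:.6f} -> {lhs < rhs}")
```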
Edit: I suppose I can add some of the fiddling around I've done.
We know $\mu(Z) = 1 - \mu(Y)$, so the original inequality can be written as $ \exp(a\mu(Y) + b(1-\mu(Y))) \leq e^a \mu(Y) + e^b (1-\mu(Y)), $ which means $ \exp((a-b)\mu(Y) + b) \leq (e^a - e^b)\mu(Y) + e^b.$ Hence, since $ \exp((a-b)\mu(Y) + b) = \exp((a-b)\mu(Y))e^b,$ dividing both sides by $e^b$ gives $ \exp((a-b)\mu(Y)) \leq (e^{a-b} - 1)\mu(Y) + 1.$
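As a plausibility check on this reduction, take (say) $a = 1$, $b = 0$ and $\mu(Y) = \tfrac{1}{2}$: then $ \exp((a-b)\mu(Y)) = e^{1/2} \approx 1.649, $ while $ (e^{a-b} - 1)\mu(Y) + 1 = \tfrac{e-1}{2} + 1 \approx 1.859, $ so the inequality is indeed strict there.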
Edit some more:
I suppose one could look at this last inequality as a function of $a-b$, or, to make things neater, as a function of $\exp(a-b)$, so let $x = \exp(a-b)$. Then we have $ x^{\mu(Y)} \leq (x - 1)\mu(Y) + 1.$ We know that $a = b$ (i.e., $x = 1$) gives equality, so we can compare the derivatives of the two sides. The derivative of the left hand side is $ \mu(Y)x^{\mu(Y) - 1},$ whereas the derivative of the right hand side is simply $\mu(Y)$. Note that $0 < \mu(Y) < 1$, since $\mu(Y)$ and $\mu(Z) = 1 - \mu(Y)$ are both positive. So for $x > 1$ we have $x^{\mu(Y)-1} < 1$: the left hand side grows strictly more slowly than the right, and the two sides can never be equal again. For $x < 1$ we have $x^{\mu(Y)-1} > 1$, so the left hand side climbs strictly faster as $x$ approaches $1$, and since the two sides meet at $x = 1$, the left hand side must lie strictly below the right for $x < 1$ as well. Hence $x \neq 1$ (i.e., $a \neq b$) forces strict inequality.
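As a further check (a sketch only; the variable names $x$ and $t$ are mine, with $t$ standing in for $\mu(Y)$), a computer algebra system confirms that the difference of the two sides vanishes at $x = 1$ and has the derivative used above:

```python
import sympy as sp

# x plays the role of exp(a - b), t the role of mu(Y), with 0 < t < 1.
x, t = sp.symbols('x t', positive=True)

# Difference (right hand side) - (left hand side) of  x**t <= (x - 1)*t + 1.
g = (x - 1)*t + 1 - x**t

print(g.subs(x, 1))    # 0: the two sides agree at x = 1, i.e. a = b
print(sp.diff(g, x))   # t - t*x**(t - 1): positive for x > 1, negative for x < 1 when 0 < t < 1
print(g.subs({x: 2, t: sp.Rational(1, 2)}))   # 3/2 - sqrt(2) > 0: strict at a sample point x != 1
```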
I think this works, though I'll sit on it and think a little more about it (and its generalization to all non-constant functions on $X$).