This problem can be found here; it is a problem from a previous prelim exam at UT Austin.
Let $X$ and $Y$ be two independent random variables with $X+Y \in L^1$. Show that $X\in L^1$.
In general, in real analysis, $f+g\in L^1$ does not imply $f\in L^1$, so I suspect this must have something to do with the independence of $X$ and $Y$.
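For instance (unless I'm mistaken), take $X$ standard Cauchy and $Y=-X$:
$$X+Y\equiv 0\in L^1,\qquad\text{but}\qquad E|X|=\int_{\mathbb{R}}\frac{|x|}{\pi(1+x^2)}\,dx=\infty,$$
and of course $X$ and $Y$ are not independent here, which is why I think independence must be the key assumption.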
I guess it might be something like $EX = E(X+Y \mid Y=y) - y$, but I'm not sure whether I can write that without first knowing $X\in L^1$ or $Y\in L^1$.
Or should it be proved in another way?
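For example, writing $P_X$ and $P_Y$ for the laws of $X$ and $Y$, I wonder whether independence (so that the joint law is $P_X\otimes P_Y$) lets one avoid conditional expectations entirely and argue via Tonelli's theorem, roughly as follows, though I'm not sure every step is justified:
$$\infty > E|X+Y| = \int_{\mathbb{R}}\int_{\mathbb{R}} |x+y|\,P_X(dx)\,P_Y(dy),$$
so $\int_{\mathbb{R}} |x+y|\,P_X(dx) < \infty$ for $P_Y$-almost every $y$, and then, fixing one such $y$,
$$E|X| = \int_{\mathbb{R}} |x|\,P_X(dx) \le \int_{\mathbb{R}} |x+y|\,P_X(dx) + |y| < \infty.$$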
Could you please help? Thanks.