
This problem can be found here; it is a past prelim exam problem from UT Austin.

Let $X$ and $Y$ be two independent random variables with $X+Y \in L^1$. Show that $X\in L^1$.

Generally, in real analysis, $f+g\in L^1$ does not imply $f\in L^1$ (e.g. $f(x)=1/x$ and $g(x)=-1/x$ on $(0,1)$: the sum is $0\in L^1$, but $f\notin L^1$), so I guess this must have something to do with their independence.

I guess it might be something like $EX=E(X+Y\mid Y=y)-y$, but I'm not sure whether I can write that without first knowing $X\in L^1$ or $Y\in L^1$.

Or should it be proved in another way?

Could you please help? Thanks.

  • I might be wrong, since I'm very rusty at probability, but I think you should express the density of $X+Y$ as a convolution of the densities of $X$ and $Y$, and then apply Fubini. (2012-12-23)
  • @tomasz: they do not necessarily have densities, but you can do something similar. (2012-12-23)

1 Answer


Recall that $X$ and $Y$ are independent if and only if the joint distribution $P_{X,Y}$ is the product measure $P_X \otimes P_Y$. Since $X + Y \in L^1$, we can apply Fubini's theorem (Tonelli's theorem, in fact, since the integrand is nonnegative): $$\int |x + y| \, dP_{X,Y}(x,y) = \int |x + y| \, d(P_X \otimes P_Y)(x,y) = \int \left( \int |x + y| \, dP_X(x) \right) dP_Y(y) < \infty,$$ so for $P_Y$-almost every $y$ the inner integral is finite, i.e. $|X + y|$ is in $L^1$, and it is not hard to see that this implies that $X$ itself is in $L^1$ (the last step is spelled out below).
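To spell out that last step, here is a minimal sketch (not the only way to finish). Fix any $y_0$ for which the inner integral is finite; such a $y_0$ exists because the set of good $y$ has full $P_Y$-measure, and a set of full measure is nonempty. Then the triangle inequality $|x| \le |x + y_0| + |y_0|$ gives $$E|X| = \int |x| \, dP_X(x) \le \int |x + y_0| \, dP_X(x) + |y_0| < \infty,$$ so $X \in L^1$.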

  • Could you explain $dP_{X,Y}$? Should $X$ and $Y$ be in the same probability space, with the integral being $\int|X(\omega)+Y(\omega)| \, dP(\omega)$? Thanks. (2012-12-23)
  • @Roun: suppose you could prove the result when the probability space is $[0, 1] \times [0, 1]$, $X$ depends only on the first coordinate and $Y$ depends only on the second, with $P_{X, Y}$ equal to the Lebesgue product measure. If the result holds in that case, would you be able to extend it to an arbitrary probability space? When we are studying probability theory, we are often able to justifiably abstract away the underlying probability space, in that our results depend only on the properties of the random variables; so if we can prove it for one space, we can prove it for all. (2012-12-23)
  • @guy: Thanks for your comments. I'm not familiar enough with how measure theory is used in probability theory. Is it the case that we can associate every random variable (or random vector) on a probability space with a derived measure? And when $X$ and $Y$ are independent, the derived measure for the random vector $(X,Y)$ happens to be the product measure of the derived measures of $X$ and $Y$? (2012-12-23)
  • Yes, that's true. Each random variable induces a pushforward measure on $(\mathbb R, \mathcal B)$, and the product of two pushforward measures on $\mathbb R^2$ can be used to model independent random variables, where $X$ and $Y$ are now coordinate functions. This is slightly different from what I was suggesting, but I think it works the same. (A formal statement of these definitions is collected after this thread.) (2012-12-23)
  • @guy: OK, I see, and the pushforward measure is called the distribution, right? Thank you so much. :) (2012-12-23)
  • @Roun: whenever we talk about two random variables, there is assumed to be some probability space in the background, which comes equipped with a joint distribution of $X$ and $Y$. This is the $dP$ you are used to seeing. In terms of measure theory, independence says that the joint distribution $P_{X,Y}$ decomposes as a product measure, where each factor is the marginal distribution of one of the variables. There is always the canonical representation of this measure that guy mentioned (which we can work with here if we want to) on the Lebesgue space induced by the distribution function. (2012-12-23)
  • @ChrisJanjigian: Thank you, I got it. :) (2012-12-24)
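
To make the measure-theoretic setup discussed in the comments explicit, here is a minimal summary; nothing beyond the standard definitions is assumed. The pushforward measure (distribution) of $X$ is $$P_X(B) = P(X^{-1}(B)) = P(X \in B), \qquad B \in \mathcal B(\mathbb R),$$ and independence of $X$ and $Y$ is exactly the statement that the joint distribution factors on rectangles: $$P_{X,Y}(A \times B) = P(X \in A,\, Y \in B) = P_X(A)\,P_Y(B) = (P_X \otimes P_Y)(A \times B), \qquad A, B \in \mathcal B(\mathbb R),$$ which, by uniqueness of measures agreeing on a $\pi$-system, forces $P_{X,Y} = P_X \otimes P_Y$ on all of $\mathcal B(\mathbb R^2)$.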