
I am trying to solve the following problem:

Let $(\Omega, \mathbb{A}, \mathbb{P})$ be a probability space and $X_1, X_2, \ldots, X_n$ independent real random variables. Prove that the sum $X_1 + X_2 + \cdots + X_n$ is $\mathbb{P}$-almost surely constant iff each $X_i$ is $\mathbb{P}$-almost surely constant.

Do you have any ideas or hints on how to tackle this problem? Thanks.

3 Answers


It suffices to show: if $X,Y$ are independent random variables and $X+Y=c$ almost surely for some constant $c$, then $X$ is almost surely constant.

Let $\xi,\eta \in \mathbb{R}$. Since $X+Y=c$ almost surely, $e^{\imath \, \eta (X+Y)} = e^{\imath \, \eta c}$ almost surely; combining this with the independence of $X$ and $Y$, we have

$\mathbb{E}e^{\imath \, (X,X+Y) \cdot (\xi,\eta)} = \mathbb{E}e^{\imath \, \xi \cdot X} \cdot e^{\imath \, \eta c} = \mathbb{E}e^{\imath \, \xi \cdot X} \cdot \mathbb{E}e^{\imath \, (X+Y) \cdot \eta} = \mathbb{E}e^{\imath \, \xi \cdot X} \cdot \mathbb{E}e^{\imath \, \eta \cdot X} \cdot \mathbb{E}e^{\imath \, \eta \cdot Y}.$

On the other hand, by the independence of $X$ and $Y$,

$\mathbb{E}e^{\imath \, (X,X+Y) \cdot (\xi,\eta)} = \mathbb{E}e^{\imath \, ((\xi+\eta) X + \eta Y)} = \mathbb{E}e^{\imath \, (\xi+\eta) \cdot X} \cdot \mathbb{E}e^{\imath \, \eta \cdot Y}.$

Since $\mathbb{E}e^{\imath \, \eta \cdot X} \cdot \mathbb{E}e^{\imath \, \eta \cdot Y} = \mathbb{E}e^{\imath \, (X+Y) \cdot \eta} = e^{\imath \, \eta c} \not= 0$, we have $\mathbb{E}e^{\imath \, \eta \cdot Y} \not= 0$ for all $\eta$. Cancelling this factor, we obtain from the two displayed equations

$\mathbb{E}e^{\imath \, (X,X) \cdot (\xi,\eta)} = \mathbb{E}e^{\imath \, (\xi+\eta) \cdot X} = \mathbb{E}e^{\imath \, \xi \cdot X} \cdot \mathbb{E}e^{\imath \, \eta \cdot X}.$

This means that $X$ is independent of itself; hence $\mathbb{P}(X \le t) = \mathbb{P}(X \le t)^2 \in \{0,1\}$ for every $t$, and therefore $X$ is almost surely constant.

Remark: In the last step we used the following theorem: two random variables $U,V$ are independent $\Leftrightarrow$ $\forall \xi,\eta: \mathbb{E}e^{\imath (U,V) \cdot (\xi,\eta)} = \mathbb{E}e^{\imath \, U \cdot \xi} \cdot \mathbb{E}e^{\imath \, V \cdot \eta}$.
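
Not part of the proof, but the key identity $\mathbb{E}e^{\imath \, (\xi+\eta) \cdot X} = \mathbb{E}e^{\imath \, \xi \cdot X} \cdot \mathbb{E}e^{\imath \, \eta \cdot X}$ is easy to check numerically: it holds when $X$ is degenerate and fails for a non-degenerate $X$. Here is a minimal sketch using NumPy; the distributions, frequencies and sample size are arbitrary choices, and the empirical characteristic function carries Monte Carlo error.

```python
import numpy as np

rng = np.random.default_rng(0)

def ecf(sample, t):
    """Empirical characteristic function E[exp(i t X)] from a sample."""
    return np.mean(np.exp(1j * t * sample))

xi, eta = 0.7, -1.3          # arbitrary test frequencies
n = 200_000                  # arbitrary sample size

samples = {
    "constant X": np.full(n, 2.0),           # degenerate: factorization holds
    "uniform X": rng.uniform(0.0, 1.0, n),   # non-degenerate: it fails
}

for name, x in samples.items():
    lhs = ecf(x, xi + eta)
    rhs = ecf(x, xi) * ecf(x, eta)
    print(f"{name:10s}  |lhs - rhs| = {abs(lhs - rhs):.4f}")
```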


Another approach: let $S_n := \sum_{j=1}^n X_j$. We have $0=\mathbb{V}S_n= \sum_{j=1}^n \mathbb{V}X_j$, which implies $\mathbb{V}X_j=0$ and therefore $X_j = \mathbb{E}X_j$ a.s. The catch is that one first has to show $X_j \in L^2$ in order to justify these calculations.
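
For what it's worth, the identity $\mathbb{V}S_n = \sum_j \mathbb{V}X_j$ for independent, square-integrable $X_j$ is also easy to verify by simulation; a minimal sketch (the three distributions below are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)
n = 500_000  # sample size (arbitrary)

# Three independent, square-integrable random variables.
x1 = rng.normal(0.0, 1.0, n)
x2 = rng.exponential(2.0, n)
x3 = rng.uniform(-1.0, 1.0, n)

s = x1 + x2 + x3
# Var(S) should match Var(X1) + Var(X2) + Var(X3) up to Monte Carlo error.
print(np.var(s), np.var(x1) + np.var(x2) + np.var(x3))
```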

  • @takecare I don't see a nice way to prove this; however, for the first approach (i.e. using characteristic functions) we don't need square integrability. (2017-01-19)

I will prove the $\implies$ part; it suffices to treat two independent random variables $X,Y$ with $X+Y$ constant a.s. Since $\mathbb{R}=\bigcup_n [n,n+1]$ and $P(X\in\mathbb{R})=1$, we can find an interval $I_1$ with $P(X\in I_1)>0$. Halving $I_1$ and keeping a half of positive probability, we find a closed interval $I_2\subset I_1$ with $P(X\in I_2)>0$, and by induction a nested sequence of closed intervals $I_n$ with lengths tending to $0$ and $P(X\in I_n)>0$ for all $n$. By construction $\bigcap_n I_n$ is a single point, call it $a$, and $P(X=a)=\lim_n P(X\in I_n)$ by continuity from above. If $X$ is not constant a.s., then $P(X=a)<1$, so we can find $n_0$ such that $0<P(X\in I_{n_0})<1$. Now since $X+Y$ is equal to a constant $c$ a.s., we have $P(X\in I_{n_0},\, Y\notin c-I_{n_0})=0$; by independence $P(X\in I_{n_0})\,P(Y\notin c-I_{n_0})=0$, and since $P(X\in I_{n_0})>0$ this forces $P(Y\notin c-I_{n_0})=0$. Similarly, $P(X\notin I_{n_0},\, Y\in c-I_{n_0})=0$ together with $P(X\notin I_{n_0})>0$ gives $P(Y\in c-I_{n_0})=0$. But these two probabilities sum to $1$, a contradiction.
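
As a toy illustration of the halving construction (not needed for the proof): the sketch below repeatedly bisects an interval, always keeping a half of positive empirical mass. For a law with an atom, choosing the heavier half makes the nested intervals shrink to that atom. The mixture law, starting interval, and number of steps are arbitrary choices.

```python
import numpy as np

rng = np.random.default_rng(2)

# X: an atom at 3.0 with mass 0.6, mixed with Uniform(0, 10) noise.
n = 100_000
is_atom = rng.random(n) < 0.6
x = np.where(is_atom, 3.0, rng.uniform(0.0, 10.0, n))

lo, hi = 0.0, 10.0            # I_1, chosen so that P(X in I_1) > 0
for _ in range(40):           # bisect, keeping a half of positive mass
    mid = (lo + hi) / 2
    left = np.mean((lo <= x) & (x <= mid))
    right = np.mean((mid < x) & (x <= hi))
    lo, hi = (lo, mid) if left >= right else (mid, hi)

print(lo, hi)                 # the nested intervals close in on the atom at 3.0
```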

  • Thanks a lot for your comments. Got it. (2012-11-25)

Writeup of an alternative proof, discussed here:

It is sufficient to look at the case of two random variables; the general statement follows by induction. Since the variables are independent, their joint law is the product measure, so we may write the probability as a double integral.

Let $X,Y$ be independent and $X+Y \equiv c \ \mathbb{P}$-a.s. for some $c \in \mathbb{R}$: $\begin{align*} 1 = \mathbb{P}(X + Y = c) &= \mathbb{P}(X = c-Y) \\ &= \left( \mathbb{P}^{X} \otimes \mathbb{P}^{Y}\right) \left(\{(x,y) \in \mathbb{R}^2 \mid x=c-y\} \right)\\ &= \int \int \mathbb{1}_{\{(x,y) \mid x=c-y\}} \, d\mathbb{P}^{X} \, d\mathbb{P}^{Y} \\ &= \int_{\mathbb{R}} \int_{\{c-y\}} d\mathbb{P}^{X} \, d\mathbb{P}^{Y}(y)\\ &= \int_{\mathbb{R}} \mathbb{P}(X = c-y) \, d\mathbb{P}^{Y}(y)\\ &\leq \sup_{x \in \mathbb{R}} \mathbb{P}(X = x). \end{align*}$

Hence $\sup_{x} \mathbb{P}(X=x) = 1$. Since two distinct points cannot both carry probability greater than $\tfrac12$, the supremum is attained: there is some $x_0$ with $\mathbb{P}(X = x_0) = 1$, i.e. $X$ is a.s. constant, and then $Y = c - X$ is a.s. constant as well.
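
A discrete sanity check of the final inequality $\mathbb{P}(X+Y=c) \le \sup_x \mathbb{P}(X=x)$; the two laws below are arbitrary example choices:

```python
# X, Y independent discrete random variables with the laws below.
px = {0: 0.5, 1: 0.3, 2: 0.2}   # law of X
py = {0: 0.6, 1: 0.4}           # law of Y

def p_sum(c):
    """P(X + Y = c) via the convolution sum over x of P(X = x) P(Y = c - x)."""
    return sum(p * py.get(c - x, 0.0) for x, p in px.items())

sup_px = max(px.values())       # sup_x P(X = x)
for c in range(4):
    print(c, round(p_sum(c), 3), "<=", sup_px, p_sum(c) <= sup_px + 1e-12)
```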