For independent Gamma random variables $G_1, G_2 \sim \Gamma(n,1)$, $\frac{G_1}{G_1+G_2}$ is independent of $G_1+G_2$. Does this imply that $G_1+G_2$ is independent of $G_1-G_2$? Thanks!
For independent Gamma random variables $G_1, G_2 \sim \Gamma(n,1)$, is $G_1+G_2$ independent of $G_1-G_2$?
3 Answers
Since a rigorous proof as well as a simulation study have already been provided, I thought I'd throw in a slightly $\textbf{heuristic}$ argument for why they are not independent.
Given a realisation of $G_1-G_2$, set $Y=|G_1-G_2|$. Then we know that $G_1\geq Y$ or $G_2\geq Y$, so the conditional distribution of $G_1+G_2 \mid Y$ is supported on $[Y,\infty)$. In contrast, the unconditional distribution of $G_1+G_2$ puts mass on all of $\mathbb{R}_+$.
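This support argument can be tightened into an identity (my addition, not part of the original answer): since $\max(G_1,G_2)+\min(G_1,G_2)=G_1+G_2$ and $\max(G_1,G_2)-\min(G_1,G_2)=|G_1-G_2|$, we have
$$G_1+G_2 = |G_1-G_2| + 2\min(G_1,G_2) \;\geq\; |G_1-G_2| = Y,$$
so conditional on $Y$ the sum can never fall below $Y$, while unconditionally $P(G_1+G_2 < y) > 0$ for every $y>0$.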
Is this not rigorous? – 2017-01-16
Let $X_1=G_1+G_2$ and $X_2=G_1-G_2$. \begin{align*} M_{X_1,X_2}(t_1,t_2) &= E[\exp(t_1(G_1+G_2)+t_2(G_1-G_2))] \\ &= E[\exp((t_1+t_2)G_1+(t_1-t_2)G_2)] \\ &= E[\exp((t_1+t_2)G_1)]E[\exp((t_1-t_2)G_2)] \\ &= (1-(t_1+t_2))^{-n}(1-(t_1-t_2))^{-n} \\ &= (1-2t_1+t_1^2-t_2^2)^{-n} \end{align*} Note that $M_{X_1,X_2}(t_1,t_2)$ cannot be factored as $g(t_1)h(t_2)$. Therefore, $X_1$ and $X_2$ are not independent.
If you want to be a little more formal, $X_1 \sim \text{Gamma}(2n,1)$ and, therefore, $M_{X_1}(t_1) = (1-t_1)^{-2n}$. Also, \begin{align*} M_{X_2}(t_2) &= E[\exp(t_2(G_1-G_2))] \\ &= M_{G_1}(t_2)M_{G_2}(-t_2) \\ &= (1-t_2)^{-n}(1+t_2)^{-n} \end{align*} Since $M_{X_1,X_2}(t_1,t_2) \neq M_{X_1}(t_1)M_{X_2}(t_2)$, conclude that $X_1$ and $X_2$ are not independent.
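As a quick numerical sanity check of the MGF argument (my addition, not part of the original answer), one can evaluate the joint MGF and the product of the marginal MGFs at a single point; here $n=4$ and $t_1=t_2=1/4$ are arbitrary choices, with $|t_1|+|t_2|<1$ so all three MGFs are finite:

```python
# Spot-check: joint MGF of (X1, X2) vs. product of marginal MGFs,
# using the closed forms derived above. n and the evaluation point
# are arbitrary; any point with |t1| + |t2| < 1 works.
n = 4
t1 = t2 = 0.25

joint = (1 - 2*t1 + t1**2 - t2**2) ** (-n)   # M_{X1,X2}(t1, t2)
m_x1 = (1 - t1) ** (-2*n)                    # M_{X1}(t1), X1 ~ Gamma(2n, 1)
m_x2 = ((1 - t2) * (1 + t2)) ** (-n)         # M_{X2}(t2)

print(joint, m_x1 * m_x2)  # 16.0 vs ~12.93 -- unequal, so not independent
```

A single point of disagreement is enough: if $X_1$ and $X_2$ were independent, the two quantities would agree everywhere the MGFs exist.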
No, not independent. Here is a quick simulation in the R statistical software of 100,000 realizations of $X = G_1 + G_2$ and $Y = G_1 - G_2,$ for the case $n = 4.$ You should be able to turn the central point of it into a proof. (Notice that $G_1$ and $G_2$ take only nonnegative values.)
m = 10^5; n = 4
g1 = rgamma(m,n,1); g2 = rgamma(m,n,1)
x = g1 + g2; y = g1 - g2
cor(x,y)
## 0.0009158704 # consistent with uncorrelated
By symmetry, it is no surprise that $X$ and $Y$ are uncorrelated. But for non-normal variables, zero correlation does not imply independence.
So we make a scatterplot of $Y$ against $X,$ from which it is immediately clear that $X$ and $Y$ are not independent. It is clear that $P(X < 5) \approx 0.13 > 0$ and $P(Y > 5) \approx 0.04 > 0,$ but $P(X < 5, Y > 5) = 0.$
This conclusion of association between $X$ and $Y$ agrees with @madprob's elegant proof using MGFs (+1).
mean(x < 5 & y > 5)
## 0
mean(x < 5); mean(y > 5)
## 0.13255
## 0.03955
