
The following question is supposed to be easy, but I cannot think of a good example: show that there exist real-valued random variables $X_1, X_2, Y_1, Y_2$ such that the following holds.

  1. $P_{X_1}=P_{X_2}, P_{Y_1}=P_{Y_2}$,
  2. $P_{X_1+Y_1} = P_{X_2+Y_2}$,
  3. $X_2$ and $Y_2$ are independent,
  4. $X_1$ and $Y_1$ are not independent.

I tried working with $\Omega=[0,1]^2$ and the Lebesgue measure, but have not found an example that works.

2 Answers


Let $U$, $V$, $W$ be three independent Cauchy-distributed random variables. If we set $$X_1 := Y_1 := U, \qquad X_2 := V, \qquad Y_2 := W,$$ then properties 1, 3, and 4 are automatically satisfied. Show (e.g. using characteristic functions) that property 2 also holds, i.e. that $$X_1 + Y_1 = 2U$$ has the same distribution as $$X_2+Y_2 = V+W.$$
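As a quick sanity check of property 2 (not a proof), one can compare empirical CDFs by simulation. The sketch below uses only the Python standard library, sampling standard Cauchy variates via the inverse CDF $F^{-1}(u) = \tan(\pi(u - 1/2))$, and compares both sums against the CDF of a Cauchy distribution with scale $2$, $F(t) = \tfrac12 + \arctan(t/2)/\pi$:

```python
import math
import random

random.seed(0)

def cauchy():
    # Standard Cauchy via inverse-CDF sampling: F^{-1}(u) = tan(pi*(u - 1/2))
    return math.tan(math.pi * (random.random() - 0.5))

n = 200_000
sum_same = [2 * cauchy() for _ in range(n)]          # X1 + Y1 = 2U
sum_indep = [cauchy() + cauchy() for _ in range(n)]  # X2 + Y2 = V + W

def ecdf(sample, t):
    # Fraction of the sample at or below t
    return sum(1 for s in sample if s <= t) / len(sample)

def cauchy2_cdf(t):
    # CDF of a Cauchy distribution with location 0 and scale 2,
    # the claimed common distribution of both sums
    return 0.5 + math.atan(t / 2) / math.pi

for t in (-4, -1, 0, 1, 4):
    print(t, round(ecdf(sum_same, t), 3),
          round(ecdf(sum_indep, t), 3),
          round(cauchy2_cdf(t), 3))
```

Both empirical CDFs should track $F(t)$ closely at every checkpoint, consistent with $2U$ and $V+W$ sharing the Cauchy$(0,2)$ distribution.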

  • Not sure this will be enough for the OP, who now has to find $P_U$ such that the distributions of $2U$ and $U+V$ coincide. Yes, it suffices to consider $U$ with $______$ distribution, but... (2017-01-25)
  • @Did Yeah, you are probably right. (2017-01-25)
  • Ok, this might be a solution. But isn't there an easier example that uses more common distributions than the Cauchy distribution? (2017-01-25)
  • @user406143 The Cauchy distribution is a pretty common distribution; you can write down the density as well as the characteristic function explicitly. If you are lucky, someone else will write another answer to your question and provide a simpler example. (2017-01-25)
  • Cauchy and Dirac are the only solutions, as can be shown by fiddling with the characteristic function of any solution. (2017-01-25)
  • @Did But the Dirac distribution is not a solution to this problem, or am I missing something? If both $X_1$ and $Y_1$ are Dirac distributed, they are automatically independent (contradicting property 4). (2017-01-25)
  • Indeed, Dirac distributions solve $2U = U+V$ in distribution, but they are not solutions of the OP's problem because they contradict property 4 (property 3 is fine). Sorry for the ambiguous previous comment. (2017-01-25)

This answer draws on the answers to a related question, *Is joint normality a necessary condition for the sum of normal random variables to be normal?*, which I asked (and answered) on stats.SE.

  • Let $X_2$ and $Y_2$ denote two independent standard normal random variables. Then, it is well-known that $X_2$ and $Y_2$ are jointly normal random variables and that $X_2+Y_2 \sim N(0,2)$.
  • Let $X_1$ and $Y_1$ denote two jointly continuous random variables whose joint density has value $2\phi(x)\phi(y)$ on the shaded regions shown in the diagram below (borrowed from one of the answers to the question cited above). Here, $\phi(\cdot)$ denotes the standard normal density. Note that $X_1$ and $Y_1$ are dependent random variables.

$\hspace{1.5 cm}$*(Figure: the shaded regions of the $(x,y)$-plane on which the joint density equals $2\phi(x)\phi(y)$; borrowed from the answer cited above.)*

Since $\phi(\cdot)$ is an even function of its argument, we have that for $x > 0$, \begin{align} f_{X_1}(x) &= \int_{-\infty}^{-x} 2\phi(x)\phi(y) \,\mathrm dy + \int_{0}^{x} 2\phi(x)\phi(y) \,\mathrm dy\\ &=\phi(x)\left[\int_{-\infty}^{-x} 2\phi(y) \,\mathrm dy + \int_{0}^{x} 2\phi(y) \,\mathrm dy\right]\\ &=\phi(x)\left[\int_{-\infty}^{-x} \phi(y) \,\mathrm dy + \int_{x}^{\infty} \phi(y) \,\mathrm dy + \int_{0}^{x} \phi(y) \,\mathrm dy + \int_{-x}^{0} \phi(y) \,\mathrm dy\right]\\ &= \phi(x)\left[\int_{-\infty}^{\infty} \phi(y) \,\mathrm dy \right]\\ &= \phi(x) \end{align} and similarly for $x < 0$. Thus $X_1 \sim N(0,1)$, and similar calculations show $Y_1 \sim N(0,1)$ as well. Note that $X_1$ and $Y_1$ are *not* jointly normal random variables. Nonetheless, the same kinds of calculations exploiting the symmetry of $\phi(\cdot)$ show that $X_1+Y_1 \sim N(0,2)$.
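The construction can be checked by simulation. The sketch below reconstructs the support from the marginal computation above (for $x > 0$ the density lives on $y < -x$ and $0 < y < x$); the $x < 0$ half is *assumed* to be the point reflection $(x,y) \mapsto (-x,-y)$ of that set, consistent with the symmetry argument. Since iid standard normals land in this region with probability $1/2$, restricting them to it by rejection yields exactly the density $2\phi(x)\phi(y)$ there:

```python
import random
import statistics

random.seed(0)

def in_region(x, y):
    # Support read off from the marginal computation: for x > 0 the density
    # lives on y < -x or 0 < y < x; the x < 0 half is assumed to be the
    # point reflection (x, y) -> (-x, -y) of that set.
    if x > 0:
        return y < -x or 0 < y < x
    if x < 0:
        return y > -x or x < y < 0
    return False

# Rejection sampling: iid N(0,1) pairs restricted to the region have
# density 2*phi(x)*phi(y) there, because the region has probability 1/2.
n = 100_000
pairs = []
while len(pairs) < n:
    x, y = random.gauss(0, 1), random.gauss(0, 1)
    if in_region(x, y):
        pairs.append((x, y))

xs = [x for x, _ in pairs]
sums = [x + y for x, y in pairs]

print(round(statistics.mean(xs), 2))        # ~0: X1 has mean 0
print(round(statistics.variance(xs), 2))    # ~1: X1 has variance 1
print(round(statistics.variance(sums), 2))  # ~2: X1 + Y1 has variance 2

# Dependence: the joint density vanishes on the strip 0 < x < 1, y > 1,
# where independent N(0,1) variables would put positive mass.
print(sum(1 for x, y in pairs if 0 < x < 1 and y > 1))
```

The empty strip count makes the dependence of $X_1$ and $Y_1$ concrete: no sample ever falls where the product density $\phi(x)\phi(y)$ would be strictly positive.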

To summarize,

  1. $X_1, X_2, Y_1, Y_2$ all are standard normal random variables,
  2. $X_1+Y_1$ and $X_2+Y_2$ are zero-mean normal random variables with variance $2$,
  3. $X_2$ and $Y_2$ are independent random variables, and
  4. $X_1$ and $Y_1$ are dependent random variables.