
There are three statements:

  1. $(X_1,X_2)$ is a uniform random vector over some subset $S \subset \mathbb{R}^2$,
  2. $X_1$ and $X_2$ are two uniform random variables over some $S_1 \subset \mathbb{R}$ and $S_2 \subset \mathbb{R}$ respectively,
  3. $X_1$ and $X_2$ are two independent random variables.

where $S_1$ and $S_2$ are the projections of $S$ onto the coordinate axes of $X_1$ and $X_2$, respectively.

  1. Do any two of the three statements imply the remaining one? I know that 2 and 3 together imply 1 (a short sketch follows this list), and I suspect the other implications also hold.
  2. What is a necessary (and sufficient) condition for 1 and 2 to imply each other? Is it 3, or some requirement on $S$?
  3. Also, can the implications that do hold be generalized to finitely many random variables $(X_1, X_2, \dots, X_n)$, or even to infinitely many (countably or uncountably many) random variables? Does 3 then need to be strengthened to mutual independence, or does pairwise independence suffice?
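
Here is a minimal sketch of the implication I am taking for granted, assuming $S_1$ and $S_2$ have finite positive Lebesgue measure $|S_1|$ and $|S_2|$: if $X_1$ and $X_2$ are independent with densities $f_1 = \frac{1}{|S_1|}\mathbf{1}_{S_1}$ and $f_2 = \frac{1}{|S_2|}\mathbf{1}_{S_2}$, then their joint density is
$$f_{X_1,X_2}(x_1,x_2) = f_1(x_1)\,f_2(x_2) = \frac{1}{|S_1|\,|S_2|}\,\mathbf{1}_{S_1\times S_2}(x_1,x_2),$$
which is constant on $S_1\times S_2$ and zero elsewhere, so $(X_1,X_2)$ is uniform over $S = S_1\times S_2$, and the projections of this $S$ are exactly $S_1$ and $S_2$.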

Thanks!

  • If 3. is true, then I think 1. and 2. are equivalent. Otherwise neither 1. nor 2. implies the other, nor do they (even taken together) imply 3. (2012-10-25)
  • @mjqxxxx: How do you show that 1 and 3 imply 2? Why do 1 and 2 not imply 3? (2012-10-25)
  • @Tim: If they are independent random variables, then the joint density is the product of the marginals. In other words, if the joint density is uniform and the independence condition holds, it must be generated by two uniform marginals. (2012-10-25)
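
As a sketch of why 1 and 2 together do not imply 3 (the claim in the first comment), assuming $(X_1,X_2)$ has a density with respect to Lebesgue measure, consider $S = [0,1]^2 \cup [1,2]^2$, two unit squares meeting at a corner, and let $(X_1,X_2)$ be uniform over $S$. The area of $S$ is $2$, and for almost every $x_1 \in [0,2]$ the vertical slice of $S$ has length $1$, so
$$f_{X_1}(x_1) = \tfrac{1}{2}\,\mathbf{1}_{[0,2]}(x_1), \qquad f_{X_2}(x_2) = \tfrac{1}{2}\,\mathbf{1}_{[0,2]}(x_2),$$
i.e. both marginals are uniform over the projections $S_1 = S_2 = [0,2]$, so 1 and 2 hold. Yet
$$P(X_1 \le 1,\ X_2 > 1) = 0 \ne \tfrac{1}{4} = P(X_1 \le 1)\,P(X_2 > 1),$$
so $X_1$ and $X_2$ are not independent and 3 fails.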

1 Answer