Consider the following three statements:
- $(X_1,X_2)$ is a uniform random vector over some subset $S \subset \mathbb{R}^2$,
- $X_1$ and $X_2$ are two uniform random variables over some $S_1 \subset \mathbb{R}$ and $S_2 \subset \mathbb{R}$ respectively,
- $X_1$ and $X_2$ are two independent random variables.
Here $S_1$ and $S_2$ are the projections of $S$ onto the coordinate axes corresponding to $X_1$ and $X_2$, respectively.
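To be explicit (this is just the definition above written in symbols), I mean
$$
S_1 = \{\, x_1 \in \mathbb{R} : (x_1, x_2) \in S \text{ for some } x_2 \in \mathbb{R} \,\}, \qquad
S_2 = \{\, x_2 \in \mathbb{R} : (x_1, x_2) \in S \text{ for some } x_1 \in \mathbb{R} \,\}.
$$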
- I was wondering whether any two of the three statements imply the third. I know that 2 and 3 imply 1 (with $S = S_1 \times S_2$; see the sketch after this list), and I suspect the other implications are also true.
- What is a necessary (and sufficient) condition for 1 and 2 to imply each other? Is it statement 3, or some requirement on $S$?
- Also, I wonder whether the implications that do hold can be generalized to finitely many random variables $(X_1, X_2, \dots, X_n)$, or even to infinitely many (countably or uncountably many) random variables. Does statement 3 then need to require mutual independence, or does pairwise independence suffice?
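For reference, here is the sketch I have in mind for the implication $2 \wedge 3 \Rightarrow 1$, assuming $S_1$ and $S_2$ have finite, positive Lebesgue measure (written $|S_1|$, $|S_2|$) so that the uniform densities exist:
$$
f_{X_1,X_2}(x_1,x_2)
= f_{X_1}(x_1)\, f_{X_2}(x_2)
= \frac{1}{|S_1|}\cdot\frac{1}{|S_2|}
= \frac{1}{|S_1 \times S_2|}
\qquad \text{for } (x_1,x_2) \in S_1 \times S_2,
$$
where the first equality is independence (statement 3) and the second uses the uniform marginals (statement 2). The joint density is therefore constant on $S = S_1 \times S_2$ and zero outside it, so $(X_1,X_2)$ is uniform on $S$, and the projections of this $S$ are exactly $S_1$ and $S_2$.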
Thanks!