- Given a random vector $X: (\Omega, \mathbb{F}, P) \rightarrow (\prod_{i \in I} S_i, \prod_{i \in I} \mathbb{S}_i)$, is each component $X_i$, $i \in I$, of the random vector $X$ always a random variable from $(\Omega, \mathbb{F}, P)$ to $(S_i, \mathbb{S}_i)$?
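My own (possibly naive) reading is that each component is just the composition of $X$ with the coordinate projection $\pi_i$ (the notation $\pi_i$ is mine, not fixed above):
$$X_i = \pi_i \circ X, \qquad \pi_i: \prod_{j \in I} S_j \rightarrow S_i, \quad \pi_i\big((s_j)_{j \in I}\big) = s_i,$$
and since the product $\sigma$-algebra $\prod_{i \in I} \mathbb{S}_i$ is, by definition, the smallest one making every $\pi_i$ measurable, the composition $\pi_i \circ X$ should again be measurable. Please correct me if this is off.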
If yes, I guess there are two ways to define the distribution of each component variable $X_i$, $i \in I$:
- First, the random vector $X$ pushes the measure $P$ on its domain forward to a probability measure $P_X$ on its codomain. Then define $P_{X_i}(A) := P_{X}\big(A \times \prod_{j \in I, j \neq i} S_j\big)$ for all $A \in \mathbb{S}_i$.
- Alternatively, $X_i$ itself induces a probability measure $P'_{X_i}$ on its codomain $(S_i, \mathbb{S}_i)$, namely the pushforward of $P$ under $X_i$.
I was wondering: are $P_{X_i}$ and $P'_{X_i}$ always the same measure on $\mathbb{S}_i$?
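My (possibly flawed) attempt at comparing the two, using the projection notation $\pi_i$ from above: for $A \in \mathbb{S}_i$,
$$P'_{X_i}(A) = P\big(X_i^{-1}(A)\big) = P\Big(X^{-1}\big(\pi_i^{-1}(A)\big)\Big) = P_X\big(\pi_i^{-1}(A)\big) = P_X\Big(A \times \prod_{j \in I, j \neq i} S_j\Big) = P_{X_i}(A),$$
which suggests they always coincide, but I am not sure whether this argument is valid for an arbitrary (possibly uncountable) index set $I$.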
Does the first definition of the marginal probability measures make the product of all the marginal measures equal to the joint measure $P_X$?
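My guess is that this can only hold under independence, not in general. Here is a quick numerical sanity check on a toy discrete joint distribution (the numbers are my own made-up example):

```python
import numpy as np

# Toy joint distribution of (X_1, X_2) on {0,1} x {0,1}.
joint = np.array([[0.4, 0.1],
                  [0.1, 0.4]])

# Marginals obtained by summing out the other coordinate,
# the discrete analogue of P_{X_i}(A) = P_X(A x prod_{j != i} S_j).
p1 = joint.sum(axis=1)        # marginal of X_1
p2 = joint.sum(axis=0)        # marginal of X_2

product = np.outer(p1, p2)    # product of the marginal measures

print(product)                       # every entry is 0.25
print(np.allclose(joint, product))   # False: product of marginals != joint here
```

So, if I am not mistaken, the product of the marginals recovers the joint only when the components are independent.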
Can the above definitions and their relation be generalized to an arbitrary measure space $(\Omega, \mathbb{F}, \mu)$ and a measurable mapping $f: (\Omega, \mathbb{F}) \rightarrow (\prod_{i \in I} S_i, \prod_{i \in I} \mathbb{S}_i)$, so as to define the distribution of each component mapping $f_i$ of $f$?
Does the first definition require $\mu$ to be a probability measure? How should the definition be adjusted when $\mu$ can be any measure? Should one define $\mu_{f_i}$ from $\mu_f$ as $\mu_{f_i}(A_i) := \frac{\mu_f(A_i \times \prod_{j \in I, j \neq i} S_j)}{\prod_{j \in I, j \neq i} \mu_{f_j}(S_j)}$, or as $\mu_{f_i}(A_i) := \mu_f\big(A_i \times \prod_{j \in I, j \neq i} S_j\big)$, for all $A_i \in \mathbb{S}_i$, or something else?
When $\mu$ is an arbitrary measure, if the first definition is not possible, or is possible but the two definitions differ, does the situation improve when $\mu$ is finite, i.e. $\mu(\Omega) < \infty$?
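For what it is worth, my guess is that the composition argument above never uses $P(\Omega) = 1$, so the unnormalized definition should still agree with the direct pushforward $\mu \circ f_i^{-1}$ for an arbitrary measure $\mu$:
$$\mu\big(f_i^{-1}(A_i)\big) = \mu\Big(f^{-1}\big(\pi_i^{-1}(A_i)\big)\Big) = \mu_f\Big(A_i \times \prod_{j \in I, j \neq i} S_j\Big), \qquad A_i \in \mathbb{S}_i,$$
whereas the normalized version already seems problematic when some $\mu_{f_j}(S_j)$ is $0$ or $\infty$. Is this reasoning correct?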
Thanks and regards!