
In a paper I came across the expression "Let $\{X(Y_k) | Y_k\}_{k=0,1,\cdots}$ be mutually independent".

Q: What does this notation mean? Is it some kind of conditional independence, and how should it be defined precisely?

For every $d \in R^n$, $X(d): \Omega \to R$ is a random variable, and each $Y_k: \Omega \to R^n$ is a random vector.

  • @Didier Thanks for finding the better source for the paper :) You write only $y$, but would you still consider $\hat{\theta}_k$ in the paper to be a random variable? (After all, once the initialization for $k=0$ is fixed, random variables turn up in the definition of the subsequent $\hat{\theta}_k$.) [No confusion: the $\hat{\theta}_k$ in the paper plays the role of $Y_k$ here.] 2011-08-17

1 Answer


So each $X(Y_k)$ is the random variable $Z_k$ defined on $\Omega$ by $Z_k(\omega)=X(Y_k(\omega))(\omega)$.
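To make this composition precise, one convenient reading (an assumption on my part, not something stated in the paper) is to treat $X$ as a single jointly measurable map of two arguments, so that plugging $Y_k$ into the first slot and evaluating at the same $\omega$ in the second slot is automatically measurable:

$$X : R^n \times \Omega \to R, \qquad (d,\omega) \mapsto X(d)(\omega),$$

$$Z_k(\omega) = X\bigl(Y_k(\omega),\,\omega\bigr), \qquad \omega \in \Omega.$$

Joint measurability of $(d,\omega) \mapsto X(d)(\omega)$ is exactly what guarantees that each $Z_k$ is itself a random variable on $\Omega$.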

About the question itself: conditional independence usually refers to conditioning with respect to a single sigma-algebra $G$. For example, the random variables $U$ and $V$ are independent conditionally on $G$ iff $E(u(U)v(V)1_A)\,P(A)=E(u(U)1_A)\,E(v(V)1_A)$ for all bounded measurable functions $u$ and $v$ and every $A$ in $G$.
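To see what this identity says when $P(A)>0$, divide both sides by $P(A)^2$; since $E(u(U) \mid A) = E(u(U)1_A)/P(A)$, it becomes the usual product rule for expectations conditioned on the event $A$:

$$E\bigl(u(U)v(V) \mid A\bigr) = E\bigl(u(U) \mid A\bigr)\,E\bigl(v(V) \mid A\bigr).$$

The form with indicator functions has the small advantage of holding trivially when $P(A)=0$.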

The hypothesis in the paper you cite probably means that the random variables $X(y)$ are independent for different values of $y$ in the common image set of the random variables $Y_k$, where each $X(y)$ is the random variable $T_y$ defined by $T_y(\omega)=X(y)(\omega)$ for every $\omega$ in $\Omega$.
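Spelled out (again my reading, not a quotation from the paper): the claim would be that for any finitely many distinct values $y_1, \dots, y_m$ in that common image set and any bounded measurable functions $u_1, \dots, u_m$,

$$E\Bigl(\prod_{i=1}^m u_i\bigl(X(y_i)\bigr)\Bigr) = \prod_{i=1}^m E\Bigl(u_i\bigl(X(y_i)\bigr)\Bigr),$$

which is mutual independence of the family $\{X(y)\}_y$.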