
I'm trying to understand the solution to the following question:

Let X and Y be independent random variables, uniformly distributed on the interval $[0,1]$. Since $X \neq 0 $ almost surely, the random variable $Z=\frac{Y}{X}$ is well defined.

Compute $P(X < x | \sigma(Y)) $ and $ P(X < x | \sigma(Z)) $.

How do you calculate a conditional probability in the case where you are conditioning on a sigma algebra? How is the answer below obtained?

$$P(X < x | \sigma(Y)) = \min\{x,1\} $$ $$P(X < x | \sigma(Z)) = \min\{x^2,1\} I_{\{ Z \leq 1 \}} + \min\{x^2 Z^2,1\}I_{\{ Z > 1\}} $$ (for $x \geq 0$).

  • Ah yes – I've corrected it now. Thanks. – 2017-01-19

3 Answers


Conditioning on the sigma algebra $\mathcal{G}$, a conditional probability is defined to be a $\mathcal{G}$-measurable function satisfying \begin{align} E(P(A \mid \mathcal{G}) \mathbb{1}_{G}) = P(A \cap G) \end{align} for any $G \in \mathcal{G}$.

Probabilities conditioned on sigma algebras are a bit tricky in the sense that you don't directly calculate them. Rather, you guess a $\mathcal{G}$-measurable function and verify that it satisfies the above condition. For example:

Let $G \in \sigma(Y)$. Then \begin{align*} E(\min(x,1) \mathbb{1}_G) &= \min(x,1) \cdot E(\mathbb{1}_G) \\ &= (\mathbb{1}_{\{x \leq 1\}}x + \mathbb{1}_{\{x > 1\}}) \cdot P(G) \\ &= P(X < x) \cdot P(G) \\ &= P(\{X < x\} \cap G) \end{align*} where the first equality holds because $\min(x,1)$ is a constant (here $x \geq 0$), and the last follows by independence of $X$ and $Y$, since $G \in \sigma(Y)$. Thus $P(X < x \mid \sigma(Y)) = \min(x,1)$.
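To make the verification concrete, here is a small Monte Carlo sanity check of the defining property (a sketch; the generator $G = \{Y \leq 1/2\}$ of $\sigma(Y)$ and the threshold $x = 0.3$ are arbitrary choices):

```python
import random

# Check E( min(x,1) * 1_G ) = P( {X < x} ∩ G ) by simulation
random.seed(0)
n = 200_000
samples = [(random.random(), random.random()) for _ in range(n)]

x = 0.3                      # any fixed threshold in [0, 1]
candidate = min(x, 1.0)      # the proposed version of P(X < x | sigma(Y))

# G = {Y <= 1/2}, a typical element of sigma(Y)
lhs = sum(candidate for (X, Y) in samples if Y <= 0.5) / n   # E(min(x,1) * 1_G)
rhs = sum(1 for (X, Y) in samples if X < x and Y <= 0.5) / n # P({X < x} ∩ G)

print(round(lhs, 3), round(rhs, 3))   # both close to x * P(G) = 0.15
```

Both estimates agree up to Monte Carlo error, as the defining property requires.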

Similarly for the second conditional probability, you need only to show that \begin{align*} E\Big((\min(x^2,1)\mathbb{1}_{\{Z\leq 1\}}+\min(x^2 Z^2,1)\mathbb{1}_{\{Z > 1\}}) \mathbb{1}_G\Big) = P(\{X < x\} \cap G) \end{align*} for any $G \in \sigma(Z)$.
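A Monte Carlo sanity check of this identity (a sketch; $x = 0.5$ and $G = \{Z \leq 2\}$ are arbitrary choices — note that the check only balances with the candidate $\min\{x^2Z^2,1\}$ on $\{Z > 1\}$, not with $\min\{xZ^2,1\}$):

```python
import random

# Check E( candidate * 1_G ) = P( {X < x} ∩ G ) for G = {Z <= 2} in sigma(Z)
random.seed(1)
n = 200_000
x = 0.5
lhs = rhs = 0.0
for _ in range(n):
    X, Y = random.random(), random.random()
    Z = Y / X                       # well defined a.s. (X = 0 has probability 0)
    if Z <= 2.0:                    # G = {Z <= 2}
        # candidate for P(X < x | sigma(Z)); note the x**2 * Z**2 on {Z > 1}
        cand = min(x**2, 1.0) if Z <= 1.0 else min((x * Z) ** 2, 1.0)
        lhs += cand
        rhs += X < x
lhs /= n                            # E( candidate * 1_G )
rhs /= n                            # P( {X < x} ∩ G )
print(round(lhs, 3), round(rhs, 3)) # both close to 0.25
```

Here the exact common value is $P(X < \tfrac12,\, Y \leq 2X) = \int_0^{1/2} 2t\,dt = \tfrac14$.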

  • Correct me if I'm wrong, but the definition I found for the conditional probability is equivalent to this one, by the definition of conditional expectation? However, the conditional expectation definition I used does give a way of directly calculating this conditional probability, without knowing the solution in advance. – 2017-01-19

The first identity is direct: since $X$ and $Y$ are independent, $P(X < x \mid \sigma(Y)) = P(X < x) = \min\{x,1\}$ for every $x \geq 0$.

To show the second identity, since every distribution involved, conditional or not, has a PDF, a rather straightforward method is to compute the conditional PDF $f_{X\mid Z}$. This requires knowing the joint PDF $f_{X,Z}$ and the marginal PDF $f_Z$; then $$f_{X\mid Z}(x\mid z)=\frac{f_{X,Z}(x,z)}{f_Z(z)}$$ and, by definition, $P(X < x \mid Z) = \int_{-\infty}^{x} f_{X\mid Z}(t\mid Z)\,dt$.
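For completeness, carrying out this computation (a sketch; the change of variables is $(x,y)\mapsto(x,z)$ with $y=xz$, hence Jacobian $x$): $$f_{X,Z}(x,z)=f_{X,Y}(x,xz)\,x=x\quad\text{on }\{0<x<1,\ 0<xz<1\},$$ $$f_Z(z)=\int_0^1 x\,dx=\frac12\quad(0<z\leq1),\qquad f_Z(z)=\int_0^{1/z} x\,dx=\frac1{2z^2}\quad(z>1),$$ so $f_{X\mid Z}(x\mid z)=2x$ on $(0,1)$ when $z\leq1$, and $f_{X\mid Z}(x\mid z)=2z^2x$ on $(0,1/z)$ when $z>1$. Integrating gives $P(X < x \mid Z = z) = \min\{x^2,1\}$ for $z \leq 1$ and $\min\{x^2z^2,1\}$ for $z > 1$; substituting $Z$ for $z$ yields the conditional probability.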

  • (+1) Nice to see how such calculations are done. Why the downvote? – 2017-01-20

The probability of an event $B$ given a sigma algebra $\mathcal{F}$ is a random variable defined as $$P(B|\mathcal{F}) = E(I_B|\mathcal{F})$$ where $I_B$ is the indicator function.

In your case, we would have $$P(X < x | \sigma(Y)) = E(I_{\{X < x\}} | \sigma(Y)).$$

To get the conditional expectation, we use the following method (described in more detail here (A.2)):

We see how the expectation of $I_{\{X < x\}}$ behaves when we condition on the value $\{Y = y\}$: since $X$ and $Y$ are independent, $$E(I_{\{X < x\}} | Y = y) = P(X < x | Y = y) = P(X < x) = \min\{x,1\}.$$

Now we should just replace $y$ with $Y$ in the last expression to get $E(I_{\{X < x\}} | \sigma(Y)) = \min\{x,1\}$, which is the desired conditional probability.

The second problem is harder. Similarly, you need to calculate $P(X < x | Z = z)$. You can find the joint density of $(X,Z)$, as here, to get the conditional density $f_{X|Z=z}(x)$. Then $$P(X < x | Z = z) = \int_{-\infty}^{x} f_{X|Z=z}(t)\,dt = \min\{x^2,1\} I_{\{z \leq 1\}} + \min\{x^2 z^2,1\} I_{\{z > 1\}}.$$ Finally, replacing $z$ with $Z$ you get the wanted solution.
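As a numerical cross-check of the conditional-density route, one can condition on $Z$ falling in a narrow bin around a fixed value (a sketch; the test point $z_0 = 1.5 > 1$, threshold $x = 0.4$, and bin half-width are arbitrary choices):

```python
import random

# Estimate P(X < x | Z ≈ z0) by keeping only samples with Z in a narrow bin
random.seed(2)
n = 1_000_000
x, z0, eps = 0.4, 1.5, 0.01

hits = total = 0
for _ in range(n):
    X, Y = random.random(), random.random()
    if abs(Y / X - z0) < eps:    # condition on Z landing near z0
        total += 1
        hits += X < x
print(hits / total)              # ≈ min((x * z0)**2, 1) = 0.36
```

The empirical conditional frequency matches $\min\{x^2z_0^2,1\} = 0.36$ up to binning and sampling error.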

  • Could you go straight to $P(X \leq x | \sigma(Y)) = P(X \leq x)$ if $X$ and $Y$ are independent? Also, in the case of $Z$, how do you deal with the fact that it is expressed in terms of $X$ and $Y$? Could you give me another hint? – 2017-01-19
  • You can go straight to $P(X \leq x | \sigma(Y)) = P(X \leq x)$ (for $X,Y$ independent, $E(X|Y) = E(X)$), in the same manner as I did here. I'll get back to you regarding $Z$ a bit later when I have the time. – 2017-01-19
  • I'm terribly sorry, I now see that my solution was incorrect. The law of total expectation would give the expectation of the random variable we want (the conditional probability), not the variable itself. I have now corrected my answer and given a method to get the variable. – 2017-01-19