
Let $X$ and $Y$ be random variables and let $k$ be some fixed constant.

For a long time I thought that $P(X+Y=k)=P(X=k-Y)$, but recently I read something like $$P(X - Y = k) = E_{Y} \Big( P(X-Y = k) \Big) = E_{Y} \Big( P(X = k+Y) \Big) = \sum_{y=0}^{\infty} P(Y=y)\,P(X = k+y),$$ which makes me wonder whether it is common to treat a probability whose event itself involves a random variable, such as $P(X=k-Y)$, as a random variable in its own right.

I have no background in measure-theoretic probability theory, but my naïve explanation would be that $P(X=k-Y)=P(\{\omega:\omega\in X^{-1}(k-Y)\})\neq P(\{\omega:\omega\in {(X+Y)}^{-1}(k)\})=P(X+Y=k)$, where ${}^{-1}$ denotes the preimage of the argument with respect to the function.

In other words, in one case $P(\cdot)$ is used as a measure of the preimage of $k$ under $X+Y$; in the other case it is used as a function of $Y$ and therefore becomes a random variable itself.

The confusion then arises because the equals sign is used loosely in place of the set-theoretic way of writing the probability that a random variable takes some value.

Right?
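
For what it's worth, here is a minimal simulation sketch (with hypothetical independent Poisson variables, chosen only for illustration) suggesting that $\{X+Y=k\}$ and $\{X=k-Y\}$ describe the same event:

```python
import numpy as np

# Hypothetical example: independent Poisson X and Y, fixed k.
rng = np.random.default_rng(0)
n, k = 1_000_000, 3
X = rng.poisson(1.0, size=n)
Y = rng.poisson(2.0, size=n)

# Per sample, "X + Y == k" and "X == k - Y" test the same event,
# so the two empirical frequencies are identical.
print(np.mean(X + Y == k), np.mean(X == k - Y))
```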

  • @MichaelHardy Sorry, I rewrote $\{\omega : X(\omega)=k-Y\}$ without noticing that it becomes redundant.

2 Answers


Your understanding that $\Pr(X+Y=k)=\Pr(X=k-Y)$ is correct. Formally, $\Pr(X+Y=k)=\Pr(\{\omega: X(\omega) + Y(\omega) = k\}) = \Pr(X=k-Y).$

Possibly, what you read was the following (for independent $X$ and $Y$): $\Pr(X+Y=k)=\mathbb{E}_Y[\Pr_X(X + Y = k)].$ Here, $\Pr_X(X + Y = k)$ is a random variable (which is a function of $Y$), and $\mathbb{E}_Y[\Pr_X(X + Y = k)]$ is its expectation.
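
A minimal numeric sketch of this identity, assuming (hypothetically) independent Poisson variables so that the result can also be checked against the exact distribution of $X+Y$:

```python
from scipy.stats import poisson

# Hypothetical choices for illustration: X ~ Poisson(1), Y ~ Poisson(2), independent.
k, lam_x, lam_y = 3, 1.0, 2.0

# E_Y[ Pr_X(X + Y = k) ] = sum_y P(Y = y) * P(X = k - y)
outer = sum(poisson.pmf(y, lam_y) * poisson.pmf(k - y, lam_x) for y in range(k + 1))

# For independent Poissons, X + Y ~ Poisson(lam_x + lam_y), so compare directly.
print(outer, poisson.pmf(k, lam_x + lam_y))  # the two values agree
```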


I have a suspicion that what you read said, or ought to have said, something like this: \begin{align} \Pr(X-Y=k) = \mathbb E(\Pr(X-Y=k\mid Y)) & = \sum_y \Pr(X-Y=k\mid Y=y)\Pr(Y=y) \\[15pt] & = \sum_y \Pr(X-y=k\mid Y=y)\Pr(Y=y) \\[15pt] & = \sum_y \Pr(X=k+y\mid Y=y)\Pr(Y=y) = \cdots\cdots \end{align} Google the term "law of total expectation".
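
A minimal numeric sketch of the displayed sum, assuming (hypothetically) that $X$ and $Y$ are independent Poisson variables, so that $\Pr(X=k+y\mid Y=y)$ reduces to $\Pr(X=k+y)$:

```python
import numpy as np
from scipy.stats import poisson

# Hypothetical choices for illustration: X ~ Poisson(3), Y ~ Poisson(1), independent.
k, lam_x, lam_y = 2, 3.0, 1.0

# sum_y P(X = k + y) * P(Y = y), truncated where the Poisson tails are negligible
series = sum(poisson.pmf(k + y, lam_x) * poisson.pmf(y, lam_y) for y in range(100))

# Monte Carlo estimate of P(X - Y = k) for comparison
rng = np.random.default_rng(1)
X = rng.poisson(lam_x, size=1_000_000)
Y = rng.poisson(lam_y, size=1_000_000)
print(series, np.mean(X - Y == k))  # agreement up to simulation error
```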