
Say we're given a joint density of X and Y, f(x,y) = 2x for 0 < x < 1, x < y < x + 1, and 0 otherwise.

If we were asked either of these questions:

1. Find the conditional variance of Y given X = x

Or

2. Find the conditional expectation of Y given X = x

We could solve either problem by finding the conditional density function and then applying the definitions of conditional variance/expectation.

However, since the density depends on only one variable (x in this case), we can use the "shortcut" method and observe that, because f(x,y) has no factor involving y, Y must be uniform on (x, x+1), i.e. Y ~ Uniform(x, x+1). Then we can apply the standard expectation and variance formulas for a uniform random variable to answer questions 1 and 2.
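This claim can be checked empirically. Below is a sketch of a sanity check (not part of the original question; the fixed value x0 = 0.23, the band width, and the sample size are arbitrary choices): sample (X, Y) from the joint density by rejection sampling, then look at Y for X falling near x0 and compare its mean and variance with those of Uniform(x0, x0 + 1).

```python
import random

random.seed(1)

# Sample (X, Y) from the joint density f(x, y) = 2x on
# 0 < x < 1, x < y < x + 1, by rejection sampling.
x0, band = 0.23, 0.02
pairs = []
for _ in range(500_000):
    x = random.random()              # propose x uniformly on (0, 1)
    y = random.uniform(x, x + 1)     # propose y uniformly on (x, x + 1)
    if random.random() < x:          # accept with prob f(x, y) / 2 = x
        pairs.append((x, y))

# "Condition" on X = x0 by keeping samples with |X - x0| < band.
ys = [y for (x, y) in pairs if abs(x - x0) < band]
mean = sum(ys) / len(ys)
var = sum((y - mean) ** 2 for y in ys) / len(ys)

print(mean)  # should be close to x0 + 1/2 = 0.73
print(var)   # should be close to 1/12 ≈ 0.0833
```

The empirical mean and variance come out near x0 + 1/2 and 1/12, which are exactly the moments of Uniform(x0, x0 + 1).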

My question:

I just don't really understand this logic. I can see that if we're given a joint density function that depends on only one variable, then we can use the shortcut approach stated above, but I don't really know why it works, nor why "since f(x,y) has no factor of y, Y must be uniform on (x, x+1)".

Can anyone give me an explanation of why we can use the shortcut, i.e. why "Y is Uniform(x, x+1)"?

Really appreciate your time and feedback, thanks in advance.

2 Answers


Let $x\in(0,1)$ be a fixed number.


Concerning the condition $X=x$ for the random variable $Y$, we can define a conditional density by:$$f_Y(y\mid X=x)=\frac{f_{X,Y}\left(x,y\right)}{\int f_{X,Y}\left(x,y\right)\,dy}$$

For $y\in(x,x+1)$ we get:$$f_Y(y\mid X=x)=\frac{2x}{\int_x^{x+1} 2xdy}=\frac{2x}{2x}=1$$

For $y\notin(x,x+1)$ we get:$$f_Y(y\mid X=x)=\frac{0}{\int_x^{x+1} 2xdy}=\frac{0}{2x}=0$$

So under condition $X=x$ you are dealing with a uniform distribution on interval $(x,x+1)$.

Equipped with this knowledge you can now determine $\mathbb E(Y\mid X=x)$ and $\text{Var}(Y\mid X=x)$.
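For completeness, using the well-known moments of a $\text{Uniform}(a,b)$ distribution, $\mathbb E=\frac{a+b}{2}$ and $\text{Var}=\frac{(b-a)^2}{12}$, with $a=x$ and $b=x+1$:

$$\mathbb E(Y\mid X=x)=\frac{x+(x+1)}{2}=x+\frac12,\qquad \text{Var}(Y\mid X=x)=\frac{\big((x+1)-x\big)^2}{12}=\frac{1}{12}.$$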

  • I appreciate your feedback, thanks a lot for your time. – 2017-02-10

It is worthwhile to conceptualize the relationship between the random variables $X$ and $Y$ as follows: for each outcome $x$ of $X$, there is an associated random variable $Y_x$ which we define to be $$Y_x \equiv (Y \mid X = x);$$ so for example if we had observed $X = 0.23$, there is an associated random variable $Y_{0.23}$ that describes the distribution of $Y$ once we have observed this value of $X$, and this distribution is uniform on $(0.23, 1.23)$.

But why is $Y_x$ uniformly distributed at all? It is perhaps not immediately obvious from looking at the joint density $f_{X,Y}(x,y)$. The reason has to do with the nature of the conditional density $f_{Y\mid X}(y)$, or equivalently in our "associated" notation, $f_{Y_x}(y)$. Loosely speaking, once we specify $X = x$ (you can choose to think of $X$ as taking on a numeric value such as we did above with $0.23$), the conditional distribution is given by $$f_{Y \mid X}(y) = \frac{f_{X,Y}(x,y)}{f_X(x)};$$ that is to say, it is proportional to the joint distribution with respect to the variable $y$.

But once we have specified $X = x \in (0,1)$, the joint distribution is simply the step function $$f_{X,Y}(x,y) = \begin{cases} 2x, & x < y < x+1 \\ 0, & \text{otherwise}, \end{cases}$$ and the conditional distribution is $$f_{Y\mid X}(y) = \begin{cases} 1, & x < y < x+1 \\ 0, & \text{otherwise}. \end{cases}$$ This is precisely the uniform distribution on $(x,x+1)$. Each outcome $X = x$ admits a distinct conditional uniform distribution $Y_x$.

The unconditional (or marginal) distribution of $Y$ is the distribution that we would observe if we chose random values of $X$ according to its own distribution, then observed $Y_x$ for each outcome. It is, in a sense, a weighted average of each $Y_x$ where the weight is proportional to the chance of observing $X = x$.
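This two-stage picture can be sketched numerically (an illustrative simulation, not part of the original answer; the sample size is arbitrary): draw $X$ from its marginal density $f_X(x) = 2x$ via inverse-CDF sampling ($F_X(x) = x^2$, so $X = \sqrt{U}$), then draw $Y_x \sim \text{Uniform}(x, x+1)$ for each outcome. By the tower rule, $\mathbb E(Y) = \mathbb E(X) + \tfrac12 = \tfrac23 + \tfrac12 = \tfrac76$.

```python
import random

random.seed(0)

# Two-stage sampling of the marginal distribution of Y (illustrative):
# 1) draw X from its marginal density f_X(x) = 2x on (0, 1) using
#    inverse-CDF sampling: F_X(x) = x^2, so X = sqrt(U), U ~ Uniform(0,1);
# 2) draw Y_x ~ Uniform(x, x + 1) for each outcome x.
n = 200_000
ys = []
for _ in range(n):
    x = random.random() ** 0.5           # X with density 2x
    ys.append(random.uniform(x, x + 1))  # Y | X = x

mean_y = sum(ys) / n
# Tower rule: E(Y) = E(E(Y | X)) = E(X + 1/2) = 2/3 + 1/2 = 7/6 ≈ 1.1667
print(mean_y)
```

The histogram of these draws is the "weighted average" of the conditional uniforms described above.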

Of course, this is not the only way to conceptualize the problem. You aren't required to think of realizations of $Y$ as being generated as a consequence of realizations of $X$; indeed, the existence of the marginal distribution of $Y$ shows that you don't even need to observe $X$ to get a sense of how $Y$ will be distributed.

  • I appreciate this explanation. Thanks a lot. – 2017-02-10