4

I am using the fact that $X$ and $Y$, here with joint density $f(x,y)=x+y$ on the unit square, are independent if and only if $f_X(x)f_Y(y)=f(x,y)$. So I have

$f_{X}(x)=\int_{0}^{1}f(x,y)\,dy\\=\int_{0}^{1}(x+y)\,dy\\=\left[xy+\frac{y^{2}}{2}\right]_{y=0}^{y=1}\\=x+\frac{1}{2}$

and by basically the same math, $f_Y(y)=y+\frac{1}{2}$. Then $f_X(x)f_Y(y)=(x+\frac{1}{2})(y+\frac{1}{2})=xy+\frac{1}{2}(x+y)+\frac{1}{4}\ne f(x,y)$.
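The marginals and the product check above can be verified symbolically; here is a quick sketch using Python's sympy (an assumption on my part — any CAS would do):

```python
import sympy as sp

x, y = sp.symbols("x y")
joint = x + y  # joint density on the unit square

# Marginals: integrate the joint density over the other variable.
f_X = sp.integrate(joint, (y, 0, 1))  # x + 1/2
f_Y = sp.integrate(joint, (x, 0, 1))  # y + 1/2

product = sp.expand(f_X * f_Y)        # x*y + x/2 + y/2 + 1/4
difference = sp.simplify(f_X * f_Y - joint)
print(product, "|", difference)       # the difference is not identically zero
```

Since `difference` does not simplify to zero, the product of the marginals is not the joint density, confirming the hand calculation.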

And hence they are not independent. But can that be right? Why would the value of $X$ have anything to do with the value of $Y$? It's not like one is a function of the other.

Or have I made a simple mistake? I've looked through it a couple of times and I'm pretty sure my math is right...

  • 0
    Actually you can see without doing any calculations that $f(x,y) = x+y$ does not factor as the product of a function of $x$ and a function of $y$. Well, if that's not obvious to you, note that if $f(x,y) = g(x)h(y)$, then $f(x,y)/f(z,y) = g(x)/g(z)$ would not depend on $y$. But $f(x,y)/f(z,y) = (x+y)/(z+y)$ does depend on $y$. (2012-08-12)

4 Answers

2

Note that the conditional density $f(x\mid y)$ depends on $y$: $f(x\mid y)=f(x,y)/f_Y(y)$.

You showed that $f_Y(y)=y+\frac{1}{2}$ and $f(x,y)=x+y$, so

$f(x\mid y)=\dfrac{x+y}{y+\frac{1}{2}}$

for any $0\le x\le 1$ and $0\le y\le 1$.

This clearly depends on $y$. So knowing $y$ does affect the probability that $X$ lies in a fixed interval about $x$.

  • 0
    Clear, but what is the difference between a product and a summation then? Normally a product can be identified in terms of summations. (2012-08-12)
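The answer's point can be illustrated numerically in plain Python (the helper name below is my own, for illustration): evaluate $f(x\mid y)=(x+y)/(y+\frac{1}{2})$ at a fixed $x$ for two different values of $y$ and see that they differ.

```python
def cond_density(x, y):
    """Conditional density f(x | y) = (x + y) / (y + 1/2) on the unit square."""
    return (x + y) / (y + 0.5)

# Fix x and vary y: if X and Y were independent, these would be equal.
a = cond_density(0.2, 0.1)  # ~ 0.3 / 0.6 = 0.5
b = cond_density(0.2, 0.9)  # ~ 1.1 / 1.4 ≈ 0.786
print(a, b)
```

(Note that $x=\frac{1}{2}$ is an unrepresentative choice here, since $f(\frac{1}{2}\mid y)=1$ for every $y$.)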
1

No mistake here. These variables are not independent.

1

X and Y are dependent, as you have shown. Remember that we are talking about statistical independence. A necessary and sufficient condition for statistical independence is that the joint cumulative distribution function factors as $F_{X,Y}(x,y) = F_X(x) F_Y(y)$. If the joint PDF exists, then an equivalent condition is the one you have stated.

In this problem, we can still pick values of $X$ and $Y$ independently to evaluate the joint PDF or any other function of these two random variables. But $X$ and $Y$ are still statistically dependent by definition.

As a general rule: statistical dependence $\not\implies$ functional dependence, and functional dependence $\not\implies$ statistical dependence. A famous example of the latter is the sample mean and sample variance of $N$ independent, identically distributed (IID) Gaussian random variables. The sample variance is functionally dependent on the sample mean, but the two are statistically independent.
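The Gaussian fact mentioned above can be checked empirically; here is a Monte Carlo sketch in numpy (sample sizes and seed are arbitrary choices of mine):

```python
import numpy as np

rng = np.random.default_rng(0)
# Draw 100,000 samples, each of N = 10 IID standard Gaussians.
samples = rng.standard_normal((100_000, 10))

means = samples.mean(axis=1)
variances = samples.var(axis=1, ddof=1)  # sample variance is computed FROM the sample mean

# Despite that functional dependence, the empirical correlation is close to 0,
# consistent with the statistical independence of sample mean and variance.
corr = np.corrcoef(means, variances)[0, 1]
print(corr)
```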
-1

You're right. You can also check $E[XY]=E[X]E[Y]$ for the independence condition. This may be clearer because expected values are constants. Also, independence is interpreted as: even if you know the outcome $X=x$, you cannot use that to guess $Y$, i.e. $f(y\mid x)=f(y)$. For $f(x,y)=x+y$, if you know $X=x$, you have a better idea of what $Y$ will be. For example, if $X=0.1$ or $X=0.5$, the probabilities of $Y$ near $0.5$ under these two conditions are different.

  • 0
    Okay, you are right. It is well known that zero covariance doesn't imply independence, and $E[XY]=E[X]E[Y]$ essentially comes from zero covariance because $\operatorname{Cov}[X,Y]=E[XY]-E[X]E[Y]$. I leave my wrong answer as an example of a common mistake. (2012-08-12)
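For the record, for this particular density the moment check does happen to detect the dependence (even though, as the comment notes, a zero covariance would not have proved independence). A sympy sketch, assuming sympy is available:

```python
import sympy as sp

x, y = sp.symbols("x y")
joint = x + y  # joint density on the unit square

def E(expr):
    """Expectation of expr under the joint density on [0, 1]^2."""
    return sp.integrate(sp.integrate(expr * joint, (x, 0, 1)), (y, 0, 1))

exy = E(x * y)       # 1/3
ex, ey = E(x), E(y)  # 7/12 each
cov = exy - ex * ey  # -1/144, nonzero, so X and Y are dependent
print(exy, ex * ey, cov)
```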