
I am not sure what the expression for E[XY] looks like when X and Y are random variables on a finite probability space. That's all I need help with. Thanks!

  • 1
    This gives a definition when you have the density function of X and Y: http://en.wikipedia.org/wiki/Expected_value (2011-09-29)
  • 0
    I already looked at that. I am still a bit uncertain. Given {w_n} in the probability space, I define E[X] = \sum X(w_n) p_n, but does E[XY] have p_n^2? (2011-09-29)
  • 0
    Perhaps you should look at the relevant [section](http://en.wikipedia.org/wiki/Expected_value#Non-multiplicativity) and replace the integration with a summation. (2011-09-29)
  • 0
    Can you please stop being so hostile? The density function is not making sense to me. I already looked at the section... (2011-09-29)
  • 1
    Let $W=XY$. Make a list of all the possible values taken on by $W$. With any luck you can find, from the information about $X$ and $Y$, the probability $P(W=w)$ for all possible values of $w$. After that, expectation is the standard one-variable stuff. If you need further detail, please supply the specific question you have in mind. When $X$ and $Y$ are **independent**, life becomes simpler, since then $E(XY)=E(X)E(Y)$. (2011-09-29)
  • 0
    So given that we take W(w_1), that's just X(w_1)Y(w_1)... but can it ever have X(w_1)Y(w_2)? (2011-09-29)
  • 0
    Give me an explicit form in the case of a finite probability space... (2011-09-29)
  • 0
    I don't know what you mean by explicit form. Maybe $\sum_i\sum_j x_i y_j P((X=x_i)\land (Y=y_j))$. But formulas like this don't make much sense until we have dealt with a number of more concrete cases. Do you have an explicit numerical question in mind? (Sorry about the jargon; $\land$ means "and".) (2011-09-29)
  • 0
    I seriously wasn't trying to be hostile - I should've said 'could' instead of 'should' - and I really thought you hadn't seen the section. I apologize. Do you know what a [joint probability mass function](http://en.wikipedia.org/wiki/Joint_probability_distribution#Discrete_case) is? For a concrete example, let $X$ be the outcome of a coin flip ($0$ or $1$ with probability $1/2$ for each outcome) and define $Y=1-X$. Then $$\mathbb{E}(XY)=1\cdot1\cdot(0)+1\cdot0\cdot(1/2)+0\cdot1\cdot(1/2)+0\cdot0\cdot(0)=0$$ because $$P(X=1,Y=1)=0,\quad P(X=1,Y=0)=1/2,$$ $$P(X=0,Y=1)=1/2,\quad P(X=0,Y=0)=0.$$ (2011-09-29)
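To make the thread concrete, here is a minimal Python sketch of $E[XY]$ on a finite probability space (the outcome labels and dict encoding are my own illustration, not from the discussion): each outcome $w_n$ contributes $X(w_n)Y(w_n)p_n$, with a single factor of $p_n$ and never $p_n^2$, because both random variables are evaluated at the *same* outcome. It reproduces the coin-flip example in the comment above, where $Y=1-X$ and $E[XY]=0$.

```python
# E[XY] on a finite probability space: sum over outcomes w of X(w) * Y(w) * p(w).
# Hypothetical encoding: outcomes are labels; X and Y are dicts mapping outcome -> value.

outcomes = ["heads", "tails"]          # the finite sample space {w_1, w_2}
p = {"heads": 0.5, "tails": 0.5}       # probability p_n of each outcome
X = {"heads": 1, "tails": 0}           # X = outcome of the coin flip
Y = {w: 1 - X[w] for w in outcomes}    # Y = 1 - X, as in the comment above

def expectation(f, p):
    """E[f] = sum_n f(w_n) * p_n over the finite sample space."""
    return sum(f[w] * p[w] for w in p)

# The product random variable W(w) = X(w) * Y(w) uses the SAME outcome w for
# both factors, so each term is weighted by p(w), not p(w)^2.
W = {w: X[w] * Y[w] for w in outcomes}

print(expectation(X, p))  # 0.5
print(expectation(Y, p))  # 0.5
print(expectation(W, p))  # 0.0, matching E[XY] = 0 from the comments
```

Note that $E[XY]=0$ while $E[X]E[Y]=0.25$ here: the factorization fails because $X$ and $Y$ are not independent.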

1 Answer


It is not quite clear what you are expecting, but it might be something like

$$E[XY]=\sum_x \sum_y xy \Pr(X=x,Y=y) $$ $$= \sum_x \sum_y xy \Pr(X=x\mid Y=y)\Pr(Y=y) $$ $$= \sum_x \sum_y xy \Pr(X=x)\Pr(Y=y\mid X=x)$$

If they are independent, then $\Pr(X=x\mid Y=y)=\Pr(X=x)$ and $\Pr(Y=y\mid X=x)=\Pr(Y=y)$, so this becomes $$E[XY]= \sum_x x \Pr(X=x) \sum_y y \Pr(Y=y) =E[X]E[Y]$$
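As a quick sanity check of the double-sum formula and the independence factorization, here is a short sketch (assuming two independent fair six-sided dice; the setup is my own illustration, not part of the original answer) that evaluates $\sum_x\sum_y xy\Pr(X=x,Y=y)$ directly and compares it with $E[X]E[Y]$:

```python
from fractions import Fraction

# Illustrative assumption: two independent fair six-sided dice.
values = range(1, 7)
pX = {x: Fraction(1, 6) for x in values}
pY = {y: Fraction(1, 6) for y in values}

# Joint pmf under independence: Pr(X=x, Y=y) = Pr(X=x) * Pr(Y=y).
joint = {(x, y): pX[x] * pY[y] for x in values for y in values}

# E[XY] via the double sum from the answer above.
E_XY = sum(x * y * joint[(x, y)] for (x, y) in joint)

# E[X] and E[Y] computed separately.
E_X = sum(x * pX[x] for x in values)
E_Y = sum(y * pY[y] for y in values)

print(E_XY)       # 49/4
print(E_X * E_Y)  # 49/4 -- E[XY] = E[X]E[Y] holds under independence
```

For dependent variables the same double sum still computes $E[XY]$; only the final factorization into $E[X]E[Y]$ is lost, as the coin-flip example in the comments shows.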