
When we define joint continuity, we say: if there exists a function $f(x,y)$ such that, for all sets $A$ and $B$,

$$ P\{X\in A,\ Y\in B\}=\int_B\int_A f(x,y)\,dx\,dy, $$

then $X,Y$ are called jointly continuous.

So not every pair of random variables $X$ and $Y$ is jointly continuous, because such an $f(x,y)$ may not exist, right?
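
For instance, if I understand correctly, taking $X$ uniform on $[0,1]$ and $Y=X$ gives a pair that is not jointly continuous: the joint distribution sits entirely on the diagonal $D=\{(x,x):0\le x\le 1\}$, which has zero area in the plane, whereas any candidate density $f$ would give

$$ P\{(X,Y)\in D\}=\iint_D f(x,y)\,dx\,dy=0, $$

contradicting $P\{(X,Y)\in D\}=1$.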

Then in Ross's book *Introduction to Probability Models*, there is something I'm confused about:

He proved the formula

$$ E[X+Y]=E[X]+E[Y] $$

for jointly continuous random variables, but then uses it for arbitrary pairs of random variables.

When $X,Y$ are jointly continuous, it is easy to prove, since $$ E[g(X,Y)]=\int_{\mathbb{R}}\int_{\mathbb{R}}g(x,y)f(x,y)\,dx\,dy. $$

Then let $g(X,Y)=X+Y$.
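
Spelling out that computation as I understand it, with the marginal densities $f_X(x)=\int_{\mathbb{R}}f(x,y)\,dy$ and $f_Y(y)=\int_{\mathbb{R}}f(x,y)\,dx$:

$$ E[X+Y]=\int_{\mathbb{R}}\int_{\mathbb{R}}(x+y)f(x,y)\,dx\,dy=\int_{\mathbb{R}}x\,f_X(x)\,dx+\int_{\mathbb{R}}y\,f_Y(y)\,dy=E[X]+E[Y]. $$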

To be clear, my questions are:

(1) Not every pair of random variables $X$ and $Y$ is jointly continuous, right?

(2) The formula $E[X+Y]=E[X]+E[Y]$ is valid for any pair $X,Y$ (whenever the expectations exist), even if they are not independent, right?

(3) Ross proves $E[X+Y]=E[X]+E[Y]$ in his book under the assumption that $X$ and $Y$ are jointly continuous, which is quite easy. How does one prove the general case?

  • I would cancel the *mistake* sentence, if I were you, or at least rephrase it. (2012-11-01)
  • @did, sorry, I don't know what you mean. Do you mean I shouldn't use the word *mistake* to judge Ross's argument? (2012-11-01)
  • Yes. (2012-11-01)
  • Yeah, maybe. (2012-11-01)

1 Answer