
When we define joint continuity, we are saying: if there exists a function $f(x,y)$ such that for all (measurable) sets $A$ and $B$,

$ P\{X\in A, Y\in B\}=\int_B\int_A f(x,y)\,dx\,dy, $

then $X,Y$ are called jointly continuous.

So not every pair of random variables $X$ and $Y$ is jointly continuous, because such an $f(x,y)$ may not exist, right?

Then in Ross's book Introduction to Probability Models, there is something that confuses me:

He proved the formula

$ E[X+Y]=E[X]+E[Y] $

for jointly continuous random variables, but then uses it for an arbitrary pair of random variables.

When $X,Y$ are jointly continuous, it is easy to prove, since $ E[g(X,Y)]=\int_{\mathbb{R}}\int_{\mathbb{R}}g(x,y)f(x,y)\,dx\,dy. $

Then let $g(X,Y)=X+Y$.
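For a concrete sanity check of this step, here is a short Monte Carlo sketch for a jointly continuous but dependent pair; the bivariate normal with correlation $\rho=0.8$ is my own illustrative choice, not an example from Ross's book.

```python
import random

# Monte Carlo check of E[X+Y] = E[X] + E[Y] for a jointly continuous,
# *dependent* pair: (X, Y) bivariate normal with correlation rho = 0.8.
random.seed(0)
n = 200_000
rho = 0.8
xs, ys = [], []
for _ in range(n):
    z1 = random.gauss(0.0, 1.0)
    z2 = random.gauss(0.0, 1.0)
    xs.append(1.0 + z1)                                   # X ~ N(1, 1)
    ys.append(2.0 + rho * z1 + (1 - rho**2) ** 0.5 * z2)  # Y ~ N(2, 1)

ex = sum(xs) / n                                # estimates E[X]   ~ 1
ey = sum(ys) / n                                # estimates E[Y]   ~ 2
e_sum = sum(x + y for x, y in zip(xs, ys)) / n  # estimates E[X+Y] ~ 3

print(ex, ey, e_sum)
```

Note that no independence is used anywhere: the estimate of $E[X+Y]$ matches $E[X]+E[Y]$ despite the correlation.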

To be clear, my question is :

(1) Not every pair of random variables $X$ and $Y$ is jointly continuous, right?

(2) The formula $E[X+Y]=E[X]+E[Y]$ is valid for any pair $X, Y$, even if they are not independent, right?

(3) Ross in his book proved $E[X+Y]=E[X]+E[Y]$ under the assumption that $X$ and $Y$ are jointly continuous, which is quite easy. How can one prove the general case?

  • yeah, maybe. 2012-11-01

1 Answer


Is there a pair $X$ and $Y$ such that both are continuous, but not jointly continuous? That is what I want to ask in question (1).

Sure: take $X=Y$, where $X$ is any random variable whose distribution has a density. Then, $\mathbb P((X,Y)\in D)=1$, where $D=\{(x,x)\mid x\in\mathbb R\}$ has Lebesgue measure zero, hence the distribution of $(X,Y)$ has no density.
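A tiny simulation (my own illustration, not part of the original answer) makes this concrete: every sampled value of the pair lands on the diagonal, so the joint distribution puts all its mass on a set of planar Lebesgue measure zero.

```python
import random

# X uniform on [0,1] (so X has a density) and Y = X: each marginal is
# continuous, yet every sample of (X, Y) lies on the diagonal
# D = {(x, x)}, which has planar Lebesgue measure zero -- so the pair
# (X, Y) cannot have a joint density.
random.seed(0)
samples = [(x, x) for x in (random.random() for _ in range(10_000))]

on_diagonal = all(x == y for x, y in samples)
print(on_diagonal)  # True: all mass sits on D, matching P((X,Y) in D) = 1
```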

The formula $E[X+Y]=E[X]+E[Y]$ is valid for any pair $X, Y$ even if they are not independent, right?

Right.

How to prove [it in the] general case?

One may first prove that for every random variable $Z$ and every measurable function $u$ such that $u(Z)$ is integrable,

$ \mathbb E(u(Z))=\int u(z)\,\mathrm d\mathbb P_{Z}(z). $

Assume this key relation is known and apply it to $Z=(X,Y)$ and $u:(x,y)\mapsto x+y$. This yields

$ \mathbb E(X+Y)=\int (x+y)\,\mathrm d\mathbb P_{(X,Y)}(x,y)=(*). $

By linearity of the integral with respect to the measure $\mathbb P_{(X,Y)}$,

$ (*)=\int x\,\mathrm d\mathbb P_{(X,Y)}(x,y)+\int y\,\mathrm d\mathbb P_{(X,Y)}(x,y). $

Apply the key relation backwards to the functions $v:(x,y)\mapsto x$ and $w:(x,y)\mapsto y$. This yields

$ (*)=\mathbb E(v(X,Y))+\mathbb E(w(X,Y))=\mathbb E(X)+\mathbb E(Y). $
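As a numerical sanity check that no joint density is needed, here is a Monte Carlo sketch (my own illustrative choice of distributions, not part of the argument above) with $X$ standard normal and $Y=X^2$, so that $(X,Y)$ is carried by the parabola $\{(x,x^2)\}$, a Lebesgue-null set in the plane.

```python
import random

# X ~ N(0,1) and Y = X^2: dependent, and (X, Y) concentrates on the
# parabola {(x, x^2)}, which has planar Lebesgue measure zero, so the
# pair has no joint density. Linearity of expectation still holds.
random.seed(1)
n = 200_000
xs = [random.gauss(0.0, 1.0) for _ in range(n)]
ys = [x * x for x in xs]

ex = sum(xs) / n                                # estimates E[X]   ~ 0
ey = sum(ys) / n                                # estimates E[X^2] ~ 1
e_sum = sum(x + y for x, y in zip(xs, ys)) / n  # estimates E[X+Y] ~ 1

print(abs(e_sum - (ex + ey)))
```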

  • Then in this case, how does one prove $E[X+Y]=E[X]+E[Y]$? There is no joint density function $f(x,y)$. 2012-11-02