
Is there any special technique to deal with the distribution of the sum of two random variables when they are not independent?

For example, I have concluded that if $X =_p W$ and $Y =_p Z$ (where $=_p$ means having the same distribution), then the following two integrals must be equal:

$$\int_{t}P(\{X+Y< t\})\,\mathrm dt=\int_{t}P(\{W+Z< t\})\,\mathrm dt.$$

But I don't know how to prove it rigorously. It seems to be true by intuition!
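
As a quick numerical sanity check of this guess (a sketch only; the Uniform(0,1) marginals and the coupling $W=Z$ below are arbitrary choices made for illustration, not part of the original setup):

```python
import numpy as np

rng = np.random.default_rng(0)
n = 10**6

# Every variable has the same marginal distribution: Uniform(0, 1).
# X, Y are independent, while W, Z are perfectly dependent (W = Z = U).
x = rng.uniform(size=n)
y = rng.uniform(size=n)
u = rng.uniform(size=n)
w, z = u, u

sum_indep = x + y   # triangular distribution on [0, 2]
sum_dep = w + z     # same as 2 * Uniform(0, 1), a different distribution

# Pointwise, the CDFs of the two sums differ, e.g. at t = 0.5:
# P(X + Y < 0.5) = 1/8 while P(W + Z < 0.5) = 1/4.
print("P(X+Y < 0.5) ~", (sum_indep < 0.5).mean())
print("P(W+Z < 0.5) ~", (sum_dep < 0.5).mean())

# Integrated over t, the difference of the two CDFs equals E[W+Z] - E[X+Y],
# which is 0 because the marginals coincide; the empirical means agree.
print("E[X+Y] ~", sum_indep.mean())
print("E[W+Z] ~", sum_dep.mean())
```

The two printed probabilities differ, so the CDFs of the sums need not agree pointwise, but the two means coincide, which is what the integrated equality above predicts.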

  • What do you mean by $=_p$? (2012-11-02)
  • It means having the same distribution. (2012-11-03)
  • It would be interesting to know how you *have concluded that .../... these two probabilities must be equal*, since they are not always equal. (2012-11-03)
  • Actually my conclusion was $\int_{t}P(\{X+Y< t\})\,\mathrm dt=\int_{t}P(\{W+Z< t\})\,\mathrm dt$, and I thought that maybe the integrands are equal (as a stronger guess!) (2012-11-03)
  • Briefly put, the pointwise version fails (as demonstrated by @martini) but the integrated version holds (see my answer, and the sketch after this list). (2012-11-03)
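
To make the last comment concrete, here is a sketch of why the integrated version holds (assuming all four variables are integrable; this need not be the argument used in the answers below). For any integrable random variables $S$ and $T$, Fubini's theorem gives
$$\int_{\mathbb R}\bigl(P(S<t)-P(T<t)\bigr)\,\mathrm dt \;=\; E\left[\int_{\mathbb R}\bigl(\mathbf 1_{\{S<t\}}-\mathbf 1_{\{T<t\}}\bigr)\,\mathrm dt\right]\;=\;E[T-S].$$
Taking $S=X+Y$ and $T=W+Z$, linearity of expectation (which requires no independence) gives $E[T-S]=(E[W]-E[X])+(E[Z]-E[Y])=0$, so the difference of the two distribution functions integrates to $0$ even though they can differ pointwise.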

2 Answers