
Given two continuous random variables $X$ and $Y$, suppose we know their probability distributions $p_X$, $p_Y$ and $\operatorname{cov}(X,Y)$. (Note that we do not assume independence.) Can we then calculate $p_Z(z)$, where $Z = XY$? If not, what more do we need? Can we calculate $p_Z$ if we have the joint distribution $p_{X,Y}(x,y)$ of $X$ and $Y$?

  • Are the random variables discrete or continuous? (2017-02-27)
  • Oh, they are continuous. (2017-02-27)
  • Covariance: not enough. Joint distribution: suffices; the standard approach works (for example, using a change of variables to compute the joint distribution of $(Z,Y)$, then computing the first marginal). (2017-02-27)
  • @Did Thanks. Though I'm quite new to this and don't have the right reference. Where might I look for an explicit example? (2017-02-27)
  • In your textbook, perhaps? (2017-02-27)
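
The change-of-variables route suggested in the comments can be written out explicitly (a standard derivation, added here for reference): given the joint density $p_{X,Y}$ and $Z = XY$,

$$ p_Z(z) = \int_{-\infty}^{\infty} p_{X,Y}\!\left(\frac{z}{y},\, y\right)\frac{1}{|y|}\, dy. $$
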

1 Answer


You can calculate $E[XY]$ from $\operatorname{cov}(X,Y)$ and the individual means $E[X]$ and $E[Y]$ alone:

$$ E[XY] = \operatorname{cov}(X,Y) + E[X]\,E[Y]. $$

Since you can easily get $E[X]$ from $p_X$ (similarly for $Y$), the information you propose is enough to determine $E[XY]$.
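This identity is easy to check numerically. The sketch below uses a hypothetical pair of correlated normals (the means and covariance matrix are illustrative choices, not from the thread) and compares the sample mean of $XY$ against $\operatorname{cov}(X,Y) + E[X]E[Y] = 0.5 + 1\cdot 2 = 2.5$:

```python
import numpy as np

# Illustrative pair of correlated normals (assumed parameters):
# E[X] = 1, E[Y] = 2, cov(X, Y) = 0.5.
rng = np.random.default_rng(0)
mean = [1.0, 2.0]
cov = [[1.0, 0.5],
       [0.5, 2.0]]
x, y = rng.multivariate_normal(mean, cov, size=1_000_000).T

# Sample estimate of E[XY] versus the identity cov(X,Y) + E[X]E[Y].
lhs = np.mean(x * y)
rhs = 0.5 + 1.0 * 2.0   # theoretical cov(X,Y) + E[X]E[Y] = 2.5
print(lhs, rhs)         # close, up to Monte Carlo error
```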

But it is insufficient, in general, to determine $p_Z(z)$. Somewhat surprisingly, if you restrict $p_{X,Y}(x,y)$ to be a second-degree polynomial on the unit square, and zero outside it, then the marginal distributions and the covariance together determine a unique joint density of that form. But without that restriction, you can find joint distributions that agree in their marginals and covariance yet are not identical.
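A concrete counterexample can be simulated (the construction below is my own illustrative choice, not part of the original answer): take $X, Y$ uniform on $[-1,1]$, once independent and once coupled via $Y = SX$ with $S = \pm 1$ a fair sign flip independent of $X$. Both couplings have the same uniform marginals and $\operatorname{cov}(X,Y)=0$, yet $Z = XY$ is distributed very differently:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 1_000_000

# Coupling A: X and Y independent, each uniform on [-1, 1].
xa = rng.uniform(-1, 1, n)
ya = rng.uniform(-1, 1, n)

# Coupling B: Y = S*X with S = +/-1 independent of X.
# Y is still uniform on [-1, 1], and cov(X, Y) = E[S] E[X^2] = 0.
xb = rng.uniform(-1, 1, n)
yb = xb * rng.choice([-1.0, 1.0], n)

za = xa * ya   # product of independent uniforms
zb = xb * yb   # equals S * X^2

# Same marginals and covariance, but the tails of Z differ markedly:
pa = np.mean(np.abs(za) > 0.9)   # small: both factors must be near +/-1
pb = np.mean(np.abs(zb) > 0.9)   # = P(X^2 > 0.9) = 1 - sqrt(0.9) ≈ 0.0513
print(pa, pb)
```

In coupling B, $|Z| = X^2$ exceeds $0.9$ with probability $1-\sqrt{0.9}\approx 0.051$, roughly ten times more often than the independent product does, so the two laws of $Z$ cannot coincide.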

  • Not so surprisingly: enumerate the free parameters and compare to the number of constraints. (2017-02-27)