
Let $X,Y$ be two random variables which are not necessarily independent. It is easy to get $\mathbb{E}(X)$ and $\mathbb{E}(Y)$. I want to know: is there some approximation to $\mathbb{E}\left(\frac{X}{Y}\right)$?

[Update] The background is that I want to calculate the expectation of the Pearson product-moment correlation coefficient, $\mathbb{E}(\rho_{xy})$.

The Pearson product-moment correlation coefficient is $\rho_{xy} = \frac{Cov_{xy}}{\sigma_x\sigma_y}$.

It is easy to get $\mathbb{E}(Cov_{xy})$ and $\mathbb{E}(\sigma_x\sigma_y)$, so I want to know an approximation to $\mathbb{E}(\rho_{xy})$ in terms of these two values.
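To get a concrete sense of how far the naive ratio $\mathbb{E}(Cov_{xy})/\mathbb{E}(\sigma_x\sigma_y)$ can sit from $\mathbb{E}(\rho_{xy})$, here is a minimal simulation sketch; the bivariate normal, its correlation of 0.5, and the sample size are illustrative assumptions of mine, not part of the question:

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed setup: repeated samples of size n from a bivariate normal
# with true correlation 0.5; compare E[rho_hat] against the naive
# ratio E[cov_hat] / E[sd_x * sd_y].
n, reps = 20, 50_000
cov_true = np.array([[1.0, 0.5], [0.5, 1.0]])

rhos, covs, sd_prods = [], [], []
for _ in range(reps):
    xy = rng.multivariate_normal([0.0, 0.0], cov_true, size=n)
    c = np.cov(xy, rowvar=False)                  # sample covariance matrix
    sd_prod = np.sqrt(c[0, 0] * c[1, 1])          # sigma_x * sigma_y estimate
    covs.append(c[0, 1])
    sd_prods.append(sd_prod)
    rhos.append(c[0, 1] / sd_prod)

print("E[rho_hat]             ~", np.mean(rhos))
print("E[cov] / E[sd product] ~", np.mean(covs) / np.mean(sd_prods))
```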

  • I _have_ looked at the question and answer on stats.SE. There the issue was _estimating_ the variance, covariance, correlation coefficient, etc. given $n$ i.i.d. samples from $(X,Y)$ with unknown joint distribution, and you got answers for that. **Here** you are **not** talking about $n$ i.i.d. samples, or estimating $Cov_{xy}$ from such samples, etc., but just about two random variables $X$ and $Y$. I ask again: _what_ is the meaning of $E[Cov_{xy}]$? (2012-01-15)

1 Answer


If $X$ and $Y$ have a joint density $f(x,y)$, $E[X/Y] = \int_{-\infty}^\infty \int_{-\infty}^\infty \frac{x}{y} \ f(x,y)\ dx\ dy$ (assuming that converges absolutely). Similarly, if they have a joint probability mass function $p(x,y)$, $E[X/Y] = \sum_x \sum_y \frac{x}{y} p(x,y)$.
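As a concrete (hypothetical) illustration of the density formula, here is a numerical evaluation of that double integral for an assumed bivariate normal whose $Y$-component stays well away from $0$; the distribution and the truncated integration limits are my own choices, not from the question:

```python
import numpy as np
from scipy import stats
from scipy.integrate import dblquad

# Assumed joint density: bivariate normal with E[X] = 1, E[Y] = 5,
# unit variances, correlation 0.3.  Y's mass sits far from 0, so the
# integrand (x/y) * f(x, y) is well behaved.
f = stats.multivariate_normal([1.0, 5.0], [[1.0, 0.3], [0.3, 1.0]]).pdf

# dblquad integrates func(inner, outer); here the outer variable is y
# (limits 2..8) and the inner is x (limits -4..6).  The finite limits
# cut off a little Gaussian tail mass (~0.3%), so this is itself an
# approximation to E[X/Y].
val, err = dblquad(lambda x, y: (x / y) * f([x, y]), 2.0, 8.0, -4.0, 6.0)
print(f"E[X/Y] ~ {val:.6f}  (quadrature error estimate {err:.1e})")
```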

Let $\mu = E[Y]$. If you can treat $Y - \mu$ as small compared to $\mu$ (in particular if there is $c>0$ such that $c < Y < 2 \mu - c$ almost surely) then $E[X/Y] = \sum_{j=0}^\infty (-1)^j \frac{E[X (Y - \mu)^j]}{ \mu^{j+1}}$ so you could use a partial sum of that as an approximation.
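Here is a minimal sketch of that partial-sum approximation, checked against a direct Monte Carlo estimate. The particular $X$ and $Y$ below are invented for illustration: $Y \sim \mathrm{Uniform}(4,6)$, so $\mu = 5$, the condition $c < Y < 2\mu - c$ holds with $c = 4$, and the series converges quickly since $|Y-\mu|/\mu \le 1/5$.

```python
import numpy as np

rng = np.random.default_rng(0)

# Assumed example: Y bounded away from 0, X correlated with Y.
n = 1_000_000
y = rng.uniform(4.0, 6.0, n)
x = 2.0 * y + rng.normal(0.0, 1.0, n)

direct = np.mean(x / y)          # direct Monte Carlo estimate of E[X/Y]
mu = y.mean()

# Partial sums of sum_j (-1)^j E[X (Y - mu)^j] / mu^(j+1)
approx = 0.0
for j in range(6):
    approx += (-1) ** j * np.mean(x * (y - mu) ** j) / mu ** (j + 1)
    print(f"partial sum through j={j}: {approx:.6f}")

print(f"direct Monte Carlo:       {direct:.6f}")
```

With this setup each extra term shrinks by roughly a factor of $5$, so even the first two or three terms already land close to the direct estimate.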