
Let $X$ and $Y$ be positive random variables such that $E(Y\mid X)= aX$ and $\operatorname{Var}(Y\mid X) = b^2X^2$, where $a,b > 0$ are constants. Let $R = \dfrac{\bar{Y}}{\bar{X}}=\dfrac{\sum_{i=1}^nY_i}{\sum_{i=1}^nX_i}$.

Is there an easy way to solve for $E(R)$ and $\operatorname{Var}(R)$ using the info above, without having to take the Taylor series expansion of $\frac{\bar{Y}}{\bar{X}}$ around $\mu_X,\mu_Y$?

2 Answers


There is an easy way to compute the expectation of $R$, provided one makes the stochastic structure of the sample much more precise. Namely, introducing $Z_i=(X_i,Y_i)$, one should assume that the sequence $(Z_i)_{1\leqslant i\leqslant n}$ is i.i.d. The variance is another, more complicated, story.


Introduce the random variables $S_n=X_1+\cdots+X_n$ and $T_n=Y_1+\cdots+Y_n$, hence $R_n=T_n/S_n$. Let $\mathfrak X_n$ denote the sigma-algebra generated by $(X_i)_{1\leqslant i\leqslant n}$.

To compute the expectation of $R_n$, note that, for every $i\leqslant n$, $E(Y_i\mid\mathfrak X_n)=E(Y_i\mid X_i)=aX_i$, hence $E(T_n\mid\mathfrak X_n)=aS_n$ and $$E(R_n\mid\mathfrak X_n)=E(T_n/S_n\mid\mathfrak X_n)=E(T_n\mid\mathfrak X_n)/S_n=a.$$ In particular, $E(R_n)=a$.

As regards the variance, one can start from $$T_n^2=\sum_iY_i^2+\sum_{i\ne j}Y_iY_j$$ and note that, for every $i\leqslant n$, $E(Y_i^2\mid\mathfrak X_n)=E(Y_i^2\mid X_i)=(b^2+a^2)X_i^2$ and, for every $i\ne j$, $$E(Y_iY_j\mid\mathfrak X_n)=E(Y_iY_j\mid \sigma(X_i,X_j))=E(Y_i\mid X_i)\,E(Y_j\mid X_j)=a^2X_iX_j.$$ This implies $$E(T_n^2\mid\mathfrak X_n)=\sum_i(b^2+a^2)X_i^2+\sum_{i\ne j}a^2X_iX_j=b^2\sum_iX_i^2+a^2S_n^2.$$ Dividing by $S_n^2$, taking expectations and using the value of $E(R_n)$, one gets $$\operatorname{Var}(R_n)=b^2\,E\left(\frac{U_n^2}{S_n^2}\right),\qquad U_n^2=X_1^2+\cdots+X_n^2.$$ This reduces $\operatorname{Var}(R_n)$ to an expression involving (the value of $b$ and) the marginal distribution of the $X_i$s only, but there is no general formula for $\operatorname{Var}(R_n)$ in terms of $n$, $a$ and $b$. The exception is the case $n=1$, since $U_1^2=X_1^2=S_1^2$, hence $\operatorname{Var}(R_1)=b^2$.
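For a quick sanity check of both identities, here is a minimal Monte Carlo sketch. The model below is a hypothetical instance of the hypotheses (the question does not specify one): $X_i$ exponential and $Y_i=X_iW_i$ with $W_i$ an independent Gamma variable with mean $a$ and variance $b^2$, so that $E(Y_i\mid X_i)=aX_i$ and $\operatorname{Var}(Y_i\mid X_i)=b^2X_i^2$ as required.

```python
# Monte Carlo check of E(R_n) = a and Var(R_n) = b^2 * E(U_n^2 / S_n^2).
# Hypothetical model (not part of the question): X_i ~ Exponential(1) and
# Y_i = X_i * W_i with W_i ~ Gamma(shape=a^2/b^2, scale=b^2/a) independent
# of X_i, which gives E(Y_i | X_i) = a X_i and Var(Y_i | X_i) = b^2 X_i^2.
import numpy as np

rng = np.random.default_rng(0)
a, b, n, trials = 2.0, 0.5, 5, 200_000

X = rng.exponential(1.0, size=(trials, n))
W = rng.gamma(shape=a**2 / b**2, scale=b**2 / a, size=(trials, n))
Y = X * W

S = X.sum(axis=1)                         # S_n = X_1 + ... + X_n
T = Y.sum(axis=1)                         # T_n = Y_1 + ... + Y_n
R = T / S                                 # R_n = T_n / S_n
U2_over_S2 = (X**2).sum(axis=1) / S**2    # U_n^2 / S_n^2

print(R.mean(), a)                        # both ~ 2.0
print(R.var(), b**2 * U2_over_S2.mean())  # the two variance estimates agree
```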

Edit:

(1) For every $n\geqslant1$, $\operatorname{Var}(R_n)=nb^2E\left(\dfrac{X_1^2}{S_n^2}\right)\geqslant\dfrac{b^2}n$ (the equality by exchangeability of the $X_i$s, the inequality because $nU_n^2\geqslant S_n^2$ by Cauchy–Schwarz).

(2) If the $X_i$s are i.i.d. exponential, then, for every $n\geqslant1$, $\operatorname{Var}(R_n)=\dfrac{2b^2}{n+1}$ (in this case $X_1/S_n$ is Beta$(1,n-1)$, hence $E(X_1^2/S_n^2)=\frac2{n(n+1)}$).
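A quick numeric check of claim (2), which only involves the marginal distribution of the $X_i$s (the scale of the exponential drops out of the ratio, so the rate $1$ below is an arbitrary choice):

```python
# Numeric check of claim (2): for i.i.d. exponential X_i,
# E(U_n^2 / S_n^2) = 2/(n+1), hence Var(R_n) = 2 b^2 / (n+1).
import numpy as np

rng = np.random.default_rng(1)
for n in (1, 2, 5, 10):
    X = rng.exponential(1.0, size=(500_000, n))
    ratio = (X**2).sum(axis=1) / X.sum(axis=1)**2   # U_n^2 / S_n^2
    print(n, ratio.mean(), 2 / (n + 1))             # the two columns match
```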

  • I'm only interested in the general case, but thank you for taking the time to explain everything in such great detail! – 2012-01-10

I'm doing the case $n=1$ just for ease of notation. For the expectation, just condition on $X$: $$E[R] = E[E[R\mid X]] = E[a] = a.$$ For the variance, use the law of total variance: $$\operatorname{Var}[R] = E[\operatorname{Var}[R\mid X]] + \operatorname{Var}[E[R\mid X]],$$ where $$E[\operatorname{Var}[R\mid X]] = E\left[\frac{\operatorname{Var}[Y\mid X]}{X^2}\right] = E[b^2] = b^2, \qquad \operatorname{Var}[E[R\mid X]] = \operatorname{Var}[a] = 0,$$ hence $\operatorname{Var}[R]=b^2$. *For clarity: to move from $1$ to $n$, now just condition on the vector-valued $(X_1, X_2, X_3, \ldots, X_n)$, where I'm assuming $X_i \sim X$, $Y_i \sim Y$. Everything works the same.
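Here is a minimal simulation of this $n=1$ case, under the same hypothetical Gamma model sketched in the other answer (any model with $E(Y\mid X)=aX$ and $\operatorname{Var}(Y\mid X)=b^2X^2$ would do):

```python
# Check of the n = 1 case: R = Y/X = W, so E[R] = a and Var[R] = b^2
# exactly, whatever the law of X. Hypothetical model: Y = X * W with
# W ~ Gamma(shape=a^2/b^2, scale=b^2/a) independent of X.
import numpy as np

rng = np.random.default_rng(2)
a, b = 2.0, 0.5
X = rng.exponential(1.0, size=1_000_000)
W = rng.gamma(shape=a**2 / b**2, scale=b**2 / a, size=X.size)
R = (X * W) / X                  # R = Y / X, which here equals W

print(R.mean(), a)               # ~ 2.0
print(R.var(), b**2)             # ~ 0.25
```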

  • @Didier Piau: Hum... if we assume independence: $$\operatorname{Var}\left(\frac{Y_1 + Y_2}{X_1 + X_2} \,\Big|\, X_1, X_2\right) = \operatorname{Var}\left(\frac{Y_1}{X_1 + X_2} \,\Big|\, X_1, X_2\right) + \operatorname{Var}\left(\frac{Y_2}{X_1 + X_2} \,\Big|\, X_1, X_2\right) = \frac{\operatorname{Var}[Y_1 \mid X_1, X_2] + \operatorname{Var}[Y_2 \mid X_1, X_2]}{(X_1 + X_2)^2} = \frac{b^2(X_1^2 + X_2^2)}{(X_1 + X_2)^2}.$$ So yeah, I get exactly what you get, hey :D? – 2012-01-10
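A short numeric cross-check of this $n=2$ expression, under the same hypothetical model as in the sketches above (exponential $X_i$, so claim (2) predicts $\operatorname{Var}(R_2)=2b^2/3$):

```python
# Cross-check of the n = 2 comment:
# Var(R_2 | X_1, X_2) = b^2 (X_1^2 + X_2^2) / (X_1 + X_2)^2.
# Hypothetical model as before: exponential X_i and Y_i = X_i * W_i,
# so claim (2) predicts Var(R_2) = 2 b^2 / 3.
import numpy as np

rng = np.random.default_rng(3)
a, b, trials = 2.0, 0.5, 500_000
X = rng.exponential(1.0, size=(trials, 2))
W = rng.gamma(shape=a**2 / b**2, scale=b**2 / a, size=(trials, 2))
R = (X * W).sum(axis=1) / X.sum(axis=1)

print(R.var())                                                # ~ 2 b^2 / 3 ≈ 0.1667
print((b**2 * (X**2).sum(axis=1) / X.sum(axis=1)**2).mean())  # same value
```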