There is an easy way to compute the expectation of $R$, provided one specifies much more precisely the stochastic structure of the samples... Namely, introducing $Z_i=(X_i,Y_i)$, one should assume that the sequence $(Z_i)_{1\leqslant i\leqslant n}$ is i.i.d. The variance is another, more complicated, story.
Introduce the random variables $S_n=X_1+\cdots+X_n$ and $T_n=Y_1+\cdots+Y_n$, hence $R_n=T_n/S_n$. Let $\mathfrak X_n$ denote the sigma-algebra generated by $(X_i)_{1\leqslant i\leqslant n}$.
To compute the expectation of $R_n$, note that, for every $i\leqslant n$, $E(Y_i\mid\mathfrak X_n)=E(Y_i\mid X_i)=aX_i$, hence $E(T_n\mid\mathfrak X_n)=aS_n$ and, since $S_n$ is $\mathfrak X_n$-measurable,
$$E(R_n\mid\mathfrak X_n)=E(T_n/S_n\mid\mathfrak X_n)=E(T_n\mid\mathfrak X_n)/S_n=a.$$
In particular, $E(R_n)=a$.

As regards the variance, one can start from
$$T_n^2=\sum_iY_i^2+\sum_{i\ne j}Y_iY_j$$
and note that, for every $i\leqslant n$, $E(Y_i^2\mid\mathfrak X_n)=E(Y_i^2\mid X_i)=(b^2+a^2)X_i^2$, while, for every $i\ne j$,
$$E(Y_iY_j\mid\mathfrak X_n)=E(Y_iY_j\mid \sigma(X_i,X_j))=E(Y_i\mid X_i)\,E(Y_j\mid X_j)=a^2X_iX_j.$$
This implies
$$E(T_n^2\mid\mathfrak X_n)=\sum_i(b^2+a^2)X_i^2+\sum_{i\ne j}a^2X_iX_j=b^2\sum_iX_i^2+a^2S_n^2.$$
Dividing by $S_n^2$, taking expectations and using the value of $E(R_n)$, one gets
$$\text{Var}(R_n)=b^2\,E\left(\frac{U_n^2}{S_n^2}\right),\qquad U_n^2=X_1^2+\cdots+X_n^2.$$
This reduces $\text{Var}(R_n)$ to an expression involving (the value of $b$ and) the marginal distribution of the $X_i$s only, but there is no general formula for $\text{Var}(R_n)$ in terms of $n$, $a$ and $b$. The exception is the case $n=1$, since $U_1^2=X_1^2=S_1^2$, hence $\text{Var}(R_1)=b^2$.
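A quick Monte Carlo sanity check of the two identities above. The conditional moments used in the derivation pin down $E(Y_i\mid X_i)=aX_i$ and $\text{Var}(Y_i\mid X_i)=b^2X_i^2$; one realization of such a model (an assumption, for simulation purposes only) is $Y_i=X_i(a+b\varepsilon_i)$ with $\varepsilon_i$ standard normal, independent of the $X_i$s. The exponential marginal for the $X_i$s is an arbitrary choice, since the identities hold for any positive marginal.

```python
# Monte Carlo check of E(R_n) = a and Var(R_n) = b^2 * E(U_n^2 / S_n^2).
# Assumed realization of the model: Y_i = X_i * (a + b * eps_i), eps_i ~ N(0, 1),
# which gives E(Y_i | X_i) = a X_i and Var(Y_i | X_i) = b^2 X_i^2 as in the text.
import numpy as np

rng = np.random.default_rng(0)
N, n = 200_000, 3            # N independent samples of size n
a, b = 1.5, 0.4

X = rng.exponential(scale=1.0, size=(N, n))   # any positive marginal works
eps = rng.standard_normal((N, n))
Y = X * (a + b * eps)

S = X.sum(axis=1)            # S_n = X_1 + ... + X_n
T = Y.sum(axis=1)            # T_n = Y_1 + ... + Y_n
R = T / S                    # R_n = T_n / S_n
U2 = (X ** 2).sum(axis=1)    # U_n^2 = X_1^2 + ... + X_n^2

mean_R = R.mean()                              # should be close to a
var_R = R.var()                                # empirical Var(R_n)
var_formula = b ** 2 * (U2 / S ** 2).mean()    # b^2 * E(U_n^2 / S_n^2)
```

Using the same samples for `var_R` and `var_formula` makes their difference pure Monte Carlo noise, so the agreement is tight even at moderate sample sizes.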
Edit:
(1) For every $n\geqslant1$, $\text{Var}(R_n)=nb^2E\left(\dfrac{X_1^2}{S_n^2}\right)\geqslant\dfrac{b^2}n$ (the identity follows from the exchangeability of the $X_i$s, and the lower bound from $S_n^2\leqslant nU_n^2$, a consequence of the Cauchy-Schwarz inequality).
(2) If the $X_i$s are uniform on an interval $[0,x]$, then, for every $n\geqslant1$, $\text{Var}(R_n)=\dfrac{2b^2}{n+1}$.