
Let $\{U_k\}$ be a sequence of independent random variables, with each variable being uniformly distributed over the interval $[0,2]$, and let $X_n = U_1 U_2\cdots U_n$ for $n \geq 1$.

(a) Determine in which of the senses (a.s., m.s., p., d.) the sequence $\{X_n\}$ converges as $n\to\infty$, and identify the limit, if any. Justify your answers.

(b) Determine the value of the constant $\theta$ so that the sequence $\{Y_n\}$ defined by $Y_n = n^\theta \ln(X_n)$ converges in distribution as $n\to\infty$ to a nonzero limit.

  • Note that $X_n$ is a martingale. (2011-10-13)

2 Answers


Let $T_k = \log U_k$ and $Y_n = \log X_n$. Then the $T_k$ are i.i.d. random variables supported on the ray $(-\infty,\log 2]$; a short computation gives the CDF $\mathbb{P}(T_k\leq t)=\mathbb{P}(U_k\leq e^t)=e^t/2$ for $t\leq\log 2$, hence the density $e^t/2$, with mean $\log 2 - 1$ and variance $1$. You can now prove whatever you want about the convergence of $Y_n=\sum_{k=1}^n T_k$ using what you know about sums of i.i.d. random variables; the answers translate back into statements about $X_n$.
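As a quick numerical sanity check (my own illustration, not part of the answer), a short NumPy simulation confirms the mean $\log 2 - 1$ and variance $1$ of $T = \log U$; note that $T = \log 2 - E$ with $E$ exponential of rate $1$, since $\mathbb{P}(T\leq t)=e^{t-\log 2}$.

```python
import numpy as np

# T = log(U) with U ~ Uniform[0, 2].  Analytically:
#   P(T <= t) = P(U <= e^t) = e^t / 2   for t <= log 2,
# i.e. T = log(2) - E with E ~ Exponential(1),
# so E[T] = log(2) - 1 and Var(T) = 1.
rng = np.random.default_rng(0)
t = np.log(rng.uniform(0.0, 2.0, size=1_000_000))

print(t.mean())  # close to log(2) - 1 ≈ -0.3069
print(t.var())   # close to 1
```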


In a comment, I noted that $X_n$ is a martingale (indeed $\mathbb{E}(U_k)=1$). Since $X_n\geq 0$, the martingale convergence theorem says that $X_n$ converges almost surely, which implies convergence in probability and in distribution as well. $X_n$ does not converge in mean square: $\mathbb{E}(X_n^2)=\mathbb{E}(U_1^2)^n=(4/3)^n\to\infty$, whereas mean-square convergence would force $\mathbb{E}(X_n^2)$ to converge.
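These two moment computations are easy to spot-check by simulation (my own sketch, assuming NumPy): $\mathbb{E}(X_n)=1$ for every $n$, while $\mathbb{E}(X_n^2)=(4/3)^n$.

```python
import numpy as np

# Estimate E[X_n] and E[X_n^2] for X_n = U_1 * ... * U_n, U_k ~ Uniform[0, 2].
# The martingale property gives E[X_n] = 1; independence gives
# E[X_n^2] = E[U^2]^n = (4/3)^n since E[U^2] = (1/2) * 8/3 = 4/3.
rng = np.random.default_rng(1)
n, samples = 5, 1_000_000
u = rng.uniform(0.0, 2.0, size=(samples, n))
x = u.prod(axis=1)

print(x.mean())       # close to 1
print((x**2).mean())  # close to (4/3)**5 ≈ 4.21
```

The second moment of the estimator itself grows like $(16/5)^n$, so the Monte Carlo error blows up quickly; keep $n$ small.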

In fact, you don't need martingale theory to get this conclusion. The problem is easy if you take Greg's hint and do the second part first. We have $\log(X_n)=\sum_{i=1}^n\log(U_i)$, where the mean of $\log(U_i)$ is $\mu:=\log(2)-1<0$. The strong law of large numbers says that $n^{-1}\log(X_n)\to \mu$ almost surely (and hence in distribution), so $\theta=-1$ gives the nonzero limit $\mu$ in part (b).

It follows that $\log(X_n)\to-\infty$ and hence $X_n\to 0$ almost surely.
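Both conclusions are visible on a single simulated path (again my own illustration, assuming NumPy): $n^{-1}\log(X_n)$ settles near $\mu=\log 2-1$, and $X_n=e^{\log X_n}$ collapses to zero, in floating point literally underflowing to $0.0$.

```python
import numpy as np

# One long sample path: n^{-1} log(X_n) -> mu = log(2) - 1 < 0 by the SLLN,
# so log(X_n) ~ n * mu -> -infinity and X_n -> 0 (underflows to exactly 0.0).
rng = np.random.default_rng(2)
n = 100_000
log_u = np.log(rng.uniform(0.0, 2.0, size=n))
log_xn = log_u.sum()

print(log_xn / n)      # close to log(2) - 1 ≈ -0.3069
print(np.exp(log_xn))  # 0.0 — X_n has collapsed to zero
```

Working with $\log X_n$ rather than the product itself is essential here: multiplying $10^5$ factors directly would underflow long before the sum of logs loses any accuracy.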