Assume that the results $(U_n)_{n\geqslant1}$ of the successive random selections are i.i.d. with uniform distribution on $(0,1)$; then $X_n=U_1\cdots U_n$ for every $n\geqslant1$.
Thus, $\log X_n=\sum\limits_{k=1}^n\log U_k$ is a sum of i.i.d. increments. Since $\log U_1$ is integrable, the strong law of large numbers yields $\log X_n=n\mathrm E(\log U_1)+o(n)$ almost surely, in other words $\frac1n\log X_n\to\mathrm E(\log U_1)$ almost surely.
Now you must simply evaluate $\mathrm E(\log U_1)$...
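This limit is easy to sanity-check by simulation. A minimal Monte Carlo sketch (the sample size $10^6$ is an arbitrary choice), to compare against whatever value you obtain for $\mathrm E(\log U_1)$:

```python
import numpy as np

# Simulate X_n = U_1 * ... * U_n through log X_n = sum of log U_k.
rng = np.random.default_rng(0)
n = 10**6
u = 1.0 - rng.random(n)        # i.i.d. uniform on (0, 1], avoids log(0)
log_x = np.cumsum(np.log(u))   # log X_1, ..., log X_n

print(log_x[-1] / n)           # (1/n) log X_n, close to E(log U_1) for large n
```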
**Edit:** Here is an example showing that non-independent choices can make things go awry. Assume that, for each nonnegative integer $k$, $U_n$ has the same value $V_k$ for every positive integer $n$ such that $2^{k-1}<n\leqslant2^{k}$ (so in particular $U_1=V_0$). Assume furthermore that $(V_k)_{k\geqslant0}$ is i.i.d. uniform on $(0,1)$. Hence each $U_n$ is uniform on $(0,1)$, but $U_n$ and $U_m$ are not independent unless some power of $2$ separates them, that is, unless $n\leqslant2^k<m$ for some $k$ (taking $n<m$).
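A short sketch of this block construction (the function name is ad hoc, and assigning $V_0$ to $n=1$ is a boundary convention): every index in the block $(2^{k-1},2^k]$ receives the same draw $V_k$, so each $U_n$ is uniform marginally while indices in a common block are perfectly correlated.

```python
import numpy as np

def blocked_uniforms(num_blocks, rng):
    """Return U_1, ..., U_{2^num_blocks}, where U_n = V_k whenever
    2^(k-1) < n <= 2^k (and U_1 = V_0), with the V_k i.i.d. uniform."""
    v = rng.random(num_blocks + 1)                 # V_0, V_1, ..., V_K
    parts = [v[:1]]                                # n = 1 gets V_0
    for k in range(1, num_blocks + 1):
        parts.append(np.full(2 ** (k - 1), v[k]))  # 2^(k-1) copies of V_k
    return np.concatenate(parts)

rng = np.random.default_rng(1)
u = blocked_uniforms(4, rng)   # U_1, ..., U_16
print(u[2] == u[3])            # U_3 and U_4 share the block (2, 4]: prints True
```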
Then $\log X_{2^k}=\log V_0+\sum\limits_{i=1}^k2^{i-1}\log V_i$, hence $\frac1{2^k}\log X_{2^k}=2^{-k}\log V_0+\sum\limits_{i=0}^{k-1}2^{-i-1}\log V_{k-i}$. For large values of $k$, this sum behaves like $ W_k=\sum\limits_{i\geqslant0}2^{-i-1}\log V_{k-i} $ (extending $(V_j)$ to an i.i.d. uniform family indexed by $j\in\mathbb Z$), which is a Markov chain with dynamics $W_{k}=\frac12(W_{k-1}+\log V_k)$. It is a relatively standard exercise to show that $(W_k)_k$ is recurrent in the sense that it visits every open interval in $(-\infty,0)$ infinitely often, almost surely. In particular, $(W_k)_k$ has no limit almost surely, hence the sequence $\left(\frac1n\log X_n\right)_{n\geqslant1}$ fails to converge almost surely.
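The recursion $W_k=\frac12(W_{k-1}+\log V_k)$ is easy to simulate; the sketch below (the start $W_0=-1$ and the number of steps are arbitrary choices) illustrates the persistent fluctuations: the chain keeps crossing the level $\mathrm E(W_k)=-1$ instead of settling down.

```python
import numpy as np

rng = np.random.default_rng(2)
steps = 100_000
w = np.empty(steps)
w_prev = -1.0                 # arbitrary start, forgotten geometrically fast
for k in range(steps):
    v = 1.0 - rng.random()    # V_k uniform on (0, 1], avoids log(0)
    w_prev = 0.5 * (w_prev + np.log(v))   # W_k = (W_{k-1} + log V_k) / 2
    w[k] = w_prev

# Count sign changes of W_k + 1, i.e. crossings of the mean level -1:
crossings = np.sum((w[:-1] + 1.0) * (w[1:] + 1.0) < 0)
print(crossings)
```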