3

If a sequence of random variables $X_n$ converges in distribution to some r.v. $X$, the convergence of moments doesn't immediately follow. However, if the sequence is uniformly integrable, then we have the convergence of moments.

Thus, for example, if $X_n\Rightarrow X$ and $\sup_n \mathbb{E}[|X_n|^{1+\varepsilon}]<\infty$ for some $\varepsilon >0$ (a sufficient condition for uniform integrability), then $\mathbb{E}[|X|]<\infty$ and $\mathbb{E}[X_n]\to\mathbb{E}[X]$. (See, for example, Theorem 25.12 and its corollary in Billingsley's *Probability and Measure*.)

My situation, however, is this: $X_n\Rightarrow X$, and $\mathbb{E}[X_n]=\infty$ for every $n$.

QUESTION: Does it follow that $\mathbb{E}[X]=\infty$ too?

Let me add that all the $X_n$ and $X$ are nonnegative, so their expectations are well defined as elements of $\mathbb{R}_+ \cup \{\infty\}$.

The moment convergence results I've seen all invoke uniform integrability and finiteness of moments, neither of which applies here. Is it even possible to have $\mathbb{E}[X]<\infty$ (a counterexample would be instructive)? Or can anyone suggest additional conditions under which $\mathbb{E}[X]=\infty$?

  • 0
    It is almost as easy to construct counterexamples that converge to any nondegenerate limit you wish. Let $Y \geq 0$ with $\mathbb{E}Y = \infty$ as before and let $Z$ have an *arbitrary* distribution. Then take $X_n = (1-n^{-1}) Z + n^{-1} Y$. (2012-04-17)
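A quick check that the construction in the comment above works (assuming, so that the means are defined, that $\mathbb{E}[Z^-]<\infty$):

```latex
% Since Y/n -> 0 almost surely, X_n = (1 - 1/n) Z + Y/n -> Z almost surely,
% and hence X_n => Z; meanwhile the mean of every X_n is infinite:
\[
  \mathbb{E}[X_n] \;=\; (1-n^{-1})\,\mathbb{E}[Z] \;+\; n^{-1}\,\mathbb{E}[Y]
  \;=\; +\infty .
\]
```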

1 Answer

2

Let $X_n$ have density $f_n(x):=\frac{2n}{\pi(1+n^2x^2)}\chi_{(0,+\infty)}(x)$. (Note that $\mathbb{E}[X_n]=\int_0^{+\infty}x f_n(x)\,dx=+\infty$, since the integrand behaves like $\frac{2}{\pi n x}$ as $x\to+\infty$.) If $g$ is a continuous bounded function, then, substituting $t=nx$, $$\left|\int_0^{+\infty}f_n(x)\bigl(g(x)-g(0)\bigr)\,dx\right|\leq \frac{2}{\pi}\int_0^{+\infty}\frac{|g(tn^{-1})-g(0)|}{1+t^2}\,dt,$$ and by the dominated convergence theorem this converges to $0$. So $X_n$ converges in distribution to $\delta_0$, which has moments of all orders.
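A short numeric sanity check of this counterexample (a sketch, using the closed-form CDF and truncated first moment of the density above; the function names are my own):

```python
import math

# The answer's counterexample: X_n has density
# f_n(x) = 2n / (pi * (1 + n^2 x^2)) on (0, +inf),
# i.e. X_n = |C| / n for a standard Cauchy variable C.

def cdf(n, x):
    """P(X_n <= x) = (2/pi) * arctan(n x)."""
    return (2 / math.pi) * math.atan(n * x)

def truncated_mean(n, M):
    """integral_0^M x f_n(x) dx = log(1 + n^2 M^2) / (pi * n)."""
    return math.log(1 + (n * M) ** 2) / (math.pi * n)

# As n grows, the mass concentrates near 0: X_n => delta_0.
for n in (1, 10, 1000):
    print(f"P(X_{n} <= 0.1) = {cdf(n, 0.1):.4f}")

# For fixed n, the truncated mean grows without bound in M,
# so E[X_n] = +infinity for every n.
for M in (1e2, 1e4, 1e8):
    print(f"truncated mean of X_5 up to {M:g}: {truncated_mean(5, M):.2f}")
```

The first loop shows $P(X_n \le 0.1) \to 1$, matching convergence to the point mass at $0$, while the second shows the truncated mean diverging logarithmically in the cutoff.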

  • 0
    Ah, wonderful! The counterexamples are instructive, and cardinal's was really simple too. (2012-04-16)