4

I was wondering: if you have a sequence of probability measures $(\mu_n)_n$ on $\mathbb R$ and you know that there is a probability measure $\mu$ such that for all $k\in\mathbb N=\{0,1,2,\dots\}$ $ \lim_{n\rightarrow\infty}\int x^k\,d\mu_n(x)=\int x^k\,d\mu(x), $ does it follow that for every continuous and bounded function $f$ $ \lim_{n\rightarrow\infty}\int f(x)\,d\mu_n(x)=\int f(x)\,d\mu(x) \qquad ? $ And if not, what if $\mu$ has compact support?

EDIT: I'm kind of lost. Reading the answers, I understand that it is somehow necessary that $\mu$ be characterized by its moments; but on the other hand, I came up with the following proof, and I don't see what's wrong with it. Could you help?

Since $\mu_n$ converges to $\mu$ in moments, we have by density of the polynomials in $C_c(\mathbb R)$ that for any $h\in C_c(\mathbb R)$, $ \lim_{n\rightarrow\infty}\int h(x)\,d\mu_n(x)=\int h(x)\,d\mu(x). $ Now let $f\in C_b(\mathbb R)$, take any $h\in C_c(\mathbb R)$ satisfying $0\leq h \leq 1$, and write $ \left|\int f\,d\mu_n-\int f\,d\mu\right| $ $ \leq \left|\int f\,d\mu_n-\int fh\,d\mu_n\right|+\left|\int fh\,d\mu_n-\int fh\,d\mu\right|+\left|\int fh\,d\mu-\int f\,d\mu\right|. $ Thus, using $\lim_n\mu_n(\mathbb R)=\mu(\mathbb R)$ (i.e. the convergence of the "$0$-th moment"), we obtain $ \limsup_n\left|\int f\,d\mu_n-\int f\,d\mu\right|\leq 2\|f\|_{\infty}\int(1-h(x))\,d\mu(x). $ Finally, given $\epsilon >0$, one can choose $h$ such that $ \int(1-h(x))\,d\mu(x)\leq \frac{\epsilon}{2\|f\|_{\infty}}, $ and I have the impression that I obtain the result... Where's the mistake?

Thanks in advance !

  • 0
    Looking back at my notes it looks like Carleman's condition is only sufficient, by the way. Sorry about that. (2012-04-06)

4 Answers 4

2

There are some essential details missing in the other arguments, so I'll try to fill them in. The outline is the "subsequence trick": show that every subsequence of $(\mu_n)$ has a further subsequence which converges to $\mu$. We pass to an arbitrary subsequence, then use tightness and Helly's selection theorem to extract a further subsequence which converges in distribution; the given moment condition then forces every such subsequential limit to be the same measure. As has been commented, this only makes sense if $\mu$ is determined by its moments, so I make that assumption. The other answers give counterexamples to the claim without that assumption.

Take such a subsequence (I will use $\mu_n$ itself for ease of notation) converging weakly. By the Skorokhod representation theorem (see for example Durrett, Theorem 3.2.2), we may without loss of generality work with a sequence $X_n \to X$ almost surely, where the law of $X_n$ is $\mu_n$ and the law of $X$ is the subsequential limit.

Let $M$ be large, chosen so that $P(|X|=M)=0$ (all but countably many $M$ work). By the bounded convergence theorem, for each $k$, $E\big[X_{n}^{k}1_{\{|X_{n}|\leq M\}}\big]\to E\big[X^{k}1_{\{|X|\leq M\}}\big]$. By the Cauchy-Schwarz and Markov inequalities, $E\big|X_{n}^{k}1_{\{|X_{n}|>M\}}\big|\leq\sqrt{E[X_{n}^{2k}]\,P(|X_n|>M)}\leq\sqrt{\frac{E[X_{n}^{2k}]\,E|X_{n}|}{M}}$.
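As a sanity check of this Cauchy-Schwarz/Markov tail bound, here is a small Monte Carlo sketch (the choices are mine, not from the argument: a standard normal sample stands in for $X_n$, with $k=2$ and $M=3$):

```python
import numpy as np

# Monte Carlo check of  E|X^k 1_{|X|>M}| <= sqrt( E[X^{2k}] * E|X| / M ),
# with X ~ N(0,1) standing in for X_n (illustrative choice).
rng = np.random.default_rng(0)
X = rng.standard_normal(1_000_000)

k, M = 2, 3.0
tail = np.mean(np.abs(X) ** k * (np.abs(X) > M))             # E|X^k 1_{|X|>M}|
bound = np.sqrt(np.mean(X ** (2 * k)) * np.mean(np.abs(X)) / M)

assert tail <= bound
print(f"tail = {tail:.5f}, bound = {bound:.5f}")
```

The bound is far from sharp (the truncated fourth moment is tiny at $M=3$ while the right side is of order one), but it is uniform in $n$, which is all the argument needs.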

This part is possibly superfluous, but I think it is worth showing that if the moments converge as sequences of real numbers (this assumption is necessary for the argument here, by the way: I need to bootstrap off of higher moments to get convergence), then the target random variable has finite moments. Since $EX_{n}^{2k}$ and $E|X_{n}|$ converge as sequences of real numbers (remember, $k$ is fixed here), there exists a uniform bound $K$ for which $\sup_{n}E[X_{n}^{2k}]\,E|X_{n}|\leq K$, so that $\sup_{n}E\big|X_{n}^{k}1_{\{|X_{n}|>M\}}\big|\leq\sqrt{\frac{K}{M}}$. Taking $M\geq K$, we have $\sup_{n}E|X_{n}|^{k}\leq M^{k}+1$. Then by Fatou's lemma, $E|X|^{k}\leq\liminf_{n}E|X_{n}|^{k}<\infty$, so $EX^{k}$ exists.

Since $E\big[X_{n}^{k}1_{\{|X_{n}|\leq M\}}\big]\to E\big[X^{k}1_{\{|X|\leq M\}}\big]$, we have $\limsup_{n}\big|E[X_{n}^{k}]-E[X^{k}]\big|\leq\sup_{n}E\big[|X_{n}|^{k}1_{\{|X_{n}|> M\}}\big]+E\big[|X|^{k}1_{\{|X|> M\}}\big]$, and this holds for each $M$. Notice that $|X|^{k}1_{\{|X|\geq M\}}$ is dominated by $|X|^{k}$ (shown integrable above) and $|X|^{k}1_{\{|X|\geq M\}}\to0$ pointwise as $M\to\infty$, so that $\lim_{M\to\infty}E\big[|X|^{k}1_{\{|X|\geq M\}}\big]=0$ by the dominated convergence theorem. Taking limits in $M$ and using the bound $E\big|X_{n}^{k}1_{\{|X_{n}|>M\}}\big|\leq\sqrt{\frac{K}{M}}$, uniform in $n$, we then obtain $\limsup_{n}\big|E[X_{n}^{k}]-E[X^{k}]\big|=0$.

This shows that every subsequence of $(\mu_n)$ (the original sequence) has a further subsequence which converges weakly to a measure whose moments agree with those of $\mu$; since $\mu$ is determined by its moments, that limit must be $\mu$. Because weak convergence of probability measures on $\mathbb R$ is metrizable (e.g. by the Lévy metric), a sequence converges to $\mu$ as soon as every subsequence has a further subsequence converging to $\mu$; hence $\mu_n \to \mu$ weakly.
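A small numerical sketch of the conclusion, with an illustrative family of my own choosing: $\mu_n = N(1/n,1)$ has every moment converging to the corresponding moment of $N(0,1)$, which is moment-determinate (Carleman's condition holds), so the argument predicts weak convergence, and indeed the Kolmogorov distance to the limit shrinks:

```python
import numpy as np
from scipy.stats import norm

# mu_n = N(1/n, 1): moments converge to those of N(0,1), which is
# determined by its moments, so mu_n should converge weakly to N(0,1).
# Check the sup-distance between cdfs on a grid.
grid = np.linspace(-5, 5, 1001)

def kolmogorov_dist(n):
    return np.max(np.abs(norm.cdf(grid, loc=1.0 / n) - norm.cdf(grid)))

dists = [kolmogorov_dist(n) for n in (1, 10, 100, 1000)]
assert all(a > b for a, b in zip(dists, dists[1:]))  # strictly shrinking
print(dists)
```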

3

Take two distinct probability measures $\mu$ and $\nu$ with the same moments, as in this MathOverflow answer. Now consider the tight sequence $\mu,\nu,\mu,\nu,\mu,\dots$: its moments converge trivially (they are constant), yet the sequence does not converge weakly.
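The classical construction of two such measures is Heyde's lognormal example: if $f$ is the lognormal density, then $f(x)\bigl(1+\sin(2\pi\ln x)\bigr)$ is again a probability density with exactly the same moments. A numerical sketch, substituting $y=\ln x$ so the $k$-th moment becomes $\int e^{ky}\varphi(y)\,dy = e^{k^2/2}$ with $\varphi$ the standard normal density:

```python
import numpy as np
from scipy.integrate import quad

# Heyde's example: the perturbed lognormal density f(x)(1 + sin(2*pi*ln x))
# has the same moments as the lognormal density f. After y = ln x the k-th
# moment is  int e^{ky} phi(y) (1 + sin(2*pi*y)) dy,  and the sine term
# vanishes for every integer k >= 0.
phi = lambda y: np.exp(-y ** 2 / 2) / np.sqrt(2 * np.pi)

def moment(k, perturbed):
    w = (lambda y: 1 + np.sin(2 * np.pi * y)) if perturbed else (lambda y: 1.0)
    # [-20, 20] captures the mass for small k; limit raised for oscillation
    val, _ = quad(lambda y: np.exp(k * y) * phi(y) * w(y), -20, 20, limit=400)
    return val

for k in range(5):
    m_plain, m_pert = moment(k, False), moment(k, True)
    assert abs(m_plain - m_pert) <= 1e-6 * max(1.0, m_plain)
    assert abs(m_plain - np.exp(k ** 2 / 2)) <= 1e-6 * max(1.0, m_plain)
```

The sine term vanishes because the characteristic function of $N(k,1)$ at $t=2\pi$ has imaginary part $e^{-2\pi^2}\sin(2\pi k)=0$ for integer $k$.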

  • 0
    Thanks for your interesting remark. But how do you know if a measure is characterized by its moments or not? (see my Edit) (2012-04-06)
2

Edit: this answer has been expanded upon further consideration.

The answer is yes if $\mu$ is the only measure with the moments $\int x^k\,d\mu$. Otherwise, there are examples built from the lognormal distribution showing the result can be false (see Durrett's probability book). So assume that $\mu$ is the only measure with those moments. In the comments, we concluded that the answer is yes when everything is supported in a fixed compact set. For arbitrary support, the key notion is tightness: if the cdf's $F_n$ form a tight sequence, then every subsequential weak limit is the cdf of a probability measure, and the moment assumption together with moment-determinacy forces each such limit to be $F$. Thus it suffices to check that the $\mu_n$ are tight. To check tightness, I refer you to the following result (which can be found in more detail in Durrett's probability book):

If there is a $g\geq 0$ such that $g(x)\rightarrow\infty$ for $|x|\rightarrow\infty$ and

$L:=\sup_n \int g(x)dF_n <\infty$

then $(F_n)$ is tight. The proof is short: by Markov's inequality, for $A$ large enough that $\inf_{|x|\geq A}g(x)>0$,

$1-F_n(A)+F_n(-A) \leq \frac{L}{\inf_{|x|\geq A}g(x)},$

and the right-hand side tends to $0$ as $A\to\infty$ since $g(x)\to\infty$.
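A quick numerical illustration of this tail bound, with an example family of my own choosing: $\mu_n = N((-1)^n, 1)$ and $g(x)=x^2$, so $L = \sup_n \int g\,dF_n = 2$:

```python
import numpy as np
from scipy.stats import norm

# Illustrative family mu_n = N((-1)^n, 1) with g(x) = x^2:
# sup_n int g dF_n = 1 + 1 = 2 =: L, and the tail bound
#   1 - F_n(A) + F_n(-A) <= L / inf_{|x|>=A} g(x) = L / A^2
# holds uniformly in n, which is exactly tightness.
L = 2.0
for n in range(1, 6):
    F = norm(loc=(-1) ** n, scale=1).cdf
    for A in (2.0, 4.0, 8.0):
        tail = 1 - F(A) + F(-A)
        assert tail <= L / A ** 2
print("tail bound verified for n = 1..5")
```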

  • 0
    @Chris: thank you for mentioning this, for some reason I thought that one of the assumptions was that the target measure has unique moments. I realize now this was actually a comment! (2012-04-05)
2

Partial answer; further details need to be added.

The result may fail if we don't assume that $\mu$ is determined by its moments, and this is not automatic, since there are counterexamples (a famous one, given by Feller, uses the density of a lognormal law).

But if $\mu$ is determined by its moments, we can get the result. Since the second moments $\int x^2\,d\mu_n(x)$ converge (hence are bounded), the bound $\mu_n\{x:|x|\geq A\}\leq \frac 1{A^2}\int x^2\,d\mu_n(x)$ shows that the sequence $\{\mu_n\}$ is tight. Therefore, we can extract a subsequence converging in law to some $\nu$. What we have to show is that for this subsequence $\{\mu_{n_k}\}$, for all integers $p\geq 0$ and all $m>0$ with $\nu(\{-m,m\})=0$, $\lim_{k\to\infty}\int_{|x|\leq m} x^p\,d\mu_{n_k}(x)=\int_{|x|\leq m} x^p\,d\nu(x).$ Letting $m\to\infty$, we will then get that $\mu$ and $\nu$ have the same moments, hence $\nu=\mu$.
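A small numerical sketch of this remaining step, with an illustrative moment-determinate family of my own choosing ($\mu_n = N(1/n,1)$, limit $\nu = N(0,1)$): the truncated moments over $[-m,m]$ do converge along the family:

```python
import numpy as np
from scipy.integrate import quad
from scipy.stats import norm

# Illustrative family mu_n = N(1/n, 1) converging in law to nu = N(0,1):
# the truncated moments int_{|x|<=m} x^p d mu_n approach those of nu
# (m = 3 is a continuity point of nu; every point is, since nu has a density).
def trunc_moment(pdf, p, m=3.0):
    val, _ = quad(lambda x: x ** p * pdf(x), -m, m)
    return val

for p in (0, 1, 2, 3):
    target = trunc_moment(norm(0, 1).pdf, p)
    errs = [abs(trunc_moment(norm(1.0 / n, 1).pdf, p) - target)
            for n in (1, 10, 100)]
    assert errs[0] > errs[-1]  # error shrinks along the family
```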

  • 0
    Thanks for your answer. Could you elaborate for the measures which are not characterized by their moments ? (see my Edit)2012-04-06