
Show that if the sequence $X_1, X_2,\ldots$ converges to a constant $\theta$ in probability, then it converges to $\theta$ in distribution.

Can't seem to figure it out. Any assistance would be greatly appreciated!

  • It's worth noting that the converse is also true. – Nate (2011-10-31)

1 Answer


A characterization of convergence in distribution of $(X_n)$ to $\theta$ is the fact that $\mathrm E(h(X_n))\to h(\theta)$ for every continuous bounded function $h$. Once you start with this, the road is clear: decompose the difference $\mathrm E(h(X_n)-h(\theta))$ into a part where $X_n$ is close to $\theta$ and a part where $X_n$ is not.
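
Concretely, writing $\mathbf 1_A$ for the indicator of an event $A$ and taking $\delta>0$ a threshold to be chosen later, the decomposition reads
$$\mathrm E(h(X_n))-h(\theta)=\mathrm E\big[(h(X_n)-h(\theta))\,\mathbf 1_{|X_n-\theta|\leqslant\delta}\big]+\mathrm E\big[(h(X_n)-h(\theta))\,\mathbf 1_{|X_n-\theta|>\delta}\big].$$
The continuity of $h$ at $\theta$ controls the first term; the convergence in probability controls the second.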

More precisely, assume that $|h(x)|\leqslant M$ for every $x$ and fix a positive $\varepsilon$. Since $h$ is continuous at $\theta$, there exists $\delta>0$ such that $|x-\theta|\leqslant \delta$ implies $|h(x)-h(\theta)|\leqslant\varepsilon$. Bounding the difference by $\varepsilon$ on the event $\{|X_n-\theta|\leqslant\delta\}$ and by $2M$ on its complement yields $$|\mathrm E(h(X_n))-h(\theta)|\leqslant \mathrm E(|h(X_n)-h(\theta)|)\leqslant 2M\,\mathrm P(|X_n-\theta|\geqslant\delta)+\varepsilon.$$ When $n\to\infty$, $\mathrm P(|X_n-\theta|\geqslant\delta)\to0$, hence $\limsup\limits_n|\mathrm E(h(X_n))-h(\theta)|\leqslant\varepsilon$. Since this holds for every positive $\varepsilon$, $\mathrm E(h(X_n))\to h(\theta)$.
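
If a numerical illustration helps, here is a quick Monte Carlo sanity check (not part of the proof; the specific choices $h=\arctan$, $\theta=2$, and $X_n=\theta+Z/\sqrt n$ with $Z$ standard normal are illustrative assumptions):

```python
import numpy as np

# Illustrative setup: X_n = theta + Z/sqrt(n) converges to theta in
# probability, and h = arctan is a continuous bounded test function.
rng = np.random.default_rng(0)
theta = 2.0
h = np.arctan

for n in [1, 10, 100, 1000, 10000]:
    # Monte Carlo estimate of E[h(X_n)] from 100,000 samples of X_n
    x_n = theta + rng.standard_normal(100_000) / np.sqrt(n)
    print(f"n={n:6d}  |E[h(X_n)] - h(theta)| ~ {abs(h(x_n).mean() - h(theta)):.5f}")
# The gap shrinks toward 0 as n grows, consistent with E[h(X_n)] -> h(theta).
```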

Note: this result (convergence in probability implies convergence in distribution) holds for any limit in probability, not only a deterministic one. As @Nate mentioned in a comment, the reverse implication (convergence in distribution implies convergence in probability) is also true, but only when the limit is almost surely constant.
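
For completeness, a standard counterexample shows that the constancy assumption cannot be dropped: let $X$ be standard normal and set $X_n=-X$ for every $n$. Each $X_n$ has the same distribution as $X$, so $X_n\to X$ in distribution trivially, yet for every $\varepsilon>0$,
$$\mathrm P(|X_n-X|\geqslant\varepsilon)=\mathrm P(2|X|\geqslant\varepsilon)>0$$
does not depend on $n$, so $X_n$ does not converge to $X$ in probability.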