3

I have a (possibly simple) question:

If a sequence of scalar random variables, $\{ X_T \}_{T=1}^{\infty}$, converges in probability to a constant $c$, does that imply that the variance of $X_T$ converges to $0$? In other words, if one has an estimator $\hat{\beta}_T$ of a true parameter value $\beta_0$ with $\operatorname{Var}(\hat{\beta}_T)$ converging to a non-zero constant, can this estimator still be a consistent estimator of $\beta_0$?

Thanks in advance.

  • 0
    It's worth mentioning that although the answer to this question is "no," as in the answers below, there do exist conditions under which convergence in probability does guarantee that the variance converges to $0$: if the $X_T^2$ are [uniformly integrable](http://en.wikipedia.org/wiki/Uniform_integrability) you get the implication, but depending on your level this might be quite technical. (2012-05-11)

2 Answers

4

No. Try $\mathrm P(X_T=0)=1-2/T$ and $\mathrm P(X_T=2^T)=\mathrm P(X_T=-2^T)=1/T$.
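
To spell out why this example works: for every $\varepsilon > 0$,
$$\mathrm P(|X_T| > \varepsilon) \le \mathrm P(X_T \ne 0) = \frac{2}{T} \longrightarrow 0,$$
so $X_T \to 0$ in probability, while $\mathrm E[X_T] = 0$ by symmetry and
$$\operatorname{Var}(X_T) = \mathrm E[X_T^2] = 2^{2T} \cdot \frac{2}{T} \longrightarrow \infty.$$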

Edit Consider an estimator $X_T$ of a true value $c$ and assume that $X_T$ converges to $c$ in probability, that is, that $X_T$ is consistent. Then the variance of $X_T$ may or may not go to zero. In particular, the variance may fail to go to zero while the estimator is nevertheless consistent.

  • 0
    No. See Edit. (2012-05-11)
2

No. Another example: imagine your estimator $\hat{\beta}_T$ happens to follow a Cauchy distribution with a fixed center and a scale ("width") that tends to zero as $T \to \infty$. Then it will converge in probability to its center, but its variance will not tend to zero; in fact, the variance is undefined (infinite) for all $T$.
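
To make this concrete, write $\beta_0$ for the fixed center and $\sigma_T$ for the scale, with $\sigma_T \to 0$ (notation introduced just for this sketch). From the Cauchy CDF, for any $\varepsilon > 0$,
$$\mathrm P\big(|\hat{\beta}_T - \beta_0| > \varepsilon\big) = 1 - \frac{2}{\pi}\arctan\!\left(\frac{\varepsilon}{\sigma_T}\right) \longrightarrow 1 - \frac{2}{\pi}\cdot\frac{\pi}{2} = 0,$$
so $\hat{\beta}_T \to \beta_0$ in probability, yet $\mathrm E[\hat{\beta}_T^2]$ does not exist for any $T$, so the variance never tends to zero.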

Often one proves consistency by showing that the MSE (equivalently, both the variance and the bias) tends to zero. But this is a sufficient condition, not a necessary one. So one cannot prove that an estimator is inconsistent by showing that its variance does not tend to zero: that in itself proves nothing.
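
For completeness, a standard way to see the sufficiency direction is Markov's inequality applied to $(\hat{\beta}_T - \beta_0)^2$: if $\mathrm E\big[(\hat{\beta}_T - \beta_0)^2\big] \to 0$, then for any $\varepsilon > 0$,
$$\mathrm P\big(|\hat{\beta}_T - \beta_0| > \varepsilon\big) \le \frac{\mathrm E\big[(\hat{\beta}_T - \beta_0)^2\big]}{\varepsilon^2} = \frac{\operatorname{Var}(\hat{\beta}_T) + \mathrm{bias}(\hat{\beta}_T)^2}{\varepsilon^2} \longrightarrow 0,$$
so $\hat{\beta}_T$ is consistent. The converse fails, as both examples on this page show.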

  • 0
    @ByronSchmuland: good observation, fixed. (2012-05-11)