
I have a (possibly simple) question:

If a sequence of scalar random variables, $\{ X_T \}_{T=1}^{\infty}$, converges in probability to a constant $c$, does that imply that the variance of $X_T$ converges to $0$? In other words, if an estimator $\hat{\beta}_T$ of a true parameter value $\beta_0$ has $Var(\hat{\beta}_T)$ converging to a non-zero constant, can it still be a consistent estimator of $\beta_0$?

Thanks in advance.
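For intuition, here is a quick numerical sketch of a standard counterexample (my construction, not part of the original question): take $X_T = \sqrt{T}$ with probability $1/T$ and $0$ otherwise. Then $P(|X_T| > \epsilon) = 1/T \to 0$, so $X_T \to 0$ in probability, yet $Var(X_T) = 1 - 1/T \to 1 \neq 0$.

```python
import random

def sample_X(T):
    """Draw X_T = sqrt(T) with probability 1/T, and 0 otherwise."""
    return T ** 0.5 if random.random() < 1.0 / T else 0.0

def exact_var(T):
    """Exact variance: E[X_T] = 1/sqrt(T), E[X_T^2] = 1,
    so Var(X_T) = 1 - 1/T."""
    mean = T ** -0.5
    return 1.0 - mean ** 2

for T in (10, 100, 10000):
    # P(|X_T| > eps) = 1/T -> 0 (convergence in probability to 0),
    # but Var(X_T) = 1 - 1/T -> 1, which never approaches 0.
    print(T, exact_var(T))
```

This shows the variance need not vanish: the rare large value $\sqrt{T}$ becomes rarer as $T$ grows (giving convergence in probability) but grows just fast enough to keep the variance bounded away from $0$.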

  • It's worth mentioning that although the answer to this question is "no," as in the answers below, there do exist conditions under which convergence in probability does guarantee that the variance converges to $0$: if the $X_T^2$ are [uniformly integrable](http://en.wikipedia.org/wiki/Uniform_integrability), the implication holds, though depending on your level this might be quite technical. (2012-05-11)

2 Answers