
I am having some trouble understanding this explanation taken from Wikipedia:

"An estimator can be unbiased but not consistent. For example, for an iid sample $\{x _1,..., x_n\}$ one can use $T(X) = x_1$ as the estimator of the mean $E[x]$. This estimator is obviously unbiased, and obviously inconsistent."

Why is it unbiased, and why is it inconsistent?

Can someone explain this in more detail?

  • What do you already know about the definition of each term? (2012-03-12)
  • I know that "unbiased" means the expected value of the estimator computed from the sample equals the value of the parameter in the whole population, and that strong consistency means that as the number of samples $n$ increases, the estimated value almost surely converges to the value of the parameter in the whole population. (2012-03-12)
  • @Darqer: What's unclear? (2012-03-13)
  • I cannot understand how an unbiased estimator can be inconsistent. If, by definition, the expected value of the estimator obtained from the sample equals the value of the parameter in the whole population, how can the estimator fail to converge to that parameter? (2012-03-13)
  • What if I ask _why should the estimator converge to the parameter of the whole population?_ Let me explain. Suppose $\mu$ is the parameter of the population. Then $E[T(X)]=E[X_1]=\mu$, so $T(X)=X_1$ is an unbiased estimator. But it is not consistent because, for every $n$, $T(X)=X_1$, which does not converge to $\mu$ unless $X_1=\mu$ almost surely (i.e. with probability 1); see the simulation sketch after these comments. (2012-03-13)
  • IMO the statement would be clearer if written "This estimator is obviously unbiased, but obviously inconsistent." (2013-01-17)
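
To see the two notions numerically, here is a minimal Python sketch; the normal population with $\mu=5$, $\sigma=2$, the seed, and the number of repetitions are arbitrary choices for illustration, not part of the original example. Both estimators average to $\mu$ (unbiasedness), but the spread of $T(X)=X_1$ stays at $\sigma$ no matter how large $n$ is, while the spread of the sample mean shrinks like $\sigma/\sqrt{n}$ (consistency):

```python
import numpy as np

rng = np.random.default_rng(0)   # arbitrary seed, only for reproducibility
mu, sigma = 5.0, 2.0             # illustrative population mean and standard deviation

for n in (10, 1_000, 100_000):
    # 2000 independent iid samples of size n from N(mu, sigma^2)
    samples = rng.normal(mu, sigma, size=(2000, n))

    t_first = samples[:, 0]         # T(X) = X_1: unbiased but inconsistent
    t_mean = samples.mean(axis=1)   # sample mean: unbiased and consistent

    print(f"n={n:6d}  X_1: mean={t_first.mean():.3f}, sd={t_first.std():.3f}  |  "
          f"sample mean: mean={t_mean.mean():.3f}, sd={t_mean.std():.3f}")
```

In the printout, both estimators' averages hover around 5 for every $n$, but the standard deviation of $X_1$ stays near 2 while that of the sample mean keeps shrinking toward 0, which is exactly the failure, respectively the success, of consistency.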

2 Answers