
I have a strange (well, to me at least) MLE problem. Let $\{X_i\}_{i=1}^n$ be an i.i.d. sample of a random variable $X$ with mean $\mu$ and variance $\sigma^2$, and suppose further that $X_1\sim N(\mu,1)$. I must show that the MLE $\max\{\bar{X}_n,1\}$ for $\max\{\mu,1\}$ suffers from bias.

First of all, what does $\max\{\bar{X}_n,1\}$ for $\max\{\mu,1\}$ mean?

  • You seem to assert that $\max\{\bar{X}_n,1\}$ is the MLE for $\max\{\mu,1\}$. Why is that? (2012-10-03)

4 Answers


(a) when $\mu \lt 1$ and $\max\{\mu,1\}=1$, $\max\{\bar{X},1\}$ is sometimes strictly greater than $1$ (since $\bar{X} \gt 1$ with positive probability) and is never less than $1$, so $E[\max\{\bar{X},1\}] \gt 1 = \max\{\mu,1\}$ meaning there is upward bias.

(b) when $\mu \gt 1$ and $\max\{\mu,1\}=\mu$, $\max\{\bar{X},1\}$ is sometimes strictly greater than $\bar{X}$ (since $\bar{X} \lt 1$ with positive probability) and is never less than $\bar{X}$, so $E[\max\{\bar{X},1\}] \gt E[\bar{X}] = \mu =\max\{\mu,1\}$ meaning there is upward bias.

For there to be no bias, you need either $\Pr(\bar{X} \le 1) =1$ or $\Pr(\bar{X} \ge 1) =1$, but this is not the case with a normal distribution.
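Cases (a) and (b) above can be checked numerically. The sketch below (a hypothetical helper `mc_bias`, plain-Python Monte Carlo, with the sample size $n=5$ chosen arbitrarily for illustration) estimates $E[\max\{\bar X,1\}]-\max\{\mu,1\}$ and comes out strictly positive in both regimes:

```python
import random

def mc_bias(mu, n=5, reps=50_000, seed=0):
    """Monte Carlo estimate of E[max(Xbar, 1)] - max(mu, 1)
    for an i.i.d. N(mu, 1) sample of size n."""
    rng = random.Random(seed)
    total = 0.0
    for _ in range(reps):
        xbar = sum(rng.gauss(mu, 1.0) for _ in range(n)) / n
        total += max(xbar, 1.0)
    return total / reps - max(mu, 1.0)

# Case (a): mu < 1, and case (b): mu > 1 -- both estimated biases are positive.
print(mc_bias(0.5), mc_bias(1.5))
```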


First, I'll answer your first question while thinking about the rest. You have $\max\{\overline{X},1\}$ as an estimator. That is, you compute the sample mean of your data, denoted $\overline{X}$, and take the maximum of $\overline{X}$ and $1$.

This is called your estimator. I haven't checked yet whether it is a maximum likelihood estimator; if so, it should maximize the likelihood function.

Now you have an estimator (probably maximum likelihood) which is trying to estimate the maximum of $\mu$, the mean of your distribution, and $1$.

EDIT: Okay, here is the idea: you want to estimate the maximum of $\mu$ and $1$. There are two cases: either $\mu>1$ or $\mu<1$. In the first case $\max\{\mu,1\}=\mu$; otherwise it is $1$.

Assume that $\mu<1$; then $\max\{\mu,1\}=1$, so our objective is to estimate $1$ from the given samples. Now let's look at our estimator. Whenever $\bar{X}>1$ (which happens with positive probability) the estimator outputs $\bar{X}>1$, and otherwise it outputs exactly $1$; it is never less than $1$. When you check the expected value it will therefore be something greater than $1$, which indicates a bias towards the positive real axis.

On the other hand, when $\mu>1$ we have $\max\{\mu,1\}=\mu$, but we have a real problem: whenever $\bar{X}<1$ (which happens with positive probability) the estimator rounds the value up to $1$ due to its structure $\max\{\bar{X},1\}$, while values above $1$ pass through unchanged. For the estimator to be unbiased, the deviations below $\mu$ and above $\mu$ would have to cancel in expectation; but here every sample mean below $1$ (including all negative values) is mapped upwards to $1$, and nothing is ever mapped downwards. That is not good! As a result the expectation is not $\mu$ but something greater than $\mu$: again a bias towards the positive real axis.

To show this rigorously, compute the expected value of this estimator in both cases and show that it deviates from the true (to-be-estimated) value.

  • @Henry True; the misstatement is corrected. (2012-10-03)

Taking expectations over the inequality $\max\{\bar X_n,1\}-1\geq 1_{(\bar X_n\geq 2)}$ gives $\mathbb{E}(\max\{\bar X_n,1\})-1\geq \mathbb{P}(\bar X_n\geq 2)>0,$ or $\mathbb{E}(\max\{\bar X_n,1\})>1.\tag1$

Similarly, taking expectations over the inequality $\max\{\bar X_n,1\}-\bar X_n\geq 1_{(\bar X_n\leq 0)},$ gives $\mathbb{E}(\max\{\bar X_n,1\})-\mathbb{E}(\bar X_n)\geq \mathbb{P}(\bar X_n\leq 0)>0,$ or $\mathbb{E}(\max\{\bar X_n,1\})>\mathbb{E}(\bar X_n).\tag2$

Combining (1) and (2) gives $\mathbb{E}(\max\{\bar X_n,1\})>\max\{1,\mathbb{E}(\bar X_n)\}$ so $\max\{\bar X_n,1\}$ is a biased estimator of $\max\{1,\mathbb{E}(\bar X_n)\}$.

The probabilities $\mathbb{P}(\bar X_n\geq 2)$ and $\mathbb{P}(\bar X_n\leq 0)$ are strictly positive because $\bar X_n$ has a normal distribution.
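Those two tail probabilities can be computed exactly from the sampling distribution $\bar X_n\sim N(\mu,1/n)$. A small sketch (using Python's standard-library `statistics.NormalDist`; the values of $\mu$ and $n$ are arbitrary choices for illustration):

```python
from statistics import NormalDist

def tail_probs(mu, n):
    """Xbar_n ~ N(mu, 1/n); return P(Xbar_n >= 2) and P(Xbar_n <= 0)."""
    d = NormalDist(mu, (1.0 / n) ** 0.5)
    return 1.0 - d.cdf(2.0), d.cdf(0.0)

# Both tails are strictly positive, as required by (1) and (2):
p_hi, p_lo = tail_probs(mu=1.0, n=10)
print(p_hi, p_lo)
```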


To ask what "$\max\{\bar{X}_n,1\}$ for $\max\{\mu,1\}$" means is to parse the sentence incorrectly. It is asking about $\max\{\bar X_n, 1\}$, which is the MLE for $\max\{\mu,1\}$.

First, do you know how to show that $\bar X_n$ is the MLE for $\mu$? MLEs are generally equivariant under more-or-less everything, i.e. if $\bar X_n$ is the MLE for $\mu$, then for every function $g$, $g(\bar X_n)$ is the MLE for $g(\mu)$.

So let $g(x)=\max\{x,1\}$. Then conclude that $\max\{\bar X_n, 1\}$ is indeed the MLE for $\max\{\mu,1\}$.

If $\mu>1$ then $\mathbb{E}\bar X = \mu$, and $\Pr(\max\{\bar X_n,1\}> \bar X_n)>0$ and $\Pr(\max\{\bar X_n,1\} < \bar X_n)=0$, and consequently $\mathbb{E}\max\{\bar X_n,1\}>\mathbb{E}\bar X_n =\mu$. Thus the estimator is biased.
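The conclusion $\mathbb{E}\max\{\bar X_n,1\}>\mu$ can also be verified in closed form: since $\max(x,1)=x+\max(1-x,0)$, and $E[\max(c-X,0)]$ has a standard expression for normal $X$, the expectation is computable exactly. A sketch (hypothetical helper, Python's `statistics.NormalDist`, with $n=5$ chosen arbitrarily):

```python
from statistics import NormalDist

def exact_expectation(mu, n=5):
    """E[max(Xbar, 1)] for Xbar ~ N(mu, s^2), s = 1/sqrt(n), using
    max(x, 1) = x + max(1 - x, 0) and, for X ~ N(m, s^2),
    E[max(c - X, 0)] = s*phi(z) + (c - m)*Phi(z), with z = (c - m)/s."""
    s = (1.0 / n) ** 0.5
    std = NormalDist()          # standard normal, for phi and Phi
    z = (1.0 - mu) / s
    return mu + s * std.pdf(z) + (1.0 - mu) * std.cdf(z)

# Strictly exceeds max(mu, 1) whether mu > 1 or mu < 1:
print(exact_expectation(1.5), exact_expectation(0.5))
```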

  • Sure it is. As sure as the fact that this will not help the OP, since, the function $\mu\mapsto\max\{\mu,1\}$ being many-to-one, the usual theorem needs some reformulation. (Anyway, please try to use the @ notification system, unless you want people to miss that you replied to their comment.) (2012-10-06)