Suppose we have $Y$, a random variable that takes the values $0$ and $1$ with probabilities $p$ and $1-p$, respectively. Then we let $A$ be a trivial estimator that is always equal to $\frac12$. What is the MLE $\hat{p}$ of $p$?
Question about maximum likelihood estimator
-
I do not understand *Then we let $A$ be a trivial estimator that is always equal to $\frac12$*. – 2012-03-08
1 Answer
Hints:
If you sample $Y$ $n$ times and observe a $1$ on $k$ of them, then the likelihood is proportional to $p^{n-k}(1-p)^{k}$, since each $0$ contributes a factor $p$ and each $1$ a factor $1-p$ here. To obtain the maximum likelihood estimator, you want to find the $p$ which maximises this in general.
The likelihood is a continuous, differentiable function of $p$, non-negative for $0 \le p \le 1$, so one obvious approach is to set its derivative with respect to $p$ equal to zero. If $k$ and $n-k$ are each two or more, this gives three solutions: $p=0$ and $p=1$, which do not maximise the likelihood, and an interior point which does. This is $\hat{p}=\frac{n-k}{n} = 1-\overline{Y}.$
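A quick numerical sanity check of this (a sketch using NumPy; the values of $n$ and $k$ are made up) shows the maximiser of $p^{n-k}(1-p)^k$ over a fine grid landing at $(n-k)/n$:

```python
import numpy as np

# With P(Y = 0) = p, the likelihood of seeing k ones in n draws is
# proportional to p**(n-k) * (1-p)**k.  Its maximiser should be (n-k)/n.
n, k = 10, 3                          # made-up sample: 10 draws, 3 ones
grid = np.linspace(0.0, 1.0, 100001)  # candidate values of p
likelihood = grid**(n - k) * (1 - grid)**k
p_hat = grid[np.argmax(likelihood)]
print(p_hat)  # close to (n - k) / n = 0.7
```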
If you only observe $Y$ once, the likelihood becomes $p^{1-Y}(1-p)^Y$, so you get $(1-p)$ if $Y=1$, or $p$ if $Y=0$; both are linear in $p$ and so maximised at an endpoint of $[0,1]$. The maximum likelihood estimate is $\hat{p}=0$ if $Y=1$, and $\hat{p}=1$ if $Y=0$, i.e. the estimator is $\hat{p} = 1-{Y}.$
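The same kind of grid check works for the single-observation case (again just a sketch; the grid resolution is arbitrary):

```python
import numpy as np

# Maximise the single-observation likelihood p**(1-Y) * (1-p)**Y over a
# grid of candidate p values; the maximiser should be 1 - Y.
grid = np.linspace(0.0, 1.0, 101)
p_hats = {Y: grid[np.argmax(grid**(1 - Y) * (1 - grid)**Y)] for Y in (0, 1)}
print(p_hats)  # expect {0: 1.0, 1: 0.0}, i.e. p_hat = 1 - Y
```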
-
@Mathematics: $p^{1-Y}(1-p)^{Y}$ is equivalent to saying that $p$ is the likelihood when $Y=0$ and $1-p$ when $Y=1$, provided that $Y$ can only take those values. $Y$ and $1-Y$ contain the same amount of information about $p$ as each other, but in a different form. – 2012-03-08