We have a random generator that produces independent random bits with $P(x=1) = P$ and $P(x=0) = 1-P$.
Given $N$ such independent random bits, we estimate $P$ by $\hat{P} = N_1/(N_0+N_1)$, where $N_0$ is the number of $0$'s, $N_1$ is the number of $1$'s, and $N_0 + N_1 = N$. The expected value of $\hat{P}$ is easily shown to be $P$.
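(For completeness, the unbiasedness is immediate from $N_1 \sim \mathrm{Binomial}(N, P)$:
$$E[\hat{P}] = \frac{E[N_1]}{N} = \frac{NP}{N} = P.)$$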
a. What is the expected value of the estimated entropy, defined as $\hat{H} = -[\hat{P} \log(\hat{P}) + (1-\hat{P}) \log(1-\hat{P})]$, with the convention $0 \log 0 = 0$? (A numerical check is sketched after part b.)
b. If we take $M$ independent sets of $N$ random bits as above and estimate the entropy of each set using the equation above, what is the expected value of the smallest estimated entropy among those $M$ sets? (See the Monte Carlo sketch below.)
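For part a, here is a minimal numerical check, assuming $\log$ means $\log_2$ (entropy in bits) and using NumPy/SciPy; the function name `expected_entropy_estimate` is just for illustration. Since $N_1$ is binomial, $E[\hat{H}]$ is a finite sum rather than an integral:

```python
import numpy as np
from scipy.stats import binom

def expected_entropy_estimate(N, P):
    """Exact E[H-hat]: sum the plug-in entropy over the Binomial(N, P)
    distribution of N1, with the convention 0*log(0) = 0."""
    k = np.arange(N + 1)
    p_hat = k / N
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -(p_hat * np.log2(p_hat) + (1 - p_hat) * np.log2(1 - p_hat))
    h = np.nan_to_num(h)  # endpoints p_hat = 0, 1 contribute 0
    return np.sum(binom.pmf(k, N, P) * h)

print(expected_entropy_estimate(100, 0.5))  # slightly below the true H = 1 bit
```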
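And for part b, a Monte Carlo sketch under the same assumptions (the seed, the number of trials, and the name `expected_min_entropy_estimate` are mine):

```python
import numpy as np

rng = np.random.default_rng(0)

def expected_min_entropy_estimate(N, P, M, trials=100_000):
    """Monte Carlo estimate of E[min over M sets of H-hat]."""
    # Draw N1 for all trials and all M sets at once: shape (trials, M).
    n1 = rng.binomial(N, P, size=(trials, M))
    p_hat = n1 / N
    with np.errstate(divide="ignore", invalid="ignore"):
        h = -(p_hat * np.log2(p_hat) + (1 - p_hat) * np.log2(1 - p_hat))
    h = np.nan_to_num(h)          # 0*log(0) -> 0 at p_hat = 0 or 1
    return h.min(axis=1).mean()   # minimum within each trial, averaged

print(expected_min_entropy_estimate(N=100, P=0.5, M=10))
```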
thanks, MG
P.S. If the solution to the sum for general $P$ is too complicated, a solution for the special case $P = 1/2$ would be appreciated as well.