
If we have an unbiased coin and we want to simulate a biased coin with probability $p$ of getting a head and $1-p$ of getting a tail, what is the lower bound on the expected number of flips needed to generate one toss of this biased coin?

  • I guess you are looking for [entropy](http://en.wikipedia.org/wiki/Entropy_%28information_theory%29); in that case you need $-p\log_2 p-(1-p)\log_2(1-p)$ coin flips per result, on average (note that for $p\in[0,1]$ this lies in $[0,1]$ too, i.e. for $p \neq 0.5$ you will need less than one coin flip on average). (On a related note, what happens if $p$ is transcendental?) (2012-03-26)

2 Answers


This answer concerns the maximal (rather than the expected) number of tosses, which is not what was asked for.


After flipping a fair coin $n$ times, you have $2^n$ equally likely outcomes. Every event defined in terms of these outcomes has probability $k/2^n$ for some $k\in\{0,\dots,2^n\}$, and conversely, for every such $k$ there is an event with that probability. Conclusion:

The required number of flips is $\inf\{n : 2^n p\in\mathbb{Z}\}$.

This is infinite when $p$ is not a dyadic rational, meaning that for such $p$ you cannot simulate the biased coin with any fixed number of flips.

Example: if $p=0.375=3/2^3$, you need $3$ flips.
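For a dyadic $p=k/2^n$ this construction is easy to run; here is a minimal Python sketch (the function name `dyadic_biased_flip` and the Monte Carlo check are illustrative choices, not part of the answer), simulating $p=3/8$ with exactly $3$ flips:

```python
import random

def dyadic_biased_flip(k, n):
    """Simulate P(heads) = k / 2**n using exactly n fair coin flips."""
    # The n flips form an n-bit integer, uniform on {0, ..., 2**n - 1};
    # declaring heads when it falls below k selects exactly k of the 2**n outcomes.
    outcome = 0
    for _ in range(n):
        outcome = (outcome << 1) | random.randint(0, 1)  # one fair flip per bit
    return "heads" if outcome < k else "tails"

# p = 0.375 = 3/8, so k = 3 and n = 3 flips:
samples = [dyadic_biased_flip(3, 3) for _ in range(100_000)]
print(samples.count("heads") / len(samples))  # ≈ 0.375
```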

  • @ShreevatsaR Good point; would you like to give a correct answer to this question? (2013-07-16)

If you expand $p$ in binary and read your flips as bits (head $=1$, tail $=0$), you can announce the outcome as soon as one of your bits disagrees with the corresponding bit of $p$: at that point the uniform number $U$ your bits are building is known to be below $p$ (announce head) or above $p$ (announce tail). There is no fixed upper bound on the number of flips you might need, as the sequence of flips can match the expansion of $p$ for as long as you like. Say $p=\frac 13=0.01010101\ldots_2=0.\overline{01}_2$; as long as you keep alternating tails and heads you cannot decide. With probability $1$ the procedure stops eventually, and the expected number of flips is $\sum_{i=1}^{\infty} \frac i{2^i}=2$.
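Here is a minimal Python sketch of this procedure (the helper name `biased_flip` and the way the bits of $p$ are peeled off by doubling are illustrative choices; it assumes $0<p<1$):

```python
import random

def biased_flip(p):
    """Simulate one toss of a coin with P(heads) = p, 0 < p < 1, from fair flips.

    The fair bits build a uniform number U = 0.b1 b2 b3 ...; the first bit that
    disagrees with the binary expansion of p tells us whether U < p (heads,
    probability p) or U > p (tails), so we can stop there.
    """
    flips = 0
    frac = p
    while True:
        flips += 1
        frac *= 2                        # shift out the next bit of p
        p_bit = 1 if frac >= 1 else 0
        frac -= p_bit
        our_bit = random.randint(0, 1)   # one fair coin flip (1 = head)
        if our_bit != p_bit:
            # our_bit < p_bit  <=>  U < p  <=>  report heads
            return ("heads" if our_bit < p_bit else "tails"), flips

# Check with p = 1/3: heads frequency ≈ 1/3, average number of flips ≈ 2.
results = [biased_flip(1/3) for _ in range(100_000)]
print(sum(r == "heads" for r, _ in results) / len(results))
print(sum(n for _, n in results) / len(results))
```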

  • Actually, now I understand why there is no universal lower bound: in the case of $p=\frac1{100}$ for instance, there is *less* information than in the $p=\frac12$ case, because most of the time we can just say "tails". (I was earlier thinking of the $(\frac1{100}, \frac1{100}, \dots, \frac1{100})$ distribution rather than the $(\frac1{100}, \frac{99}{100})$ distribution we care about here.) However, there is still a lower bound *in terms of $p$*: I think it is $H(p)/H(\frac12)$, i.e. $p\lg\frac1p+(1-p)\lg\frac1{1-p}$, though this is $<1$ (a worked value of this bound follows below). (2013-07-16)