
I was studying for an exam and I found an interesting exercise, but it is very badly worded.

A coin is tossed until the first head appears. Let X denote the number of tosses required. Find:

a) The entropy H(X) in bits. The following expressions may be useful (so the text says):

$\sum_{n=0}^{\infty} r^n = \frac{1}{1-r} \qquad \qquad \sum_{n=0}^{\infty} n r^n = \frac{r}{(1-r)^2}$

b) A random variable X is defined with this distribution. Find an "efficient" sequence of yes/no questions of the form "Is X contained in the set S?". Compare H(X) with the expected number of questions required to determine X.

I think it's a tricky question: since a coin has only 2 possible outcomes, I thought the entropy would always be 2. As for part b), I don't really understand it. Could someone lend me a hand?

Regards.

1 Answer


Answer a) This is a geometric distribution, so you should be able to derive its entropy as $H(X) = H_b(p)/p$, where $H_b(p)$ is the binary entropy function and $p$ is the probability of heads.
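A sketch of the derivation, assuming the geometric distribution $P(X=n) = p(1-p)^{n-1}$ for $n \ge 1$ and using the series given in the question:

$$
\begin{aligned}
H(X) &= -\sum_{n=1}^{\infty} p(1-p)^{n-1}\,\log_2\!\left[p(1-p)^{n-1}\right]\\
     &= -\log_2(p)\sum_{n=1}^{\infty} p(1-p)^{n-1} \;-\; \log_2(1-p)\sum_{n=1}^{\infty} (n-1)\,p(1-p)^{n-1}\\
     &= -\log_2(p) \;-\; \frac{1-p}{p}\,\log_2(1-p)
      \;=\; \frac{-p\log_2(p) - (1-p)\log_2(1-p)}{p} \;=\; \frac{H_b(p)}{p}.
\end{aligned}
$$

For a fair coin, $p = 1/2$, so $H(X) = H_b(1/2)/(1/2) = 2$ bits.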

Answer b) Assuming a fair coin, the best series of questions is: Is $X=1$? If not, is $X=2$? If not, is $X=3$? And so on. The expected number of questions is $\sum_{n=1}^{\infty} n/2^n$. Once you evaluate this series, it will be apparent that the expected number of questions has a nice relationship with $H(X)$.
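Evaluating it with the second series from the question at $r = 1/2$ (the $n=0$ term vanishes, so the sum may start at either index):

$$\sum_{n=1}^{\infty} \frac{n}{2^n} = \frac{1/2}{(1-1/2)^2} = 2,$$

so the expected number of questions equals $H(X) = 2$ bits exactly. This questioning scheme achieves the entropy because every outcome has a dyadic probability $2^{-n}$, so each yes/no answer extracts exactly one bit of information.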