I was studying for an exam and found an interesting exercise, but it is very poorly worded.
A coin is tossed until the first head appears. Let X denote the number of tosses required. Find:
a) The entropy H(X) in bits. The text says the following expressions may be useful:
$\sum_{n=0}^{\infty} r^n = \frac{1}{1-r} \qquad \qquad \sum_{n=1}^{\infty} n r^n = \frac{r}{(1-r)^2} \qquad (|r| < 1)$
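For context, if the coin is fair then $P(X = n) = (1/2)^n$, and my own guess is that the hint is meant to be plugged in like this (with $r = 1/2$):

$$H(X) = -\sum_{n=1}^{\infty} 2^{-n} \log_2 2^{-n} = \sum_{n=1}^{\infty} n\, 2^{-n} = \frac{1/2}{(1 - 1/2)^2} = 2 \text{ bits.}$$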
b) For a random variable X with this distribution, find an "efficient" sequence of yes/no questions of the form "Is X contained in the set S?". Compare H(X) with the expected number of questions required to determine X.
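For part (b), here is a minimal numerical sketch of what I suspect is meant, again assuming a fair coin so that $P(X = n) = (1/2)^n$; the questioning scheme ("Is X = 1?", "Is X = 2?", ...) is just my own guess at what "efficient" could mean here:

```python
from math import log2

# Sketch assuming a fair coin: P(X = n) = (1/2)**n for n = 1, 2, 3, ...
# Truncate the infinite sums at N terms; the tail is negligible for N = 60.
N = 60
p = [0.5 ** n for n in range(1, N + 1)]

# Entropy of X in bits: H(X) = -sum p_n * log2(p_n)
H = -sum(pn * log2(pn) for pn in p)

# One natural questioning scheme: ask "Is X = 1?", then "Is X = 2?", ...
# With this scheme, determining X = n takes exactly n questions,
# so the expected number of questions is E[X] = sum n * P(X = n).
expected_questions = sum(n * pn for n, pn in enumerate(p, start=1))

print(f"H(X)              ~ {H:.6f} bits")
print(f"E[# of questions] ~ {expected_questions:.6f}")
```

Both numbers come out at about 2, which is what makes me suspect the comparison in part (b) is the whole point of the exercise.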
I think it's a tricky question, because a coin has only 2 possible outcomes, so I would say the entropy is always 2; and I don't really understand the second question. Could someone lend me a hand?
Regards.