I'm studying for an information theory exam; maybe some of you can help me with an exercise.
What is the entropy of $X$ taking values in $\{1, 2, 3, \ldots\}$ (i.e. $n \to \infty$) with probabilities $P(X = i) = 1/2^i$?
The question is multiple choice and gives four possible answers: 1. $2 \over 3$ bits/symbol; 2. $1 \over 2$ bits/symbol; 3. $\infty$ bits/symbol; 4. none of the above.
So far I have: $ H(X) = - \sum_{i=1}^{n} P(x_i) \cdot\log_2( P(x_i)) $
So in this case, $ H(X) = - \sum_{i=1}^{\infty} {1 \over 2^i} \cdot\log_2\left({1 \over 2^i}\right) $
Since $\log_2(1/x) = -\log_2(x)$ for $x > 0$,
$ H(X) = - \sum_{i=1}^{\infty} {1 \over 2^i}\cdot(-i) = \sum_{i=1}^{\infty} {i \over 2^i} $
I also know that, for constant $a$ and $r > 1$:
$ \sum_{i=1}^{\infty} a \cdot r^{-i} = {a \over r-1} $
But that formula requires $a$ to be a constant, right? Here the coefficient is $i$, which grows with the index, so I don't think it applies directly.
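Looking for a way to handle the growing coefficient, I found what I believe is the relevant identity: differentiating the geometric series term by term (valid for $|x| < 1$) gives a closed form for $\sum i\,x^i$:

$$\sum_{i=1}^{\infty} x^i = \frac{x}{1-x} \;\Rightarrow\; \sum_{i=1}^{\infty} i\,x^{i-1} = \frac{1}{(1-x)^2} \;\Rightarrow\; \sum_{i=1}^{\infty} i\,x^{i} = \frac{x}{(1-x)^2}$$

Plugging in $x = 1/2$ would give $\frac{1/2}{(1/2)^2} = 2$, if I'm applying it correctly.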
Wolfram Alpha gives me $H(X) = 2$ bits/symbol as the result: bit.ly/nbQwgV
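As a sanity check, I also summed the series numerically in Python (truncating at 100 terms, since the terms shrink like $i/2^i$ and the tail is negligible):

```python
from math import log2

# H(X) = -sum over i of (1/2^i) * log2(1/2^i), truncated at i = 100.
# log2(1/2**i) is exactly -i, so each term is i / 2**i.
H = -sum((1 / 2**i) * log2(1 / 2**i) for i in range(1, 101))
print(H)  # ≈ 2.0
```

This also comes out at about 2 bits/symbol.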
Is it correct? Any hints?
Greatly appreciated. Cheers.