
I am running into a situation where I cannot compute the exact sum of a series of logarithms when calculating entropy. Suppose we have a series of numbers $p_i$ and we want to calculate $\sum_i \log(p_i)$. Should we multiply the $p_i$ together and then take the log, or take the log of each $p_i$ and sum those values?

The $p_i$ satisfy $\sum_i p_i = 1$, and each $p_i$ is between 0 and 1.

The problem is: if we multiply them together first, the repeated multiplication seems to accumulate precision loss; but if we take the logs separately, each individual log also seems to incur some loss of precision.
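Since this is a question about numerical behavior, here is a minimal Python sketch (with illustrative values I made up, not taken from the post) showing why summing logarithms is usually preferred: the product of many small $p_i$ underflows to zero in double precision, while the sum of their logarithms remains a moderate-sized number.

```python
import math

# Hypothetical example: 1000 probabilities of magnitude ~1e-5 each
# (chosen only to illustrate the magnitudes involved, not normalized here).
p = [1e-5] * 1000

# Approach 1: multiply first, then take one log.
# The running product is about 1e-5000, far below the smallest positive
# double (~1e-308), so it underflows to 0.0 and log(0.0) would then fail.
product = 1.0
for pi in p:
    product *= pi
print(product)  # 0.0 -- underflow

# Approach 2: take the log of each term and sum.
# Each log(pi) is roughly -11.5, so the sum stays well within range.
log_sum = sum(math.log(pi) for pi in p)
print(log_sum)  # about -11512.93
```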

  • You would actually want to find $\sum_i p_i\log p_i$, no? Standard texts always take the $\log$ of each $p_i$, multiply by $p_i$, and sum the results. But I do not know which approach is better, or why. – 2012-10-01

1 Answer