I am running into a precision issue when computing a sum of logarithms for an entropy calculation. Given a series of numbers $p_i$, I want to compute $\sum_i \log(p_i)$. Should I multiply the $p_i$ together and then take the log of the product, or take the log of each $p_i$ and sum those logs?
The $p_i$ satisfy $\sum_i p_i = 1$, and each $p_i$ lies between 0 and 1.
The thing is, if I multiply them together first, the running product seems to accumulate a loss of precision with every multiplication. But if I take the logs separately, each individual log also seems to lose some precision, and then the summation accumulates those errors too.
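To make the concern concrete, here is a small sketch with made-up values (1000 probabilities of $10^{-3}$ each, so they sum to 1). It shows that the multiply-first approach is not just less precise but can fail outright: the running product underflows to zero long before the log is taken, while summing the individual logs stays well within double-precision range.

```python
import math

# Made-up example: a uniform distribution over 1000 outcomes,
# so each p_i = 1e-3 and sum(p) == 1.
p = [1e-3] * 1000

# Approach 1: multiply first, then take the log.
# The true product is 1e-3000, far below the smallest positive
# double (~1e-308), so the running product underflows to 0.0.
prod = 1.0
for pi in p:
    prod *= pi
print(prod)  # 0.0 -- underflowed; math.log(prod) would raise ValueError

# Approach 2: take the log of each term, then sum.
# Each log(p_i) is a perfectly ordinary double, and the sum only
# accumulates a tiny rounding error per addition.
log_sum = sum(math.log(pi) for pi in p)
print(log_sum)  # about -6907.755, i.e. 1000 * log(1e-3)
```

If the accumulated rounding in the summation itself is a worry, `math.fsum(math.log(pi) for pi in p)` computes the sum with compensated (exact intermediate) accumulation.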