Given a string of random symbols whose distribution is not known a priori, what are the known algorithms to compute its Shannon entropy?
$H = - \sum_i \; p_i \log p_i$
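For reference, here is the straightforward plug-in computation I'd like to avoid, estimating each $p_i$ by the empirical frequency of symbol $i$ (a minimal Python sketch; the function name `shannon_entropy` and the choice of base-2 logarithms are my own, not part of the question):

```python
from collections import Counter
from math import log2

def shannon_entropy(s):
    """Empirical (plug-in) Shannon entropy of a symbol string, in bits per symbol."""
    counts = Counter(s)   # occurrence count of each distinct symbol
    n = len(s)
    return -sum((c / n) * log2(c / n) for c in counts.values())

print(shannon_entropy("abracadabra"))  # ~2.04 bits per symbol
```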
Is there an algorithm to compute it without calculating the probabilities $p_i$ first? Having calculated the entropy $H_n$ of the first $n$ symbols, can I find the entropy $H_{n+m}$ of the first $n+m$ symbols, knowing only $H_n$ about the first $n$ symbols?
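To make the second part concrete: the only incremental scheme I can think of still keeps the per-symbol counts $c_i$ around rather than just $H_n$, using the identity $H = \log_2 n - \frac{1}{n}\sum_i c_i \log_2 c_i$, so that adding one symbol only touches one term of the sum. A sketch of that baseline (the class name `StreamingEntropy` is my own); my question is whether the counts themselves can be dropped:

```python
from math import log2

class StreamingEntropy:
    """Incremental empirical entropy; keeps per-symbol counts, not just H_n."""

    def __init__(self):
        self.counts = {}
        self.n = 0
        self.weighted = 0.0  # running sum of c_i * log2(c_i)

    def add(self, symbol):
        c = self.counts.get(symbol, 0)
        if c > 0:
            self.weighted -= c * log2(c)  # remove the old term for this symbol
        c += 1
        self.counts[symbol] = c
        self.weighted += c * log2(c)      # add the updated term
        self.n += 1

    def entropy(self):
        # H = log2(n) - (1/n) * sum_i c_i * log2(c_i), in bits per symbol
        if self.n == 0:
            return 0.0
        return log2(self.n) - self.weighted / self.n
```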