Rényi entropy as a derivative

Let $x=(x_i)$ be a probability measure on $\{1,\ldots,n\}$, and suppose $1 < p < \infty$. The Rényi entropy of $x$ is
$$
H^p(x)=\frac{1}{1-p}\log \sum_{i} x_i^p.
$$
Does there exist a formula for $H^p(x)$ in terms of a derivative and a norm ($\ell_{p,q}$, Orlicz, ...) of $x$?

Remark: I know that for the classical entropy $H(x)=-\sum_{i} x_i\log x_i$, we have
$$
H(x)=\left.\frac{d}{dp}\right|_{p=1}\Vert x\Vert^p_{\ell_p}.
$$
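A quick side check of the Remark (a computation added for reference, taking all logarithms to be natural and using the convention $0\log 0=0$): differentiating $\Vert x\Vert^p_{\ell_p}=\sum_i x_i^p$ term by term gives
$$
\left.\frac{d}{dp}\right|_{p=1}\Vert x\Vert^{p}_{\ell_p}
=\left.\sum_{i} x_i^{p}\log x_i\right|_{p=1}
=\sum_{i} x_i\log x_i
=-H(x),
$$
so the stated identity holds only up to a sign.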
Tags: functional-analysis, probability-distributions, derivatives, information-theory, entropy
@Zouba: I don't think $H(x)=\left. \frac{d}{dp}\right|_{p=1}\Vert x \Vert^p_{\ell_p}$ is true. All we have is $\lim_{p\to 1}H^p(x)=H(x)$. Your question is also not clear to me; can you elaborate on what exactly you are asking? – 2012-07-06
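For reference, the limit relation the comment appeals to follows from L'Hôpital's rule (again with natural logarithms): both $\log\sum_i x_i^p$ and $1-p$ vanish at $p=1$, so
$$
\lim_{p\to 1}H^p(x)
=\lim_{p\to 1}\frac{\sum_i x_i^p\log x_i}{-\sum_i x_i^p}
=-\sum_i x_i\log x_i
=H(x).
$$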