The Kullback–Leibler (KL) divergence between two parametrized distributions is defined as:
$$ D_{KL}(q(\theta) || p(\theta)) = \int q(\theta) \log \frac{q(\theta)}{p(\theta)} d\theta $$
The Rényi divergence of order $\alpha$ is defined as:
$$ D_{\alpha}(q(\theta) || p(\theta)) = \frac{1}{\alpha-1} \log\int q(\theta)^\alpha p(\theta)^{1-\alpha} d\theta $$
It is known that the KL divergence is recovered as the limiting case of the Rényi divergence when $\alpha \rightarrow 1$. What is the proof of that?
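As a sanity check of the claim (not a proof), here is a small numerical sketch for discrete distributions; the two distributions `q` and `p` below are hypothetical examples chosen only for illustration:

```python
import math

# Hypothetical example distributions on a 3-point support (for illustration only)
q = [0.5, 0.3, 0.2]
p = [0.2, 0.5, 0.3]

def kl(q, p):
    """KL divergence D_KL(q || p) for discrete distributions."""
    return sum(qi * math.log(qi / pi) for qi, pi in zip(q, p))

def renyi(q, p, alpha):
    """Renyi divergence D_alpha(q || p) of order alpha != 1."""
    s = sum(qi**alpha * pi**(1 - alpha) for qi, pi in zip(q, p))
    return math.log(s) / (alpha - 1)

# As alpha approaches 1 from either side, the Renyi divergence
# approaches the KL divergence
for alpha in (0.9, 0.99, 0.999, 1.001):
    print(alpha, renyi(q, p, alpha), kl(q, p))
```

Evaluating the Rényi divergence at $\alpha$ values ever closer to $1$ shows it converging to the KL value, which is consistent with the stated limit.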