
I am wondering whether anyone knows of any inequalities involving the probability density function of an unknown distribution (as opposed to its cumulative distribution function) and its known variance. (We also assume that the mean is zero.)

Background:

I have an upper bound on the Kullback–Leibler divergence from $Z$ to $Y=X+Z$:

$D(p_Z\|p_Y)=\int_{-\infty}^{\infty}p_Z(x)\log\frac{p_Z(x)}{p_Y(x)}dx \leq\epsilon$

where $Z\sim \mathcal{N}(0,\sigma^2)$, and $X$ has a symmetric density and mean zero but is otherwise unknown, so I am treating $Y$ as unknown as well. I am trying to bound the variance of $Y$ in terms of $\epsilon$. To that end I have considered the $\mathcal{L}_1$ distance between the two densities, since half of its square bounds the KL divergence from below:

$\frac{1}{2}\left(\int_{-\infty}^{\infty}|p_Z(x)-p_Y(x)|dx\right)^2 \leq D(p_Z\|p_Y)$
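
As a numerical sanity check on this lower bound, here is a small sketch for the special case where $X$ is also Gaussian and independent of $Z$ (assumptions made purely for illustration; they are not part of the question), so that $p_Y$ is the $\mathcal{N}(0,\sigma^2+\tau^2)$ density:

```python
import numpy as np

sigma2 = 1.0        # Var(Z), known
tau2 = 0.5          # hypothetical Var(X), chosen only for this illustration
s2 = sigma2 + tau2  # Var(Y) = Var(X) + Var(Z) when X and Z are independent

x = np.linspace(-20.0, 20.0, 200001)
p_Z = np.exp(-x**2 / (2 * sigma2)) / np.sqrt(2 * np.pi * sigma2)
p_Y = np.exp(-x**2 / (2 * s2)) / np.sqrt(2 * np.pi * s2)

# Closed-form D(p_Z || p_Y) for two zero-mean Gaussian densities
kl = 0.5 * (sigma2 / s2 + np.log(s2 / sigma2) - 1)

# L1 distance between the two densities, by numerical integration
l1 = np.trapz(np.abs(p_Z - p_Y), x)

print("D(p_Z||p_Y)       =", kl)
print("(1/2) * L1^2      =", 0.5 * l1**2)
print("lower bound holds:", 0.5 * l1**2 <= kl)
```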

As I understand it, the $\mathcal{L}_1$ norm is related to the total variation distance. I do not completely understand why or how, and after helpful comments on this question I posted a separate question about that. This question, however, is about inequalities that relate the density to the known variance.
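
In the same purely illustrative Gaussian special case, the constraint $D(p_Z\|p_Y)\leq\epsilon$ can be inverted numerically to give an explicit bound on $\mathrm{Var}(Y)$, since the zero-mean Gaussian-to-Gaussian KL divergence is increasing in $\mathrm{Var}(Y)$ for $\mathrm{Var}(Y)\geq\sigma^2$. A rough sketch, with a hypothetical value of $\epsilon$ and assuming scipy is available:

```python
import numpy as np
from scipy.optimize import brentq

sigma2 = 1.0   # Var(Z), known
eps = 0.05     # hypothetical value of the KL bound, for illustration only

def kl_gauss(s2):
    """D(N(0, sigma2) || N(0, s2)) for zero-mean Gaussian densities."""
    return 0.5 * (sigma2 / s2 + np.log(s2 / sigma2) - 1)

# kl_gauss is 0 at s2 = sigma2 and strictly increasing for s2 > sigma2,
# so in this special case the largest admissible Var(Y) solves kl_gauss(s2) = eps.
s2_max = brentq(lambda s2: kl_gauss(s2) - eps, sigma2, 100 * sigma2)
print("Gaussian special case: Var(Y) <=", s2_max)
```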

  • Thanks for the edits. What you've stated is Pinsker's inequality. (2011-10-02)

0 Answers