
I am wondering whether anyone knows of any inequalities involving the probability density function of an unknown distribution (as opposed to its cumulative distribution function) and its known variance. (We also assume that the mean is zero.)

Background:

I have an upper bound on Kullback–Leibler divergence from $Z$ to $Y=X+Z$

$$D(p_Z\|p_Y)=\int_{-\infty}^{\infty}p_Z(x)\log\frac{p_Z(x)}{p_Y(x)}dx \leq\epsilon$$

where $Z\sim \mathcal{N}(0,\sigma^2)$. $X$ has a symmetric density and mean zero but is otherwise unknown, so I am treating $Y$ as unknown. I am trying to bound the variance of $Y$ given $\epsilon$. To that end I have considered the $\mathcal{L}_1$ distance between the two densities, since half of its square bounds the KL divergence from below:

$$\frac{1}{2}\left(\int_{-\infty}^{\infty}|p_Z(x)-p_Y(x)|dx\right)^2 \leq D(p_Z\|p_Y)$$

As I understand it, the $\mathcal{L}_1$ norm relates to the total variation distance. I do not completely understand why or how, and following helpful comments on this question I have posted a separate question about that. But this question is about inequalities relating the pdf to the variance.
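As a numerical sanity check of the lower bound above, the sketch below compares the closed-form KL divergence between two zero-mean Gaussians against half the squared $\mathcal{L}_1$ distance of their densities. It assumes SciPy is available, and the variances (1 and 1.5) are illustrative choices, not values from the question:

```python
import numpy as np
from scipy.stats import norm
from scipy.integrate import quad

# Illustrative variances (hypothetical): Z ~ N(0, 1), Y ~ N(0, 1.5)
s2_z, s2_y = 1.0, 1.5

# Closed-form KL divergence D(p_Z || p_Y) between zero-mean Gaussians:
# D = (1/2) * (s2_z/s2_y - 1 + log(s2_y/s2_z))
kl = 0.5 * (s2_z / s2_y - 1.0 + np.log(s2_y / s2_z))

# L1 distance between the two densities, by numerical quadrature
l1, _ = quad(lambda x: abs(norm.pdf(x, scale=np.sqrt(s2_z))
                           - norm.pdf(x, scale=np.sqrt(s2_y))),
             -np.inf, np.inf)

# The lower bound from the question: (1/2) * L1^2 <= D(p_Z || p_Y)
assert 0.5 * l1 ** 2 <= kl
```

The same check can be repeated for other variance pairs; the bound holds for any pair of densities, not just Gaussians.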

  • There's no such thing as a "cumulative density function". There are density functions, and there are cumulative distribution functions. But the word "cumulative" contradicts the word "density". (2011-10-01)
  • I am a little confused by your last statement and inequality. First of all, what you've listed is *not* the [total variation norm](http://en.wikipedia.org/wiki/Total_variation#Total_variation_of_probability_measures), and second, I believe the inequality you've given is false. Let $Z \sim \mathcal N(0,1)$ and $Y \sim \mathcal N(0,1+\epsilon)$ for small enough $\epsilon > 0$. Then $p_Z(0) - p_Y(0) > D(p_Z \| p_Y)$. (2011-10-01)
  • Thanks, @cardinal, I made a pretty large mistake in the background of the question... now fixed. I've also posted an additional related but separate [question](http://math.stackexchange.com/questions/69166/understanding-the-relationship-of-the-l1-norm-to-the-total-variation-distance-of) about the background. (2011-10-02)
  • Thanks for the edits. What you've stated is Pinsker's inequality. (2011-10-02)

0 Answers