I have a zero-mean Gaussian random variable $Y\sim\mathcal{N}(0,\sigma^2_X)$ with known variance $\sigma_X^2$. I also have a zero-mean random variable $X$, which may be dependent on $Y$ (though I can tolerate an independence assumption if it doesn't hurt the final bound too much). Besides having mean zero, I know two facts about $X$:
- Its variance, while unknown, is greater than $\sigma_X^2$;
- One (or more, but one is sufficient) of the following holds for the distance between the distributions of $X$ and $Y$: $\begin{align} D(p_Y\|p_X)&=\int_{-\infty}^{\infty}\frac{\exp(-x^{2}/2\sigma_X^2)}{\sqrt{2\pi}\,\sigma_X}\log\frac{\exp(-x^{2}/2\sigma_X^2)/\left(\sqrt{2\pi}\,\sigma_X\right)}{p_X(x)}\,dx\leq\epsilon_{KL}\\ H^2(Y,X)&=1-\int_{-\infty}^{\infty}\sqrt{\frac{\exp(-x^{2}/2\sigma_X^2)}{\sqrt{2\pi}\,\sigma_X}\,p_X(x)}\,dx\leq\epsilon_{H}\\ TV(Y,X)&=\frac{1}{2}\int_{-\infty}^{\infty}\left|\frac{\exp(-x^{2}/2\sigma_X^2)}{\sqrt{2\pi}\,\sigma_X}-p_X(x)\right|\,dx\leq \epsilon_{TV} \end{align}$
Here $D(p_Y\|p_X)$, $H^2(Y,X)$, and $TV(Y,X)$ are the Kullback–Leibler divergence, the squared Hellinger distance, and the total variation distance, respectively. These three quantities are commonly used to characterise the distance between distributions.
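As a sanity check, here is how I evaluate the three quantities numerically by quadrature (for illustration only, I take $p_X$ to be another zero-mean Gaussian $\mathcal{N}(0,s^2)$; all names below are mine, and I use the standard convention that TV is half the $L_1$ distance):

```python
import numpy as np
from scipy.integrate import quad

SQRT2PI = np.sqrt(2 * np.pi)

def gauss_logpdf(x, sigma):
    # log density of N(0, sigma^2), kept in log space for numerical stability
    return -x**2 / (2 * sigma**2) - np.log(SQRT2PI * sigma)

def distances(log_p_y, log_p_x, lim=20.0):
    """Numerically evaluate D(p_Y || p_X), H^2(Y, X) and TV(Y, X)."""
    kl = quad(lambda x: np.exp(log_p_y(x)) * (log_p_y(x) - log_p_x(x)),
              -lim, lim)[0]
    h2 = 1.0 - quad(lambda x: np.exp(0.5 * (log_p_y(x) + log_p_x(x))),
                    -lim, lim)[0]
    # TV as half the L1 distance (the standard normalisation)
    tv = 0.5 * quad(lambda x: abs(np.exp(log_p_y(x)) - np.exp(log_p_x(x))),
                    -lim, lim)[0]
    return kl, h2, tv

sigma_X, s = 1.0, 1.5  # illustrative values; s > sigma_X as required
kl, h2, tv = distances(lambda x: gauss_logpdf(x, sigma_X),
                       lambda x: gauss_logpdf(x, s))
print(f"KL={kl:.4f}  H^2={h2:.4f}  TV={tv:.4f}")
```

For two zero-mean Gaussians the KL and Hellinger values have closed forms, so the quadrature is easy to verify against them.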
I am trying to find the maximum variance of $X$ such that its distribution satisfies any one of the distance constraints above. I don't really care about the distribution itself, just its second moment.
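To get a feel for the problem, if I restrict $p_X$ to zero-mean Gaussians $\mathcal{N}(0,s^2)$ (purely for illustration — this can only give a lower bound on the quantity I'm after, since the worst-case $p_X$ need not be Gaussian), the KL constraint has the closed form $D(p_Y\|p_X)=\tfrac12(r-1-\log r)$ with $r=\sigma_X^2/s^2$, which is increasing in $s$ on $[\sigma_X,\infty)$, so the largest admissible $s^2$ can be found by bisection:

```python
import numpy as np

def kl_zero_mean_gauss(sigma, s):
    """D( N(0, sigma^2) || N(0, s^2) ), closed form."""
    r = sigma**2 / s**2
    return 0.5 * (r - 1.0 - np.log(r))

def max_gaussian_variance(sigma, eps_kl, hi=1e6, tol=1e-10):
    """Largest s^2 with s >= sigma such that the KL constraint holds.
    KL is increasing in s on [sigma, inf), so bisection applies."""
    lo = sigma  # KL = 0 here, so the constraint is satisfied at lo
    while hi - lo > tol * max(1.0, hi):
        mid = 0.5 * (lo + hi)
        if kl_zero_mean_gauss(sigma, mid) <= eps_kl:
            lo = mid
        else:
            hi = mid
    return lo**2

s2 = max_gaussian_variance(sigma=1.0, eps_kl=0.1)
print(s2)  # ~ 2.03 for these illustrative values
```

What I'd like to understand is how much larger the second moment can get once $p_X$ is allowed to be non-Gaussian under the same constraints.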
Does anyone have any ideas?
This is related to a question I asked earlier, but I've generalised it quite a bit here.