
The following is part of an answer I attempted to a question on MathOverflow; one point remained unresolved, and I thought it would be nice to discuss it here.

Let $P$ and $Q$ be two distinct distributions on a finite set. For $0\le \lambda \le 1$, let $L(\lambda)=D(P\|R_{\lambda})-D(Q\|R_{\lambda}),$ where $R_{\lambda}=\lambda P+(1-\lambda) Q.$ I want to know the range of $\lambda$ for which $L(\lambda)\ge 0$.

I found that $\frac{d}{d\lambda}L(\lambda)=-\sum_a \frac{(P(a)-Q(a))^2}{R_{\lambda}(a)}< 0$ (strictly, since $P\neq Q$), so $L(\lambda)$ is strictly decreasing. Also $L(0)=D(P\|Q)>0$ and $L(1)=-D(Q\|P)<0$, so $L(\lambda)=0$ at exactly one $\lambda\in(0,1)$, and $L(\lambda)\ge 0$ precisely on $[0,\lambda^*]$ for that unique root $\lambda^*$.
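As a sanity check on the derivative formula, here is a small Python sketch comparing it against a central finite difference (the distributions `p`, `q` and the point `lam` are made up for illustration):

```python
import math

def kl(p, q):
    """Kullback-Leibler divergence D(p || q) on a finite set."""
    return sum(pi * math.log(pi / qi) for pi, qi in zip(p, q) if pi > 0)

def L(lam, p, q):
    """L(lambda) = D(p || r) - D(q || r) with r = lam*p + (1-lam)*q."""
    r = [lam * pi + (1 - lam) * qi for pi, qi in zip(p, q)]
    return kl(p, r) - kl(q, r)

# Hypothetical example distributions on a 3-point set
p = [0.6, 0.3, 0.1]
q = [0.2, 0.2, 0.6]
lam = 0.4

# Claimed derivative: -sum (p - q)^2 / r_lambda
r = [lam * pi + (1 - lam) * qi for pi, qi in zip(p, q)]
analytic = -sum((pi - qi) ** 2 / ri for pi, qi, ri in zip(p, q, r))

# Central finite difference of L at the same point
h = 1e-6
numeric = (L(lam + h, p, q) - L(lam - h, p, q)) / (2 * h)
```

The two values agree to within the finite-difference error, and both are negative, consistent with $L$ being strictly decreasing.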

The problem would be solved if we could find that $\lambda^*$ explicitly. Is there a way to find it?

