The following is part of an answer that I could not complete when trying to answer a question on MathOverflow. I thought it would be nice to discuss it here.
Let $P$ and $Q$ be two distinct distributions on a finite set. For $0\le \lambda \le 1$, let $L(\lambda)=D(P\|R_{\lambda})-D(Q\|R_{\lambda}),$ where $R_{\lambda}=\lambda P+(1-\lambda) Q.$ I want to know the range of $\lambda$ for which $L(\lambda)\ge 0$.
I found that $\frac{d}{d\lambda}L(\lambda)=-\sum_a \frac{(P(a)-Q(a))^2}{R_{\lambda}(a)}$ (a short derivation is sketched below), which is strictly negative for $0<\lambda<1$ since $P\ne Q$, so $L(\lambda)$ is strictly decreasing. Also $L(0)=D(P\|Q)>0$ because $R_0=Q$, and $L(1)=-D(Q\|P)<0$ because $R_1=P$. By continuity and the intermediate value theorem there is a unique $\lambda^*\in(0,1)$ with $L(\lambda^*)=0$.
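For completeness, here is the calculation behind that derivative; it uses only the definitions above, and the sums run over the union of the supports of $P$ and $Q$, where $R_\lambda(a)>0$ for $0<\lambda<1$. Writing
$$L(\lambda)=\sum_a P(a)\log P(a)-\sum_a Q(a)\log Q(a)-\sum_a\bigl(P(a)-Q(a)\bigr)\log R_{\lambda}(a)$$
and using $\frac{d}{d\lambda}R_{\lambda}(a)=P(a)-Q(a)$, we get
$$\frac{d}{d\lambda}L(\lambda)=-\sum_a\bigl(P(a)-Q(a)\bigr)\,\frac{P(a)-Q(a)}{R_{\lambda}(a)}=-\sum_a\frac{(P(a)-Q(a))^2}{R_{\lambda}(a)}.$$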
The problem would be solved if we could find that $\lambda^*$ explicitly. Is there a way to find it?
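While I am hoping for an analytic answer, the crossing point can at least be located numerically: since $L$ is continuous and strictly decreasing on $(0,1)$, bisection works. Below is a minimal sketch, assuming NumPy; the distributions `p` and `q` and the tolerance are made-up choices for illustration, and natural logarithms are used (the sign of $L$, and hence $\lambda^*$, does not depend on the base).

```python
import numpy as np

def L(lam, p, q):
    """L(lambda) = D(P || R_lambda) - D(Q || R_lambda), natural log."""
    r = lam * p + (1 - lam) * q
    # Restrict each sum to the support of its first argument so 0*log(0) terms are dropped.
    dp = np.sum(p[p > 0] * np.log(p[p > 0] / r[p > 0]))
    dq = np.sum(q[q > 0] * np.log(q[q > 0] / r[q > 0]))
    return dp - dq

def crossing_point(p, q, tol=1e-12):
    """Bisection for the unique lambda in (0,1) with L(lambda) = 0,
    valid because L is continuous and strictly decreasing there."""
    lo, hi = 0.0, 1.0
    while hi - lo > tol:
        mid = 0.5 * (lo + hi)
        if L(mid, p, q) > 0:
            lo = mid   # still on the positive side of the crossing, move right
        else:
            hi = mid   # at or past the crossing, move left
    return 0.5 * (lo + hi)

# Made-up example distributions on a 3-point set.
p = np.array([0.7, 0.2, 0.1])
q = np.array([0.2, 0.3, 0.5])
lam_star = crossing_point(p, q)
print(lam_star, L(lam_star, p, q))  # L should be ~0 at lam_star
```

Of course this only gives the value numerically for a specific pair $(P,Q)$; the question of a closed-form or otherwise explicit characterization of $\lambda^*$ remains open.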