
For $a_1,...,a_n,b_1,...,b_n>0,\quad$ define $a:=\sum a_i,\ b:=\sum b_i,\ s:=\sum \sqrt{a_ib_i}$.

Is the following inequality true?: $${\frac{\Bigl(\prod a_i^{a_i}\Bigr)^\frac1a}a \cdot \frac{\left(\prod b_i^{b_i}\right)^\frac1b}b \le\left(\frac{\prod (a_ib_i)^{ \sqrt{a_ib_i}}}{s^{2s}}\right)^\frac s{ab}}.$$ If true, how can I prove it? And if not true, is there a counter-example?


An alternate formulation of this inequality, as suggested by cardinal is

$$\sum_i x_i^2 \log x_i^2 + \sum_i y_i^2 \log y_i^2 \leq 2\rho^2 \sum_i \frac{x_iy_i}{\rho} \log \frac{x_iy_i}{\rho} \; ,$$

in which $x_i,y_i>0$, $\sum_i x_i^2 = \sum_i y_i^2 = 1$ and $\sum_i x_i y_i = \rho$. It can be found by taking the logarithm of the original expression and defining $x_i=\sqrt{a_i/a}$ and $y_i=\sqrt{b_i/b}$.
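As a sanity check, here is a small numerical sketch (the helper name `log_lhs_rhs` is my own) that compares the logarithms of the two sides of the original inequality on random positive inputs; working with logs avoids the enormous powers in the product form.

```python
import math
import random

def log_lhs_rhs(a, b):
    """Return (log LHS, log RHS) of the conjectured inequality
    for positive sequences a, b."""
    A, B = sum(a), sum(b)
    s = sum(math.sqrt(x * y) for x, y in zip(a, b))
    # log of (prod a_i^{a_i})^{1/A}/A * (prod b_i^{b_i})^{1/B}/B
    lhs = (sum(x * math.log(x) for x in a) / A - math.log(A)
           + sum(y * math.log(y) for y in b) / B - math.log(B))
    # log of ( prod (a_i b_i)^{sqrt(a_i b_i)} / s^{2s} )^{s/(AB)}
    rhs = (s / (A * B)) * (sum(math.sqrt(x * y) * math.log(x * y)
                               for x, y in zip(a, b)) - 2 * s * math.log(s))
    return lhs, rhs

random.seed(0)
violations = 0
for _ in range(10000):
    n = random.randint(1, 6)
    a = [random.uniform(0.01, 10) for _ in range(n)]
    b = [random.uniform(0.01, 10) for _ in range(n)]
    lhs, rhs = log_lhs_rhs(a, b)
    if lhs > rhs + 1e-9:
        violations += 1
print("violations:", violations)
```

No violations show up in random trials, which is consistent with (but of course no substitute for) the proof discussed below.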

  • 0
    My answer is wrong. I should have done the limit by hand...2011-05-26
  • 13
    Let $u_i = a_i / a$, $v_i = b_i / b$ and $w_i = \sqrt{a_i b_i} / s$. Then, the inequality in the question is equivalent to the inequality $\sum u_i \log u_i + \sum v_i \log v_i \leq \frac{2 s^2}{a b} \sum w_i \log w_i$. Note that $(u_i)$, $(v_i)$ and $(w_i)$ are all in the $(n-1)$-simplex and so correspond to discrete probability measures on $\{1,\ldots,n\}$. Probabilistically, we might rewrite this as $H(u) + H(v) \geq \frac{2 s^2}{a b} H(w)$ where $H(\cdot)$ denotes the Shannon entropy.2011-05-27
  • 4
    Note also that there is a, perhaps, interesting element of symmetry at play here. In both the OP's question and my previous remark, the left-hand side is permutation invariant with respect to the relative ordering of the coordinates of $(a_i)$ and $(b_i)$, whereas the right-hand side is not. In particular, let $s_\pi = \sum \sqrt{a_i b_{\pi(i)}}$, $w_{i,\pi} = \sqrt{a_i b_{\pi(i)}}/s_\pi$, and $w_\pi = (w_{i,\pi})$. Then $H(u) + H(v) \geq \frac{2}{n! ab} \sum_{\pi \in S_n} s_\pi^2 H(w_\pi)$ and $H(u) + H(v) \geq \frac{2}{ab} \max_{\pi \in S_n} s_\pi^2 H(w_\pi)$.2011-05-28
  • 0
    Yes, exactly... May we prove it for some small values of $n$? e.g. $n=3$ or $n=4$.2011-05-30
  • 0
    I've created a Maple worksheet for the case $n=2$ and I don't seem to find any counterexample.2011-05-30
  • 0
    @Amir: The case $n = 1$ is easy. ;) I also found no counterexample for $n=2$, but that's hardly a proof.2011-05-30
  • 2
    @Amir Where did this problem come from?2011-05-30
  • 0
    @cardinal: I think the permutation giving the highest value for $H(w_{\pi})$ is the one in which $a$ is ordered in decreasing order and $b$ in decreasing order. This way, you obtain the most "mixed" state possible, and these have highest entropy. I suspect mixedness, also called stochastic ordering, might be useful to solve this problem.2011-06-03
  • 0
    @Raskolnikov: Did you mean *decreasing in $a$ and increasing in $b$* by chance? To increase the entropy we should make $w$ closer to $(1/n,\ldots,1/n)$.2011-06-03
  • 0
    Yes, that's what I meant. Stupid copy/paste. Or increasing in $a$ and decreasing in $b$. I did some simulations and I also saw that if the left hand side is replaced by $n\sum_i -u_{(i)}v_{(i)}\log u_{(i)}v_{(i)}$ where both $u$ and $v$ are in increasing order, then the inequality seems to still hold. Note that this replacement is a lower bound for the left hand side.2011-06-03
  • 0
    @Raskolnikov: Also, the $s^2_\pi$ out front cannot be neglected, I don't believe. $s^2/ab$ acts as a sort of cosine or correlation coefficient. Indeed, an alternate formulation I have is to consider $x \in [0,1]^n$, $y \in [0,1]^n$ such that $\|x\|_2 = \|y\|_2 = 1$. Then, the inequality is equivalent to $$\rho \sum x_i y_i \log(x_i y_i/\rho) \geq \sum x_i^2 \log x_i + \sum y_i^2 \log y_i$$ where $\rho = \langle x, y\rangle$, $x_i = u_i^{1/2}$, $y_i = v_i^{1/2}$ with $u_i$ and $v_i$ defined as in my first comment above. Haven't had time to play with that formulation, though.2011-06-03
  • 0
    @cardinal: Yep, noticed that as well, it's a pretty tight inequality. That's why I was toying to see what I could throw away to simplify it and see if it still works or not.2011-06-03
  • 0
    @DJC: I saw this problem a while ago, but there was no answer for it.2011-06-03
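The entropy formulation from the comments above, $H(u) + H(v) \geq \frac{2s^2}{ab} H(w)$, is also easy to probe numerically. The sketch below (helper names are mine) records the smallest gap between the two sides found over random trials.

```python
import math
import random

def entropy(p):
    """Shannon entropy (natural log) of a discrete distribution p."""
    return -sum(x * math.log(x) for x in p if x > 0)

random.seed(1)
worst_gap = float("inf")
for _ in range(20000):
    n = random.randint(2, 5)
    a = [random.uniform(0.01, 5) for _ in range(n)]
    b = [random.uniform(0.01, 5) for _ in range(n)]
    A, B = sum(a), sum(b)
    s = sum(math.sqrt(x * y) for x, y in zip(a, b))
    u = [x / A for x in a]                                # a_i / a
    v = [y / B for y in b]                                # b_i / b
    w = [math.sqrt(x * y) / s for x, y in zip(a, b)]      # sqrt(a_i b_i) / s
    gap = entropy(u) + entropy(v) - (2 * s * s / (A * B)) * entropy(w)
    worst_gap = min(worst_gap, gap)
print("smallest gap found:", worst_gap)
```

The gap stays nonnegative in these trials, and as the comments note it can get quite small, so the inequality is tight.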

2 Answers


This is slightly too long to be a comment, though it is not an answer:


Edit: I am providing a few revisions, but I still have not found a proof.

My previous comment was not quite correct. I "rearranged" the inequality by raising both sides to certain powers, without accounting for the fact that we do not know a priori whether each term is larger or smaller than $1$; depending on that, exponentiating both sides can reverse the inequality. I have returned the question to its original form as the OP stated it, and I have added one more observation.


Let me first say that I have been working on this question for a bit, and though I have not yet resolved it, I have been having fun trying!

Now, to emphasize the dependence on $n$, let's set

$$ \alpha_n = \sum_{i=1}^n a_i \qquad \beta_n = \sum_{i=1}^n b_i, \qquad \sigma_n = \sum_{i=1}^n c_i, $$

where $c_i = \sqrt{a_i b_i}$. Further, let's put

$$ A_n = \prod_{i=1}^n (a_i)^{a_i}, \qquad B_n = \prod_{i=1}^n (b_i)^{b_i}, \qquad S_n = \prod_{i=1}^n (c_i)^{c_i}. $$

Our goal is to show:

\begin{equation} (1) \hspace{1in} \left(\frac{A_n}{(\alpha_n)^{\alpha_n}}\right)^{\frac{1}{\alpha_n}} \cdot \left(\frac{B_n}{(\beta_n)^{\beta_n}} \right)^{\frac{1}{\beta_n}} \leq \left(\frac{S_n}{(\sigma_n)^{\sigma_n}}\right)^{\frac{2\sigma_n}{\alpha_n \beta_n}} \end{equation}

A few pedestrian observations:

  • If $a_i = b_i$ for $i = 1, \dots , n$ (which forces $c_i = a_i = b_i$), then $A_n = B_n = S_n$ and $\alpha_n = \beta_n = \sigma_n$, so (1) holds with equality in this case.

  • Note that $2c_i \leq a_i + b_i$, as $2xy \leq x^2 + y^2$ for all real numbers $x, y$ (take $x = \sqrt{a_i}$, $y = \sqrt{b_i}$). Hence, $2\sigma_n \leq \alpha_n + \beta_n$. Furthermore, Cauchy-Schwarz gives $\sigma_n^2 \leq \alpha_n \beta_n$. Together these yield $(\sigma_n + 1)^2 = \sigma_n^2 + 2\sigma_n + 1 \leq \alpha_n \beta_n + (\alpha_n + \beta_n) + 1 = (\alpha_n + 1)(\beta_n + 1)$.

I would imagine that with enough creativity, one may find a proof of the inequality involving convexity or a simple application of the AM-GM inequality (which I suppose is much the same thing!).

I have been unable to prove the inequality even in the case $n = 2$ under the normalization $\alpha_n = \beta_n = 1$. I am not hopeful about finding a proof of the general case.

  • 1
    Thanks for the start! I'm still waiting for a solution to this problem...2011-05-28
  • 1
    @Amir: And I'm still thinking about it (:2011-05-28
  • 0
    BUMP this! Anyone? Just three days remaining...2011-06-02
  • 0
    The bounty was about to expire, so I gave it to DJC's post. The problem hasn't been solved yet; if anyone has anything, please share it with us.2011-06-04
  • 0
    @Amir: I did not contribute to a solution of the problem at hand despite my efforts, and while I'm flattered to get the points, I simply don't deserve them. If there is a way to undo the points, I'm fine if they get taken away.2011-06-04
  • 0
    I'm still thinking about a possible solution to the problem, because it is really nice. It is simple enough that it seems it should have a simple solution, although none has been found.2011-06-04
  • 0
    @DJC: Never mind, you deserve it man. :) Thanks for trying my problems.2011-06-04

The inequality is true.

I posted cardinal's reformulation on Math Overflow, as I was very interested in seeing the problem resolved. A complete solution was provided by fedja, and it can be found here. Using the inequality $$ \sqrt{\frac{a_i b_i}{ab}}\leq \sum_{i=1}^n\sqrt{\frac{a_i b_i}{ab}}\leq \sqrt{\frac{a_i b_i}{ab}}+\sqrt{\frac{(a-a_i)(b- b_i)}{ab}},$$ whose right-hand side follows from Cauchy-Schwarz applied to the terms with $j \neq i$, fedja was able to treat the inequality entry-wise. Doing so reduced the problem to showing

(key inequality) For $x,y,z>0$ and $xy=z^2$, we have $$(1+x)\log(1+y)+(1+y)\log(1+x)\geq 2(1+z)\log(1+z).$$

Fedja gave a proof of this inequality in his answer, and the user Babylon5 gave an alternate proof on AoPS.
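For readers who want to convince themselves of the key inequality before working through the proofs, here is a quick random check (the function name `key_gap` is mine). With $z = \sqrt{xy}$, the gap should be nonnegative, vanishing when $x = y$.

```python
import math
import random

def key_gap(x, y):
    """Gap of the key inequality at (x, y, z) with z = sqrt(x*y),
    i.e. (1+x)log(1+y) + (1+y)log(1+x) - 2(1+z)log(1+z)."""
    z = math.sqrt(x * y)
    return ((1 + x) * math.log(1 + y) + (1 + y) * math.log(1 + x)
            - 2 * (1 + z) * math.log(1 + z))

random.seed(2)
worst = float("inf")
for _ in range(100000):
    x = random.uniform(1e-3, 100.0)
    y = random.uniform(1e-3, 100.0)
    worst = min(worst, key_gap(x, y))
print("smallest gap:", worst)
```

The smallest gap over random samples stays nonnegative (up to floating-point noise near the diagonal $x = y$), in line with fedja's proof.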