An information theory inequality related to Shannon entropy
For $a_1,...,a_n,b_1,...,b_n>0,\quad$ define $a:=\sum a_i,\ b:=\sum b_i,\ s:=\sum \sqrt{a_ib_i}$.
Is the following inequality true?: $${\frac{\Bigl(\prod a_i^{a_i}\Bigr)^\frac1a}a \cdot \frac{\left(\prod b_i^{b_i}\right)^\frac1b}b \le\left(\frac{\prod (a_ib_i)^{ \sqrt{a_ib_i}}}{s^{2s}}\right)^\frac s{ab}}.$$ If true, how can I prove it? And if not true, is there a counter-example?
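Before hunting for a proof, a quick numerical experiment is reassuring. The sketch below (function names are my own; this is evidence, not a proof) compares logarithms of the two sides, which avoids overflow from the large exponents, on random positive inputs:

```python
import math
import random

def log_lhs(a_vals, b_vals):
    # log of (prod a_i^{a_i})^{1/a} / a  *  (prod b_i^{b_i})^{1/b} / b
    a, b = sum(a_vals), sum(b_vals)
    la = sum(x * math.log(x) for x in a_vals) / a - math.log(a)
    lb = sum(x * math.log(x) for x in b_vals) / b - math.log(b)
    return la + lb

def log_rhs(a_vals, b_vals):
    # log of ( prod (a_i b_i)^{sqrt(a_i b_i)} / s^{2s} )^{s/(ab)}
    a, b = sum(a_vals), sum(b_vals)
    s = sum(math.sqrt(x * y) for x, y in zip(a_vals, b_vals))
    inner = sum(math.sqrt(x * y) * math.log(x * y)
                for x, y in zip(a_vals, b_vals))
    return (s / (a * b)) * (inner - 2 * s * math.log(s))

random.seed(0)
for _ in range(10_000):
    n = random.randint(2, 6)
    av = [random.uniform(0.01, 10) for _ in range(n)]
    bv = [random.uniform(0.01, 10) for _ in range(n)]
    # small tolerance for floating-point round-off near equality cases
    assert log_lhs(av, bv) <= log_rhs(av, bv) + 1e-9
```

No violation shows up, which at least rules out a small random counter-example.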
An alternate formulation of this inequality, as suggested by cardinal, is
$$\sum_i x_i^2 \log x_i^2 + \sum_i y_i^2 \log y_i^2 \leq \rho^2 \sum_i \frac{x_iy_i}{\rho} \log \frac{x_i^2y_i^2}{\rho^2} \; ,$$
in which $x_i,y_i>0$, $\sum_i x_i^2 = \sum_i y_i^2 = 1$, and $\sum_i x_i y_i = \rho$. It is obtained by taking the logarithm of the original inequality and substituting $x_i=\sqrt{a_i/a}$ and $y_i=\sqrt{b_i/b}$.
This is not an answer, but it is slightly too long to be a comment:
Edit: I am providing a few revisions, but I still have not found a proof.
My previous comment was not quite correct: I "rearranged" the inequality by raising both sides to certain powers, without taking into account that we do not know a priori whether the terms involved are larger or smaller than $1$; depending on this, exponentiating both sides could reverse the inequality. I have returned the question to its original form, as suggested by the OP, and added one more observation.
Let me first say that I have been working on this question for a bit, and though I have not yet resolved it, I have been having fun trying!
Now, to emphasize the dependence on $n$, let's set
$$ \alpha_n = \sum_{i=1}^n a_i \qquad \beta_n = \sum_{i=1}^n b_i, \qquad \sigma_n = \sum_{i=1}^n c_i, $$
where $c_i = \sqrt{a_i b_i}$. Further, let's put
$$ A_n = \prod_{i=1}^n (a_i)^{a_i}, \qquad B_n = \prod_{i=1}^n (b_i)^{b_i}, \qquad S_n = \prod_{i=1}^n (c_i)^{c_i}. $$
Our goal is to show (note that the right-hand side of the original inequality satisfies $\prod_{i=1}^n (a_i b_i)^{\sqrt{a_i b_i}} = S_n^2$, which is where the exponent $2\sigma_n$ below comes from):
\begin{equation} (1) \hspace{1in} \left(\frac{A_n}{(\alpha_n)^{\alpha_n}}\right)^{\frac{1}{\alpha_n}} \cdot \left(\frac{B_n}{(\beta_n)^{\beta_n}} \right)^{\frac{1}{\beta_n}} \leq \left(\frac{S_n}{(\sigma_n)^{\sigma_n}}\right)^{\frac{2\sigma_n}{\alpha_n \beta_n}} \end{equation}
A few pedestrian observations:
If $a_i = b_i$ for $i = 1, \dots, n$ (which forces $c_i = a_i = b_i$), then $A_n = B_n = S_n$ and $\alpha_n = \beta_n = \sigma_n$, so (1) holds with equality in this case.
Note that $2c_i \leq a_i + b_i$, since $2xy \leq x^2 + y^2$ for all real numbers $x, y$; hence $2\sigma_n \leq \alpha_n + \beta_n$. Furthermore, Cauchy-Schwarz gives $\sigma_n^2 \leq \alpha_n \beta_n$. Combining the two, $(\sigma_n + 1)^2 = \sigma_n^2 + 2\sigma_n + 1 \leq \alpha_n \beta_n + \alpha_n + \beta_n + 1 = (\alpha_n + 1)(\beta_n + 1)$.
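These observations are easy to confirm numerically as well; here is a small sketch (the helper name `observations_hold` is mine) that checks all three on random instances:

```python
import math
import random

def observations_hold(a_vals, b_vals, tol=1e-12):
    """Check the three observations for one instance (a_i), (b_i)."""
    alpha, beta = sum(a_vals), sum(b_vals)
    sigma = sum(math.sqrt(x * y) for x, y in zip(a_vals, b_vals))
    return (2 * sigma <= alpha + beta + tol              # termwise AM-GM
            and sigma ** 2 <= alpha * beta + tol         # Cauchy-Schwarz
            and (sigma + 1) ** 2 <= (alpha + 1) * (beta + 1) + tol)

random.seed(1)
assert all(
    observations_hold([random.uniform(0.01, 10) for _ in range(4)],
                      [random.uniform(0.01, 10) for _ in range(4)])
    for _ in range(10_000)
)
```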
I would imagine that with enough creativity, one may find a proof of the inequality involving convexity or a simple application of the AM-GM inequality (which I suppose is much the same thing!).
I have been unable to prove the inequality even in the case $n = 2$ under the normalization $\alpha_n = \beta_n = 1$, so I am not hopeful about the general case.
The inequality is true.
I posted cardinal's reformulation on MathOverflow, as I was very interested in seeing the problem resolved. A complete solution was provided by fedja, and it can be found here. Using the inequality $$ \sqrt{\frac{a_i b_i}{ab}}\leq \sum_{i=1}^n\sqrt{\frac{a_i b_i}{ab}}\leq \sqrt{\frac{a_i b_i}{ab}}+\sqrt{\frac{(a-a_i)(b- b_i)}{ab}},$$ whose second step follows from Cauchy-Schwarz applied to the terms with index $j \neq i$, fedja was able to treat the inequality term by term. Doing so reduced the problem to showing the following.
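This sandwich bound is cheap to verify numerically; a sketch (the helper name `sandwich_holds` is mine, and since the common factor $1/\sqrt{ab}$ cancels, the unnormalized form is checked):

```python
import math
import random

def sandwich_holds(a_vals, b_vals, i, tol=1e-12):
    # Equivalent unnormalized form of the sandwich bound:
    # sqrt(a_i b_i) <= s <= sqrt(a_i b_i) + sqrt((a - a_i)(b - b_i))
    a, b = sum(a_vals), sum(b_vals)
    s = sum(math.sqrt(x * y) for x, y in zip(a_vals, b_vals))
    t = math.sqrt(a_vals[i] * b_vals[i])
    upper = t + math.sqrt((a - a_vals[i]) * (b - b_vals[i]))
    return t <= s + tol and s <= upper + tol

random.seed(2)
for _ in range(5_000):
    n = random.randint(2, 6)
    av = [random.uniform(0.01, 10) for _ in range(n)]
    bv = [random.uniform(0.01, 10) for _ in range(n)]
    assert all(sandwich_holds(av, bv, i) for i in range(n))
```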
(key inequality) For $x,y,z>0$ and $xy=z^2$, we have $$(1+x)\log(1+y)+(1+y)\log(1+x)\geq 2(1+z)\log(1+z).$$
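As a sanity check on the key inequality (not a substitute for fedja's proof), one can evaluate the difference of its two sides on a logarithmic grid, substituting $z = \sqrt{xy}$ to enforce the constraint:

```python
import math

def key_gap(x, y):
    """LHS minus RHS of the key inequality, with z = sqrt(x*y)."""
    z = math.sqrt(x * y)
    return ((1 + x) * math.log(1 + y) + (1 + y) * math.log(1 + x)
            - 2 * (1 + z) * math.log(1 + z))

grid = [10 ** (k / 4) for k in range(-16, 17)]  # from 1e-4 up to 1e4
# the gap is nonnegative everywhere on the grid (zero when x = y)
assert all(key_gap(x, y) >= -1e-9 for x in grid for y in grid)
```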
Fedja gave a proof of this inequality in his answer, and the user Babylon5 gave an alternate proof on AoPS.