Is it true that $\sum_{i=1}^n ( nGx_i^{G} + G^{x_i}) \ge n^2G + G^2n$, for all $x_i>0$, where $G=\prod_{j=1}^nx_j$?
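Here is a quick numerical sanity check of the statement (a rough Monte Carlo sketch; the sampling range and trial count are arbitrary choices):

```python
import math
import random

def min_gap(n, trials=100_000):
    """Sample positive x_i and return the smallest observed LHS - RHS."""
    worst = float("inf")
    for _ in range(trials):
        x = [math.exp(random.uniform(-1.0, 1.0)) for _ in range(n)]
        G = math.prod(x)                    # G = product of the x_i
        lhs = sum(n * G * xi**G + G**xi for xi in x)
        rhs = n**2 * G + G**2 * n
        worst = min(worst, lhs - rhs)
    return worst

for n in (2, 3, 5):
    print(n, min_gap(n))  # a negative value would be a counterexample
```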
Incomplete Answers
I did not copy the answer by River Li here because the user wants to work on the problem a little bit more.
Answer by Astaulphe.
The inequality is true for $ G \ge 1 $. For simplicity, rewrite it as $$ \sum_{i = 1}^n \left(x_i^G + \frac{G^{x_i - 1}}n\right) \overset ?\ge n + G $$ As mentioned by @The.old.boy, $ x \mapsto x^G + \frac{G^{x - 1}}n $ is a convex function. Hence, Jensen's Inequality gives $$ \sum_{i = 1}^n \left(x_i^G + \frac{G^{x_i - 1}}n\right) \ge nm^G + G^{m - 1} $$ where $ m = \frac{x_1 + \dots + x_n}n $ is the arithmetic mean of the $ x_i $. We need to check that $$ nm^G + G^{m - 1} \overset ?\ge n + G $$ knowing that $ m \ge \sqrt[n]G \ge 1 $ by AM-GM. As $ nx^G + G^{x - 1} $ is strictly increasing in $ x $, it suffices to treat the boundary case $ m = \sqrt[n]G $, i.e. to show that $$ nm^{m^n} + m^{n(m - 1)} \ge n + m^n $$ for all $ m \ge 1 $. However the derivative of $ nx^{x^n} + x^{n(x - 1)} - x^n $ is $$ nx^{n - 1}\left(x^{x^n}\left(n\ln x + 1\right) + x^{n(x - 2)}(x\ln x + x - 1) - 1\right) $$ and is negative on $ ]0, 1[ $ and positive on $ ]1, \infty[ $ (because the expression in parentheses vanishes at $ x = 1 $ and is strictly increasing). Hence $$ nm^{m^n} + m^{n(m - 1)} - m^n \ge n\cdot 1^{1^n} + 1^{n(1 - 1)} - 1^n = n $$
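The final one-variable inequality can also be checked numerically (a small sketch; the grid is an arbitrary choice):

```python
import numpy as np

# Scan g(m) = n*m^(m^n) + m^(n(m-1)) - n - m^n over m >= 1; the argument
# above says g >= 0 with the minimum 0 attained at m = 1.
for n in (2, 3, 5):
    m = np.linspace(1.0, 1.5, 2001)
    g = n * m ** (m ** n) + m ** (n * (m - 1)) - n - m ** n
    print(n, g.min())  # expected: 0.0, attained at m = 1
```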
The case $ G < 1 $ is substantially harder because you can no longer rely on Jensen. However, the Tangent Line Trick might do the job. I'll update my answer should I get anywhere.
EDIT: Looking at the function $ f : x \mapsto e^{Gx} + \frac{G^{e^x - 1}}n $ is much more relevant, because your inequality becomes $$ f(a_1) + \dots + f(a_n) \ge n + G $$ whenever $ a_1 + \dots + a_n = \ln G $ (by setting $ x_i = e^{a_i} $). If $ f $ had exactly one inflexion point, a brutal olympiad technique called $ n - 1 $ EV would imply that the minimum value of $ f(a_1) + \dots + f(a_n) $ is reached when $ n - 1 $ of the $ a_i $ are equal. However $ f $ has either $ 0 $ inflexion points (in which case $ f $ is convex and the same Jensen trick concludes) or $ 2 $. The technique is adaptable and leaves a simpler inequality to prove:
Because it will allow us to wipe out terms more easily, look at the continuous version:
For all $ k $, $ \lambda_1, \dots, \lambda_k > 0 $ and $ a_1, \dots, a_k \in \mathbb R $ with $ \lambda_1 + \dots + \lambda_k = n $ and $ \lambda_1a_1 + \dots + \lambda_ka_k = \ln G $, we have $$ \lambda_1f(a_1) + \dots + \lambda_kf(a_k) \ge n + G $$
First establish the following lemma:
If $ \lambda_1f(a_1) + \dots + \lambda_kf(a_k) $ is minimal, then $ f'(a_1) = \dots = f'(a_k) $ and $ f''(a_1), \dots, f''(a_k) \ge 0 $.
Proof
$ \bullet $ Suppose that $ f'(a_i) \ne f'(a_j) $; we may assume $ \lambda_i = \lambda_j $ (by splitting $ \max(\lambda_i, \lambda_j) $ into two weights if needed). Then we can replace $ a_i, a_j $ by $ a_i + x, a_j - x $. This doesn't change $ \lambda_1a_1 + \dots + \lambda_ka_k $, and a Taylor expansion gives $$ f(a_i + x) + f(a_j - x) - f(a_i) - f(a_j) \underset{x \rightarrow 0}\sim x(f'(a_i) - f'(a_j)) $$ In particular, we can choose $ x $ to make this difference negative, which shows that we were not at a minimum.
$ \bullet $ Suppose that $ f''(a_i) < 0 $. Then we can replace $ a_i $ by the two points $ a_i - x $ and $ a_i + x $, each with weight $ \frac{\lambda_i}2 $. This doesn't change $ \lambda_1a_1 + \dots + \lambda_ka_k $, and a Taylor expansion gives $$ f(a_i + x) + f(a_i - x) - 2f(a_i) \underset{x \rightarrow 0}\sim x^2 f''(a_i) < 0 $$ That shows we were not at a minimum.
Then this lemma:
If $ \lambda_1f(a_1) + \dots + \lambda_kf(a_k) $ is minimal, then $ |\{a_1, \dots, a_k\}| \le 2 $. That is, we can assume that $ k = 2 $.
Proof: $ f $ has at most $ 2 $ inflexion points, which means it has at most $ 2 $ maximal intervals of convexity. On each of these, $ f'' > 0 $, which implies that $ f' $ is injective there. As the previous lemma says that all the $ f'(a_i) $ must be equal (and all the $ f''(a_i) $ nonnegative), there is room for at most one distinct $ a_i $ in each convex part of $ f $.
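The claim that $ f $ has either $ 0 $ or $ 2 $ inflexion points can be probed numerically by counting sign changes of $ f'' $ on a grid (a sketch only; the grid bounds and parameter choices are arbitrary, and this is evidence rather than a proof):

```python
import numpy as np

def inflexion_count(n, G, lo=-20.0, hi=4.0, pts=400_001):
    """Count sign changes of f'' for f(x) = e^(Gx) + G^(e^x - 1)/n."""
    x = np.linspace(lo, hi, pts)
    lnG = np.log(G)
    ex = np.exp(x)
    # f''(x) = G^2 e^(Gx) + (ln G / n) * G^(e^x - 1) * e^x * (e^x ln G + 1)
    f2 = G**2 * np.exp(G * x) + (lnG / n) * G ** (ex - 1) * ex * (ex * lnG + 1)
    return int(np.count_nonzero(np.diff(np.sign(f2))))

for n in (2, 3, 5):
    for G in (0.1, 0.5, 0.9, 1.5):
        print(n, G, inflexion_count(n, G))  # expected: 0 or 2
```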
Thus we can restrict ourselves to the case $ k = 2 $, needing to prove $$ \lambda\left(e^{Ga} + \frac{G^{e^a - 1}}n\right) + (n - \lambda)\left(e^{G\frac{\ln G - \lambda a}{n - \lambda}} + \frac{G^{e^{\frac{\ln G - \lambda a}{n - \lambda}} - 1}}n\right) \ge n + G $$ for all $ a $ and all $ \lambda \in\, ]0, n[ $.
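Here is a brute-force scan of this reduced inequality over a coarse grid (a sketch; all ranges are arbitrary choices, and passing it is evidence, not a proof):

```python
import numpy as np

def f(t, n, G):
    # f(x) = e^(Gx) + G^(e^x - 1)/n, as above
    return np.exp(G * t) + G ** (np.exp(t) - 1) / n

worst = np.inf
for n in (2, 3, 5):
    for G in (0.1, 0.5, 0.9):                      # the hard regime G < 1
        for lam in np.linspace(0.05, 0.95, 19) * n:
            for a in np.linspace(-3.0, 3.0, 121):
                # b is forced by the constraint lam*a + (n-lam)*b = ln(G)
                b = (np.log(G) - lam * a) / (n - lam)
                gap = lam * f(a, n, G) + (n - lam) * f(b, n, G) - n - G
                worst = min(worst, gap)
print(worst)  # a negative value would disprove the reduced inequality
```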
Answer by c-love-garlic
Assume that $G\geq 1$ is held fixed and that $\sum_{i=1}^{n}x_i\geq 2n$.
It is not hard to see that the following function is convex on $(0,\infty)$, as the sum of two convex functions: $$f(x)=nGx^G+G^x$$
So we can apply Jensen's inequality:
$$\sum_{i=1}^n ( nGx_i^{G} + G^{x_i}) \ge n^2Ga^{G} + nG^{a}$$
where $a=\frac{\sum_{i=1}^{n}x_i}{n}$.
But with these assumptions we have $a\geq 2$, hence $a^G\geq 2^G$ and $G^a\geq G^2$.
So: $$\sum_{i=1}^n ( nGx_i^{G} + G^{x_i}) \ge n^2G\cdot 2^{G} + nG^{2}> n^2G+G^2n$$
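A quick grid check of the inequality that remains after Jensen (a sketch; the grid ranges are arbitrary):

```python
import numpy as np

# Check n^2*G*a^G + n*G^a >= n^2*G + n*G^2 for a >= 2 and G >= 1,
# which is what the last two steps establish.
for n in (2, 3, 5):
    a = np.linspace(2.0, 6.0, 201)[:, None]     # arithmetic mean a >= 2
    G = np.linspace(1.0, 5.0, 201)[None, :]     # G >= 1
    gap = n**2 * G * a**G + n * G**a - n**2 * G - n * G**2
    print(n, gap.min())  # expected: strictly positive
```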
Update: the case $x_i\leq 1$ for all $i$:
This is an observation by River Li. Here is the quote.
I didn't find a counterexample. By the way, for $x_i\le 1, \forall i$, I have a proof as follows. By AM-GM, we have $$\sum x_i^G \ge n (x_1x_2\cdots x_n)^{G/n} = nG^{G/n} = n \mathrm{e}^{(G\ln G)/n} \ge n (1 + (G\ln G)/n)$$ and $$\sum G^{x_i} \ge n G^{(x_1+x_2+\cdots + x_n)/n} \ge nG\,.$$ It suffices to prove that $$nG \cdot n (1 + (G\ln G)/n) + nG \ge n^2G + G^2n$$ or $$1 - G + G\ln G \ge 0$$ which is true.
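Both ingredients of this argument are easy to confirm numerically (a sketch; the sampling choices are arbitrary):

```python
import math
import random

# Key scalar fact: 1 - G + G*ln(G) >= 0 on (0, 1].
assert all(1 - g + g * math.log(g) >= 0
           for g in (k / 1000 for k in range(1, 1001)))

# Random check of the original inequality when all x_i <= 1.
def min_gap_small(n, trials=50_000):
    worst = float("inf")
    for _ in range(trials):
        x = [random.uniform(0.01, 1.0) for _ in range(n)]
        G = math.prod(x)
        lhs = sum(n * G * xi**G + G**xi for xi in x)
        worst = min(worst, lhs - (n**2 * G + G**2 * n))
    return worst

print(min_gap_small(4))  # expected: nonnegative
```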
Update: the case $G\leq 1$ and $n=2k+1$ (odd $n$):
Put $x_i=y_i^{\frac{G+1}{G}}$, where $y_1\leq y_2\leq\cdots\leq y_n$ is an arithmetic progression with common difference $\epsilon>0$ (that is, $y_{i+1}-y_i=\epsilon$) and middle term $y_{\frac{n+1}{2}}=1$, so that $y_1+y_n=2$.
We have for the LHS:
$$\sum_{i=1}^{n}(nG(y_i)^{G+1}+G^{x_i})$$
Here I use the Hermite-Hadamard inequality.
The following functions are convex on $(0,\infty)$ (with the notation of the OP):
$$h(x)=nGx^{G+1}\quad r(x)=G^x$$
We have $x_n\geq x_{n-1}\geq \cdots\geq x_2\geq x_1$ and $y_n\geq y_{n-1}\geq \cdots\geq y_2\geq y_1$, with $y_n\geq 1$. Applying the right-hand Hermite-Hadamard bound $\frac{1}{b-a}\int_a^b h(x)dx\leq\frac{h(a)+h(b)}{2}$ to each consecutive pair $(y_i,y_{i+1})$ and to the pair $(y_1,y_n)$, then summing (each $h(y_i)$ appears with total weight $1$), we get:
$$\sum_{i=1}^{n}nG(y_i)^{G+1}\geq \frac{1}{(y_2-y_1)}\int_{y_1}^{y_2}h(x)dx+\frac{1}{(y_3-y_2)}\int_{y_2}^{y_3}h(x)dx+\cdots+\frac{1}{(y_n-y_{n-1})}\int_{y_{n-1}}^{y_n}h(x)dx+\frac{1}{(y_n-y_1)}\int_{y_1}^{y_n}h(x)dx$$
Since the spacing is constant, $\frac{1}{y_{i+1}-y_i}=\frac{1}{y_2-y_1}$ for all $i$; summing the consecutive terms and using the additivity of integration on adjacent intervals, the right-hand side becomes:
$$\Bigg(\frac{1}{(y_2-y_1)}\int_{y_1}^{y_n}h(x)dx+\frac{1}{(y_n-y_1)}\int_{y_1}^{y_n}h(x)dx\Bigg)$$
A primitive of $h(x)$ is:
$$H(x)=nG\frac{x^{G+2}}{G+2}$$
So:
$$\Bigg(\frac{1}{(y_2-y_1)}\int_{ y_1}^{ y_n}h(x)dx+\frac{1}{(y_n-y_1)}\int_{y_1}^{y_n}h(x)dx\Bigg)=\frac{nG}{(y_2-y_1)}\Bigg(\frac{(y_n)^{G+2}}{G+2}-\frac{(y_1)^{G+2}}{G+2}\Bigg)+\frac{nG}{(y_n-y_1)}\Bigg(\frac{(y_n)^{G+2}}{G+2}-\frac{(y_1)^{G+2}}{G+2}\Bigg)$$
Now, by the Hermite-Hadamard inequality (the left-hand bound, applied to $x\mapsto x^{G+1}$) and since $y_1+y_n=2$: $$\frac{\frac{(y_n)^{G+2}}{G+2}-\frac{(y_1)^{G+2}}{G+2}}{y_n-y_1}\geq\Big(\frac{y_n+y_1}{2}\Big)^{G+1}= 1$$
And as $y_{i+1}-y_i=\epsilon$, we have $y_2-y_1=\epsilon$ and $y_n-y_1=(n-1)\epsilon$, so $\frac{nG}{y_2-y_1}+\frac{nG}{y_n-y_1}=\frac{n^2G}{y_n-y_1}$ and we get:
$$\frac{nG}{(y_2-y_1)}\Bigg(\frac{(y_n)^{G+2}}{G+2}-\frac{(y_1)^{G+2}}{G+2}\Bigg)+\frac{nG}{(y_n-y_1)}\Bigg(\frac{(y_n)^{G+2}}{G+2}-\frac{(y_1)^{G+2}}{G+2}\Bigg)= \frac{n^2G}{y_n-y_1}\Bigg(\frac{(y_n)^{G+2}}{G+2}-\frac{(y_1)^{G+2}}{G+2}\Bigg)\geq n^2G $$
where the last step uses the bound above.
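This half of the argument can be confirmed numerically for arithmetic progressions centered at $1$ (a sketch; the parameter choices are arbitrary):

```python
import numpy as np

# Check sum_i n*G*y_i^(G+1) >= n^2*G for y_i in arithmetic progression
# with middle term 1 (the setup above), odd n, and G <= 1.
for n in (3, 5, 7):
    k = (n - 1) // 2
    for G in (0.1, 0.5, 0.9):
        for eps in (0.01, 0.05, 0.1):
            y = 1.0 + eps * np.arange(-k, k + 1)  # y_{(n+1)/2} = 1
            lhs = n * G * np.sum(y ** (G + 1))
            assert lhs >= n**2 * G, (n, G, eps)
print("all checks passed")
```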
On the other hand, Jensen's inequality gives:
$$\sum_{i=1}^{n}G^{x_i}\geq nG^{\frac{\sum_{i=1}^{n}x_i}{n}}$$
Assuming that $\sum_{i=1}^{n}x_i\leq 2n$, we have $\frac{\sum_{i=1}^{n}x_i}{n}\leq 2$, and since $G\leq 1$: $$\sum_{i=1}^{n}G^{x_i}\geq nG^{\frac{\sum_{i=1}^{n}x_i}{n}}\geq nG^2$$
Summing the two results, we get the desired inequality.
Hope it helps!
Update:
We can apply the same reasoning with $x_i=y_i^{\frac{G+\alpha}{G}}$ instead of $x_i=y_i^{\frac{G+1}{G}}$, for $\alpha> 1-G$ or $\alpha<-G$; this generalizes the proof considerably. The proof also remains valid whenever $y_n+y_1\geq 2$, so the restriction $y_{\frac{n+1}{2}}=1$ can be dropped.