Proof (without use of differential calculus) that $e^{\sqrt{x}}$ is convex on $[1,+\infty)$.

Solution 1:

To show convexity it is enough to prove the existence of a supporting line at each point, i.e. $$e^{\sqrt{x+t}}\geq e^{\sqrt x}+t\cdot \tfrac{1}{2\sqrt x}e^{\sqrt x}\tag{*}$$ whenever $x\geq 1$ and $x+t\geq 1.$
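As a quick numerical sanity check of (*) — not part of the proof, and the helper names `lhs` and `rhs` are just for illustration — one can sample the inequality on a grid:

```python
import math

def lhs(x, t):
    return math.exp(math.sqrt(x + t))

def rhs(x, t):
    # supporting-line expression e^{sqrt(x)} * (1 + t/(2*sqrt(x)))
    return math.exp(math.sqrt(x)) * (1 + t / (2 * math.sqrt(x)))

# check (*) on a small grid with x >= 1 and x + t >= 1
for x in [1.0, 1.5, 2.0, 5.0, 10.0]:
    for t in [-0.5, -0.1, 0.0, 0.1, 1.0, 10.0]:
        if x + t >= 1:
            assert lhs(x, t) >= rhs(x, t) - 1e-12, (x, t)
print("(*) holds at all sampled points")
```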

I will just recall here how your definition of convexity follows from (*), to show that it does not use calculus. Given $y,z\geq 1$ and $0\leq \lambda\leq 1,$ set $x=\lambda y+(1-\lambda)z.$ Then (*) gives

\begin{align*} e^{\sqrt y}&\geq e^{\sqrt{x}} + (y-x)\cdot \frac{1}{2\sqrt x}e^{\sqrt x}\\ e^{\sqrt z}&\geq e^{\sqrt{x}} + (z-x)\cdot \frac{1}{2\sqrt x}e^{\sqrt x}\\ &\implies\\ \lambda e^{\sqrt y}+(1-\lambda)e^{\sqrt z}&\geq e^{\sqrt{x}} + (\lambda(y-x)+(1-\lambda)(z-x))\cdot \frac{1}{2\sqrt x}e^{\sqrt x}\\ &= e^{\sqrt{x}}. \end{align*}
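The resulting convexity inequality can also be spot-checked directly (again only an illustrative sketch, with a small tolerance for floating-point error):

```python
import math

def f(x):
    return math.exp(math.sqrt(x))

# verify f(lam*y + (1-lam)*z) <= lam*f(y) + (1-lam)*f(z) for samples with y, z >= 1
for y in [1.0, 2.0, 4.0, 9.0]:
    for z in [1.0, 3.0, 16.0]:
        for lam in [0.0, 0.25, 0.5, 0.9, 1.0]:
            x = lam * y + (1 - lam) * z
            assert f(x) <= lam * f(y) + (1 - lam) * f(z) + 1e-12, (y, z, lam)
print("convexity inequality holds at all sampled points")
```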


It remains to show (*). For $t\leq 0$ we can use $$\sqrt{x+t}-\sqrt{x}=\frac{t}{\sqrt{x+t}+\sqrt{x}}\geq \frac{t}{2\sqrt x}$$ (the inequality holds because $t\leq 0$ while $\sqrt{x+t}+\sqrt{x}\leq 2\sqrt x$) to get $e^{\sqrt{x+t}-\sqrt x}\geq e^{t/(2\sqrt x)}\geq 1+\tfrac{t}{2\sqrt x};$ multiplying through by $e^{\sqrt x}$ gives (*).

For $t\geq 0$ we can find an $h\geq 0$ such that $1+\tfrac{t}{2\sqrt x}=e^h$ (possible because the left-hand side is at least $1$). Since $h\geq 0,$ every term of the exponential series is nonnegative, so $e^h\geq 1+h+h^2/2,$ and therefore

\begin{align*} x+t&=x+2\sqrt{x}(e^h-1)\\ &\geq x+2h\sqrt x+h^2\sqrt x\\ &\geq x+2h\sqrt x+h^2\\ &=(h+\sqrt x)^2. \end{align*} (This is where we need $x\geq 1.$) Taking square roots gives $\sqrt{x+t}\geq h+\sqrt x,$ and exponentiating gives $e^{\sqrt{x+t}}\geq e^{h+\sqrt x}=e^{\sqrt x}\left(1+\tfrac{t}{2\sqrt x}\right),$ which is (*).
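The chain of estimates for $t \geq 0$ can likewise be spot-checked numerically (a sketch only, not part of the proof):

```python
import math

# for t >= 0 and x >= 1, define h by 1 + t/(2*sqrt(x)) = e^h and
# check the key intermediate bound x + t >= (h + sqrt(x))**2,
# from which (*) follows by taking square roots and exponentiating
for x in [1.0, 1.2, 2.0, 7.0, 50.0]:
    for t in [0.0, 0.3, 1.0, 4.0, 100.0]:
        h = math.log(1 + t / (2 * math.sqrt(x)))
        assert x + t >= (h + math.sqrt(x)) ** 2 - 1e-9, (x, t)
print("intermediate bound holds at all sampled points")
```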

Solution 2:

I'm going to assume we are permitted to use differentiation to prove an intermediate result.

(This seems reasonable, as the exponential function is normally treated using differential calculus; but if it is not acceptable, then perhaps the intermediate result can be derived in some other way.)

Stop Press: In the addendum below, the inequality is proved without using differentiation.

Lemma If $\varphi(t) = [\log(1 + t)]^2 + 2\log(1 + t)$, then $\varphi(2t) < 2\varphi(t)$ for all $t > 0$.

Proof Sneakily differentiating: \begin{align*} \tfrac{1}{2}\varphi'(t) & = \frac{1 + \log(1 + t)}{1 + t}, \\ \tfrac{1}{2}\varphi''(t) & = -\frac{\log(1 + t)}{(1 + t)^2} < 0 \quad (t > 0), \end{align*} therefore $\varphi'$ is strictly decreasing on $[0, \infty)$. In particular, $\varphi'(t) > \varphi'(2t)$ for all $t > 0$.

Putting $\psi(t) = 2\varphi(t) - \varphi(2t)$, we have $\tfrac{1}{2}\psi'(t) = \varphi'(t) - \varphi'(2t) > 0$ for all $t > 0$, i.e. $\psi$ is strictly increasing. Since $\psi(0) = 0$, we have $\psi(t) > 0$ for all $t > 0$. $\square$
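The Lemma itself is easy to test numerically (a sanity check only, not a proof):

```python
import math

def phi(t):
    L = math.log(1 + t)
    return L * L + 2 * L

# check phi(2t) < 2*phi(t) for sampled t > 0
for t in [1e-4, 0.01, 0.5, 1.0, 10.0, 1e6]:
    assert phi(2 * t) < 2 * phi(t), t
print("lemma inequality holds at all sampled points")
```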

Because $f(x) = e^{\sqrt{x}}$ is a strictly increasing function from $[1, \infty)$ to $[e, \infty)$, its inverse $g$ is the strictly increasing function from $[e, \infty)$ to $[1, \infty)$ given by $g(y) = (\log y)^2$. The strict convexity of $f$ is equivalent to the strict concavity of $g$, so we prove the latter. Since $g$ is continuous, it is sufficient to prove strict midpoint concavity. That is, it is enough to prove: $$ g[(1 + t)y] - g(y) > g[(1 + 2t)y] - g[(1 + t)y] \quad (y \geqslant e, \ t > 0). $$ This simplifies to: $$ \log(1 + t)[\log(1 + t) + 2\log y] > [\log(1 + 2t) - \log(1 + t)][\log(1 + 2t) + \log(1 + t) + 2\log y]. $$

Suppose for the moment that this holds in the particular case $y = e$, $\log y = 1$. Then for arbitrary $y \geqslant e$ we have: $$ 2\log(1 + t) = \log\left[(1 + t)^2\right] > \log(1 + 2t), $$ whence: $$ \log(1 + t)(2\log y - 2) \geqslant [\log(1 + 2t) - \log(1 + t)](2\log y - 2), $$ and the required inequality therefore holds for all $y \geqslant e$.

We have thus reduced the desired inequality to one not involving $y$: $$ \log(1 + t)[\log(1 + t) + 2] > [\log(1 + 2t) - \log(1 + t)][\log(1 + 2t) + \log(1 + t) + 2], $$ which simplifies to: $$ 2[\log(1 + t)]^2 + 4\log(1 + t) > [\log(1 + 2t)]^2 + 2\log(1 + 2t). $$ The above Lemma now completes the proof. $\square$
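The midpoint-concavity inequality for $g(y) = (\log y)^2$ can be spot-checked as well (illustrative only):

```python
import math

def g(y):
    return math.log(y) ** 2

# check strict midpoint concavity of g on arithmetic triples
# y, (1+t)y, (1+2t)y with y >= e and t > 0
for y in [math.e, 3.0, 10.0, 1e4]:
    for t in [0.01, 0.5, 1.0, 10.0]:
        left = g((1 + t) * y) - g(y)
        right = g((1 + 2 * t) * y) - g((1 + t) * y)
        assert left > right, (y, t)
print("midpoint concavity holds at all sampled points")
```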

Addendum

Changing notation: let $j \colon \mathbb{R}_{\geqslant0} \to \mathbb{R}_{\geqslant0}$ be the strictly increasing continuous function given by $$ j(y) = (\log(1 + y) + 1)^2 - 1 \quad (y \geqslant 0). $$ This function $j$, which is denoted by $\varphi$ above, is the inverse of the strictly increasing continuous function $h \colon \mathbb{R}_{\geqslant0} \to \mathbb{R}_{\geqslant0}$ given by $$ h(x) = \frac{f(1 + x)}{e} - 1 = e^{\sqrt{1 + x} - 1} - 1 \quad (x \geqslant 0). $$

By the argument above, $f$ is strictly convex if and only if: \begin{equation} \label{3114933:eq:1}\tag{1} j(2y) < 2j(y) \quad (y > 0). \end{equation} If \eqref{3114933:eq:1} holds, then, for all $x > 0$, $$ 2x = 2j(h(x)) > j(2h(x)), $$ and applying the increasing function $h$ to both sides proves: \begin{equation} \label{3114933:eq:2}\tag{2} h(2x) > 2h(x) \quad (x > 0). \end{equation}

The necessity of \eqref{3114933:eq:2} is admittedly obvious. Its sufficiency seems less obvious, although it can probably be proved without going round the houses like this. Carrying on, anyway (so as not to edit my existing answer): the converse argument is formally identical. That is, if \eqref{3114933:eq:2} is satisfied, then, for all $y > 0$, $$ 2y = 2h(j(y)) < h(2j(y)), $$ and applying the increasing function $j$ to both sides proves \eqref{3114933:eq:1}. Thus, \eqref{3114933:eq:1} and \eqref{3114933:eq:2} are equivalent; so \eqref{3114933:eq:2} is another necessary and sufficient condition for the strict convexity of $f$. $\square$
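Both the inverse relation between $h$ and $j$ and condition (2) can be sanity-checked numerically (an illustrative sketch with ad-hoc tolerances, not part of the argument):

```python
import math

def j(y):
    return (math.log(1 + y) + 1) ** 2 - 1

def h(x):
    return math.exp(math.sqrt(1 + x) - 1) - 1

# j and h should be mutually inverse on [0, infinity)
for v in [0.0, 0.5, 2.0, 10.0, 100.0]:
    assert abs(j(h(v)) - v) < 1e-9 * (1 + v), v
    assert abs(h(j(v)) - v) < 1e-9 * (1 + v), v

# condition (2): h(2x) > 2*h(x) for x > 0
for x in [0.01, 0.5, 1.0, 10.0, 100.0]:
    assert h(2 * x) > 2 * h(x), x
print("inverse relation and condition (2) hold at sampled points")
```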


Perhaps this is tractable after all! (Have I made a silly mistake?)

By the foregoing, it is enough to prove that the function $$ \frac{h(x)}{x} = \frac{e^{\sqrt{1 + x} - 1} - 1}{x} \quad (x > 0) $$ is strictly increasing, for then $h(2x)/(2x) > h(x)/x$, which is \eqref{3114933:eq:2}. Substituting $u = \sqrt{1 + x}$, it is therefore enough to prove that the function $$ \psi(u) = \frac{e^{u - 1} - 1}{u^2 - 1} \quad (u > 1) $$ is strictly increasing. But: \begin{align*} e^{u - 1} - 1 & = (u - 1) + \frac{(u - 1)^2}{2} + (u - 1)^3\rho(u) \\ & = \frac{u^2 - 1}{2} + (u - 1)^3\rho(u), \end{align*} where $$ \rho(u) = \sum_{n=0}^\infty\frac{(u - 1)^n}{(n + 3)!} $$ is positive and strictly increasing for $u \geqslant 1$. Therefore: $$ \psi(u) = \frac{1}{2} + \frac{(u - 1)^2}{u + 1}\rho(u) \quad (u > 1), $$ whence it is enough to prove (without, of course, using differentiation!) that $\frac{(u - 1)^2}{u + 1}$ is strictly increasing for $u \geqslant 1$: the product of two positive strictly increasing functions is again strictly increasing. But if $a > b \geqslant 0$, then $$ \frac{a^2}{a + 2} - \frac{b^2}{b + 2} = \frac{(a - b)(ab + 2a + 2b)}{(a + 2)(b + 2)} > 0, $$ so, putting $a = u - 1$, the function $\frac{(u - 1)^2}{u + 1} = \frac{a^2}{a + 2}$ is strictly increasing, and this completes the proof.
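As a final numerical sanity check (illustrative only; `rho` truncates the series at an arbitrarily chosen 60 terms), one can confirm the series identity for $\psi$ and its monotonicity at sample points:

```python
import math

def psi(u):
    return (math.exp(u - 1) - 1) / (u * u - 1)

def rho(u, terms=60):
    # rho(u) = sum_{n >= 0} (u - 1)^n / (n + 3)!, truncated
    s, p, fact = 0.0, 1.0, 6.0  # fact starts at 3!
    for n in range(terms):
        s += p / fact
        p *= u - 1
        fact *= n + 4
    return s

us = [1.001, 1.1, 1.5, 2.0, 3.0, 5.0]
# series identity: psi(u) = 1/2 + (u-1)^2/(u+1) * rho(u)
for u in us:
    assert abs(psi(u) - (0.5 + (u - 1) ** 2 / (u + 1) * rho(u))) < 1e-9, u
# psi strictly increasing on the sample
for a, b in zip(us, us[1:]):
    assert psi(a) < psi(b)
print("series identity and monotonicity hold at sampled points")
```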