The expectation of the norm of standard Gaussian random vectors

I'm trying to prove that $f(n) = \mathbb{E} \|g_n\|_2 -\sqrt{n}$ is an increasing function of $n$, where $g_n \sim N(0, I_n)$, i.e. $g_n$ is a standard normal random vector in $\mathbb{R}^n$.

This is the final step in proving $\mathbb{E}\, s_n(A) \geq \sqrt{m}-\sqrt{n}$ for an $m \times n$ random matrix $A$ with $N(0, 1)$ entries, which is known as a consequence of Gordon's comparison inequality. It is taken from Exercise 7.3.4 of Roman Vershynin's "High-Dimensional Probability," where the author states that $f(n)$ can be shown to be increasing by a tedious calculation. Martin J. Wainwright's "High-Dimensional Statistics" also states that this fact can be derived by direct computation. But neither book describes it further.

I figured

\begin{equation} \mathbb{E} \|g_n\|_2 = \sqrt{2} \frac{\Gamma(\frac{n+1}{2})}{\Gamma (\frac{n}{2})} \end{equation}

is the explicit form of the expectation. But I'm not sure how to go further from here. Any help is appreciated very much!
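For reference, the formula above comes from the chi distribution with $n$ degrees of freedom: $$ \mathbb{E} \|g_n\|_2 = \frac{2^{1-n/2}}{\Gamma(\frac{n}{2})} \int_0^\infty r^{n} e^{-r^2/2}\, dr = \frac{2^{1-n/2}}{\Gamma(\frac{n}{2})} \cdot 2^{\frac{n-1}{2}}\, \Gamma\left(\frac{n+1}{2}\right) = \sqrt{2}\, \frac{\Gamma(\frac{n+1}{2})}{\Gamma(\frac{n}{2})}, $$ so the only remaining issue is the monotonicity of $f$.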


After working on it for a while, I've solved this problem with an elementary calculation, which is indeed "tedious," as Vershynin described.

We use Kershaw's inequality for Gamma functions, whose proof is purely elementary and elegant.

Other than this inequality, the rest of the work is done through simple, direct calculation.

It suffices to prove:

$$ f(n+1) - f(n) = \sqrt{2}\left(\frac{\Gamma(\frac{n+2}{2})}{\Gamma(\frac{n+1}{2})}-\frac{\Gamma(\frac{n+1}{2})}{\Gamma(\frac{n}{2})}\right) -(\sqrt{n+1}-\sqrt{n}) \geq 0 \\ \iff \sqrt{2}\left(\frac{\Gamma(\frac{n+2}{2})}{\Gamma(\frac{n+1}{2})}-\frac{\Gamma(\frac{n+1}{2})}{\Gamma(\frac{n}{2})}\right) \geq \sqrt{n+1}-\sqrt{n} $$

Applying $\Gamma(x+1) = x\Gamma(x)$ a few times to the left-hand side, we get $$ LHS = \sqrt{2}\left(\frac{n+1}{n+2}\frac{\Gamma(\frac{n+4}{2})}{\Gamma(\frac{n+3}{2})}-\frac{n}{2}\frac{n+2}{n+1}\frac{\Gamma(\frac{n+3}{2})}{\Gamma(\frac{n+4}{2})}\right) $$

Here is the form of Kershaw's inequality we shall use: $$ \frac{\Gamma(x+1)}{\Gamma(x+s)} > \left(x+\frac{s}{2}\right)^{1-s}\quad \text{for } x>0 \text{ and } 0<s<1. $$

Applying the inequality with $x=\frac{n+2}{2}$ and $s=1/2$, we get $$ \frac{\Gamma(\frac{n+4}{2})}{\Gamma(\frac{n+3}{2})} > \sqrt{\frac{n+5/2}{2}} $$

From this, we can lower bound the LHS as: $$\begin{aligned} LHS &> \sqrt{2}\left(\frac{n+1}{n+2}\sqrt{\frac{n+5/2}{2}}-\frac{n}{2}\frac{n+2}{n+1}\sqrt{\frac{2}{n+5/2}}\right) \\ &= \frac{n^2+4n+5}{2(n+1)(n+2)\sqrt{n+5/2}} \end{aligned}$$
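(If you want to double-check this simplification without redoing the algebra by hand, the following Mathematica sanity check should return 0; it is of course not part of the elementary proof.)

    FullSimplify[
      Sqrt[2] ((n + 1)/(n + 2) Sqrt[(n + 5/2)/2] -
          n/2 (n + 2)/(n + 1) Sqrt[2/(n + 5/2)]) -
        (n^2 + 4 n + 5)/(2 (n + 1) (n + 2) Sqrt[n + 5/2]),
      Assumptions -> n >= 1]  (* should simplify to 0 *)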

So, it suffices to prove $$ \frac{n^2+4n+5}{2(n+1)(n+2)\sqrt{n+5/2}} > \sqrt{n+1} - \sqrt{n} $$

Squaring both sides, simplifying, and then squaring again to remove the remaining root, this reduces to $$ 31n^8+392n^7+2112n^6+6272n^5+11042n^4+11416n^3+6224n^2+1120n-225>0, $$ which holds for all real $n\geq 1$: every coefficient except the constant term is positive, and already $1120n > 225$ for $n \geq 1$. The last calculation was indeed tedious.
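(For the final positivity claim, one can also let Mathematica do the quantifier elimination; either of the following should return True. The second checks the inequality directly, before any squaring.)

    (* positivity of the expanded polynomial for all real n >= 1 *)
    Resolve[ForAll[n, n >= 1,
      31 n^8 + 392 n^7 + 2112 n^6 + 6272 n^5 + 11042 n^4 +
        11416 n^3 + 6224 n^2 + 1120 n - 225 > 0], Reals]

    (* the inequality of the previous display, checked directly *)
    Resolve[ForAll[n, n >= 1,
      (n^2 + 4 n + 5)/(2 (n + 1) (n + 2) Sqrt[n + 5/2]) >
        Sqrt[n + 1] - Sqrt[n]], Reals]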


At this point, I would just brute-force the inequality. Define $$ f(x)=\frac{\Gamma(x+1/2)}{\Gamma(x)}. $$ To establish the result, it suffices to prove that for $x\ge 1$, $$ f'(x)\ge \frac{1}{2\sqrt{x}}. $$ In fact, if we denote $\psi(x)=\Gamma'(x)/\Gamma(x)$, we have $$ f'(x)=f(x)\left(\psi\left(x+\frac{1}{2}\right)-\psi(x)\right). $$ Now we can use various inequalities ([1], [2]) to get $$ \psi\left(x+\frac{1}{2}\right)-\psi(x)\ge \frac{1-e^{-\gamma}}{x}+O(x^{-2}) $$ and $$ f(x)\ge \sqrt{x}+O(x^{-1/2}), $$ which establishes the claim, at least for $x$ large enough (to get it in general, we either check the remaining cases by hand or try to be more precise in our estimates).
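A quick numerical sanity check of the target inequality $f'(x)\ge \frac{1}{2\sqrt{x}}$ (evidence only, not a proof) is to plot the difference, using $f'(x)=f(x)\left(\psi(x+\frac{1}{2})-\psi(x)\right)$; it appears to stay positive for $x\ge 1$:

    (* f'(x) - 1/(2 Sqrt[x]) with f(x) = Gamma[x + 1/2]/Gamma[x] *)
    Plot[Gamma[x + 1/2]/Gamma[x] (PolyGamma[x + 1/2] - PolyGamma[x]) - 1/(2 Sqrt[x]), {x, 1, 10}]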


Combining Gautschi's inequality and Alzer's inequality naively gives $f'(x)>\frac{\sqrt{x-\frac{1}{2}}}{2x}$, which is not greater than or equal to $\frac{1}{2\sqrt{x}}$. But using $\psi(x+\frac{1}{2})-\psi(x) \geq \frac{1-e^{-\gamma}}{x} + O(x^{-2})$ as md5 did would not quite be enough either, even for large $x$, because $1-e^{-\gamma} \approx 0.438541$ is strictly less than $1/2$.

Plotting $\frac{\Gamma(x+\frac{1}{2})}{\Gamma(x)} - \sqrt{x}$ suggests that it is increasing at least for $x>0.4$ or so. A Mathematica command to plot it:

    Plot[Gamma[x+1/2]/Gamma[x] - Sqrt[x],{x,0,5}]

For your purpose it would suffice to prove $\frac{\Gamma(x+\frac{1}{2})}{\Gamma(x)} - \sqrt{x}$ is increasing for $x \in [1,\infty)$. You already know the derivative equals $\exp\left(\int_x^{x+1/2} \psi(y)\, dy\right) \cdot \left(\psi\left(x+\frac{1}{2}\right)-\psi(x)\right)-\frac{1}{2x^{1/2}}$, so it would suffice to prove that $\psi\left(x+\frac{1}{2}\right)-\psi(x)>\frac{1}{2 x^{1/2}} \cdot \frac{\Gamma(x)}{\Gamma(x+\frac{1}{2})}$. By Kershaw's improvement of Gautschi's inequality

    https://en.wikipedia.org/wiki/Gautschi%27s_inequality 

we have $\frac{\Gamma(x)}{\Gamma(x+\frac{1}{2})}<\exp\left(-\frac{1}{2} \psi\left(x+\frac{\sqrt{2}-1}{2}\right)\right)$ when $x>\frac{1}{2}$. So it suffices to prove that $\psi\left(x+\frac{1}{2}\right)-\psi(x)>\frac{1}{2x^{1/2}}\cdot\exp\left(-\frac{1}{2} \psi\left(x+\frac{\sqrt{2}-1}{2}\right)\right)$. That inequality can be verified numerically by plotting; for example, in Mathematica you can see that the difference is positive. (It has an oscillation somewhere in $x \in (0,1)$, but it stays positive.)

    Plot[PolyGamma[x+1/2]-PolyGamma[x]-1/(2Sqrt[x])*Exp[-(1/2)PolyGamma[x+(Sqrt[2]-1)/2]],{x,0,2}]

By the bounds of Elezović, Giordano, and Pečarić (what is needed here is $\psi(y) > \ln(y+\tfrac{1}{2}) - \tfrac{1}{y}$ for $y>0$) we have $\exp\left(-\frac{1}{2} \psi\left(x+\frac{\sqrt{2}-1}{2}\right)\right)\leq \frac{1}{\sqrt{x+\frac{1}{\sqrt{2}}}} \exp(1/(2x+\sqrt{2}-1))$.

    https://en.wikipedia.org/wiki/Digamma_function#Inequalities

So it suffices to prove that $\psi\left(x+\frac{1}{2}\right)-\psi(x)>\frac{1}{2\sqrt{x(x+\frac{1}{\sqrt{2}})}}\, \exp(1/(2x+\sqrt{2}-1))$. Using the integral formula for the digamma function we have $\psi\left(x+\frac{1}{2}\right)-\psi(x) = \frac{1}{x} \int\limits_0^\infty \frac{1}{1+e^{-t/(2x)}}\, e^{-t}\, dt$.

    https://en.wikipedia.org/wiki/Digamma_function#Integral_representations 
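(A quick spot check of that integral representation, say at $x=2$, where both sides should come out to about $0.2804$:)

    With[{x = 2.}, {PolyGamma[x + 1/2] - PolyGamma[x],
      1/x NIntegrate[Exp[-t]/(1 + Exp[-t/(2 x)]), {t, 0, Infinity}]}]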

So it would suffice to prove $\int\limits_0^\infty \frac{1}{1+e^{-t/(2x)}}\, e^{-t}\, dt > \frac{1}{2\sqrt{1+\frac{1}{x\sqrt{2}}}} \exp(1/(2x+\sqrt{2}-1))$ for all $x>1$. Or equivalently, taking $y=1/x$, it would suffice to prove $\int\limits_0^\infty \frac{1}{1+e^{-yt/2}}\, e^{-t}\, dt > \frac{1}{2\sqrt{1+\frac{y}{\sqrt{2}}}} \exp\left(\frac{y}{2+(\sqrt{2}-1)y}\right)$ for all $y \in (0,1)$. Making the change of variables $z=y/(4-y)$, this inequality is equivalent to $\int\limits_0^{\infty} \frac{\exp(-t)}{\cosh(tz)}\, dt>\frac{1}{\sqrt{(1+z)(1+(2\sqrt{2}+1)z)}}\, \exp\left(\frac{2z}{1+(2\sqrt{2}-1)z}\right)$ for $z \in (0,\frac{1}{3})$. (The intermediate step: write $\frac{1}{1+e^{-yt/2}}=\frac{e^{yt/4}}{2\cosh(yt/4)}$ and then substitute $u=(1-\frac{y}{4})t$ in the integral.) Mathematica's formula for the integral $\int_0^{\infty}\frac{e^{-t}}{\cosh(tz)}\, dt$ is expressed in terms of harmonic numbers at real argument, $H_z=\int_0^1 \frac{1-t^z}{1-t}\, dt$

    https://en.wikipedia.org/wiki/Harmonic_number#Harmonic_numbers_for_real_and_complex_values
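(One can also spot-check Mathematica's closed form against direct numerical integration, say at $z=1/5$, where both values should be about $0.9657$:)

    With[{z = 1/5}, {N[(-HarmonicNumber[1/4 (-3 + 1/z)] + HarmonicNumber[1/4 (-1 + 1/z)])/(2 z)],
      NIntegrate[Exp[-t]/Cosh[t z], {t, 0, Infinity}]}]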

Using this, we can check that the inequality is again verified numerically:

    In[4]:= A:=(-HarmonicNumber[1/4(-3+1/z)]+HarmonicNumber[(1/4)(-1+1/z)])/(2z)

    In[5]:= B := Exp[2z/(1+(2 Sqrt[2]-1)z)]/Sqrt[(1+z)(1+(2Sqrt[2]+1)z)]

    In[6]:= Plot[A-B,{z,0,5}]

Note that both the left-hand-side $A$ and the right-hand-side $B$ are equal to $1$ at $z=0$.

Numerically, by looking at the plot of the derivative of the right-hand side $B$, for $z \in (0,\frac{1}{3})$ the derivative attains its maximum at $z=1/3$, where it equals $-9(17\sqrt{2}-20)e^{\sqrt{2}-1}/(16 \sqrt{2} (2+\sqrt{2})^{3/2}) \approx -0.385584$. That fact might be provable, since the derivative is a rational function times an exponential (of a rational function). Note that the derivative is not monotone on that interval, but it has only 2 local extrema before the right endpoint and is largest at the right endpoint. The second derivative is a sextic polynomial times an exponential, divided by squares; Mathematica finds its 2 real roots and 4 complex roots numerically. So in principle, one may check numerically that the values of the first derivative at its local extrema are no greater than its value at the right endpoint.
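(For what it is worth, that claim about the derivative of $B$ can at least be eyeballed using the definition of $B$ from In[5]; this is purely numerical, so again not a proof.)

    Bp = D[B, z];  (* symbolic derivative of B *)
    Plot[Bp, {z, 0, 1/3}]
    N[Bp /. z -> 1/3]  (* value at the right endpoint, ~ -0.385584 *)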

Numerically, by looking at the plot of the left-hand side $A$, it is concave-down for $z \in (0,\frac{1}{3})$. Therefore its derivative is minimized at $z=1/3$, where its value is $-\frac{9}{2} H_{1/2} + \frac{27}{8} H^{(2)}_{1/2} \approx -0.36498$. Proving this is not something I know how to do directly. There are results about complete monotonicity of $\ln(z)-\frac{1}{2z} + \psi(z)$ which would imply that $\ln(1+\frac{1}{2z})-\frac{1}{2z+1}+\frac{1}{2z} + \psi(z+\frac{1}{2})-\psi(z)$ is concave-down. But that is not exactly the same as $A$.
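(Similarly, with $A$ from In[4], one can plot its derivative on $(0,\frac{1}{3})$ and compare its value at $z=1/3$ with the closed form quoted above.)

    Ap = D[A, z];  (* symbolic derivative of A *)
    Plot[Ap, {z, 0, 1/3}]
    {N[Ap /. z -> 1/3], N[-9/2 HarmonicNumber[1/2] + 27/8 HarmonicNumber[1/2, 2]]}  (* both ~ -0.36498 *)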

In any case, accepting these numerical facts, one sees $A\geq 1-0.36498\, z$ and $B\leq 1-0.385584\, z$ on the interval $(0,\frac{1}{3})$. So that would prove that $\frac{\Gamma(x+\frac{1}{2})}{\Gamma(x)} - \sqrt{x}$ is monotone increasing on all of $x \in [1,\infty)$. (It is difficult to quickly verify whether I made any errors. But my main point is that using Mathematica, and plotting as you go along to see whether you have lost too much in your inequalities, could lead to a valid proof, even if my argument is not valid due to an error somewhere.)