Convex functions in an integral inequality

Let $\mu,\sigma>0$ and define the function $f$ as follows: $$ f(x) = \frac{1}{\sigma\sqrt{2\pi}}\exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) $$ How can I show that $$ \int\limits_{-\infty}^\infty x\log|x|f(x)\mathrm dx\geq \underbrace{\left(\int\limits_{-\infty}^\infty x f(x)\mathrm dx\right)}_\mu\cdot\left(\int\limits_{-\infty}^\infty \log|x| f(x)\mathrm dx\right) $$ which is equivalent to $\mathsf E[ X\log|X|]\geq \underbrace{\mathsf EX}_\mu\cdot\mathsf E\log|X|$ for a random variable $X\sim\mathscr N(\mu,\sigma^2)$?
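
Before attempting a proof, here is a quick numerical sanity check of the inequality (not a proof). It compares both sides by quadrature for a few values of $\mu,\sigma>0$; only standard numpy/scipy calls are used, and the helper name `expect` and the sample parameter values are my own.

```python
# Quick numerical sanity check of the inequality (not a proof).  Only standard
# numpy/scipy calls are used; `expect` and the sample (mu, sigma) values are my own.
import numpy as np
from scipy import integrate
from scipy.stats import norm

def expect(g, mu, sigma):
    """E[g(X)] for X ~ N(mu, sigma^2), split at x = 0 where log|x| is singular."""
    f = norm(loc=mu, scale=sigma).pdf
    left = integrate.quad(lambda x: g(x) * f(x), -np.inf, 0.0, limit=200)[0]
    right = integrate.quad(lambda x: g(x) * f(x), 0.0, np.inf, limit=200)[0]
    return left + right

for mu, sigma in [(0.5, 1.0), (2.0, 0.3), (1.0, 3.0)]:
    lhs = expect(lambda x: x * np.log(abs(x)), mu, sigma)    # E[X log|X|]
    rhs = mu * expect(lambda x: np.log(abs(x)), mu, sigma)   # mu * E[log|X|]
    print(f"mu={mu}, sigma={sigma}:  E[X log|X|] = {lhs:.5f}  >=  mu E[log|X|] = {rhs:.5f}")
```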


Solution 1:

Below is a probabilistic and somewhat noncomputational proof.

In what follows, we drop the restriction to the normal distribution. Instead, we consider a mean-zero random variable $Z$ whose distribution is symmetric about zero and set $X = \mu + Z$ for $\mu \in \mathbb R$.

Claim: Let $X$ be described as above such that $\mathbb E X\log|X|$ is finite for every $\mu$. Then, for $\mu \geq 0$, $$ \mathbb E X \log |X| \geq \mu \mathbb E \log |X| \> $$ and for $\mu < 0$, $$\mathbb E X \log |X| \leq \mu \mathbb E \log |X| \>.$$

Proof. Since $X = \mu + Z$, we observe that $$ \mathbb E X \log |X| = \mu \mathbb E \log |X| + \mathbb E Z \log |\mu + Z| \>, $$ and so it suffices to analyze the second term on the right-hand side.

Define $$ f(\mu) := \mathbb E Z \log|\mu+Z| \>. $$

Then, by symmetry of $Z$, we have $$ f(-\mu) = \mathbb E Z \log|{-\mu}+Z| = \mathbb E Z \log|\mu-Z| = - \mathbb E \tilde Z \log|\mu + \tilde Z| = - f(\mu) \>, $$ where $\tilde Z = - Z$ has the same distribution as $Z$, which justifies the last equality. This shows that $f$ is odd as a function of $\mu$.

Now, for $\mu \neq 0$, $$ \frac{f(\mu) - f(-\mu)}{\mu} = \mathbb E \frac{Z}{\mu} \log \left|\frac{1+ Z/\mu}{1- Z/\mu}\right| \geq 0\>, $$ since $x \log\left|\frac{1+x}{1-x}\right| \geq 0$ for every $x$. Because $f$ is odd, the left-hand side equals $2f(\mu)/\mu$, from which we conclude that $f(\mu) \geq 0$ for all $\mu > 0$ (and, again by oddness, $f(\mu) \leq 0$ for all $\mu < 0$).

Thus, for $\mu > 0$, $\mu \mathbb E \log |X|$ is a lower bound on the quantity of interest and for $\mu < 0$, it is an upper bound.

NB. In the particular case of a normal distribution, $X \sim \mathcal N(\mu,\sigma^2)$ and $Z \sim \mathcal N(0,\sigma^2)$. The moment condition stated in the claim is satisfied.
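
Since the argument only uses the symmetry of $Z$, here is a small Monte Carlo illustration (my own check, not part of the answer) with a non-normal symmetric $Z$. The Laplace distribution, the seed, and the sample size are arbitrary choices; the snippet estimates $f(\mu)=\mathbb E\, Z\log|\mu+Z|$ from the proof and checks its sign for $\pm\mu$.

```python
# Monte Carlo illustration -- my own check, not part of the answer.  The Laplace
# distribution and the sample size are arbitrary; only numpy is used.
import numpy as np

rng = np.random.default_rng(0)
z = rng.laplace(loc=0.0, scale=1.0, size=2_000_000)   # symmetric about 0, non-normal

for mu in [0.25, 1.0, 4.0]:
    f_pos = np.mean(z * np.log(np.abs(mu + z)))    # estimate of f(+mu) = E Z log|mu + Z|
    f_neg = np.mean(z * np.log(np.abs(-mu + z)))   # estimate of f(-mu)
    print(f"mu = {mu}:  f(+mu) ~ {f_pos:+.4f} (expect >= 0),  f(-mu) ~ {f_neg:+.4f} (expect <= 0)")
```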

Solution 2:

Change the dummy variable $x\mapsto x+\mu$: $$ \begin{align} \int_{-\infty}^\infty x\log|x|f(x)\,\mathrm{d}x &=\int_{-\infty}^\infty(x+\mu)\log|x+\mu|f(x+\mu)\,\mathrm{d}x\\ &=\mu\int_{-\infty}^\infty\log|x+\mu|f(x+\mu)\,\mathrm{d}x\\ &\phantom{=}+\int_{-\infty}^\infty x\log|x+\mu|f(x+\mu)\,\mathrm{d}x\\ &=\mu\int_{-\infty}^\infty\log|x|f(x)\,\mathrm{d}x\\ &\phantom{=}+\int_{-\infty}^\infty x\log|x+\mu|f(x+\mu)\,\mathrm{d}x\tag{1} \end{align} $$ Next, define $$ \begin{align} \varphi(\mu) &=\int_{-\infty}^\infty x\log|x|f(x)\,\mathrm{d}x-\mu\int_{-\infty}^\infty \log|x|f(x)\,\mathrm{d}x\\ &=\int_{-\infty}^\infty x\log|x+\mu|f(x+\mu)\,\mathrm{d}x\\ &=\frac{1}{\sigma\sqrt{2\pi}}\int_{-\infty}^\infty x\log|x+\mu|\;e^{-\frac12(x/\sigma)^2}\,\mathrm{d}x\\ &=\frac{\sigma}{\sqrt{2\pi}}\int_{-\infty}^\infty x\log|x+\mu/\sigma|\;e^{-x^2/2}\,\mathrm{d}x\\ &=\frac{\sigma}{\sqrt{2\pi}}=\hspace{-11.5pt}\int_{-\infty}^\infty\frac{1}{x+\mu/\sigma}\;e^{-x^2/2}\,\mathrm{d}x\tag{2} \end{align} $$ where the substitution $x\mapsto\sigma x$ was used (the term $\log\sigma\int x\,e^{-x^2/2}\,\mathrm{d}x=0$ drops out), the last equality is an integration by parts, and the barred integral denotes a Cauchy principal value. We will show that $\varphi$ is an odd function of $\mu$ which is positive when $\mu>0$.

First note that $$ \begin{align} \varphi(\mu)+\varphi(-\mu) &=\frac{\sigma}{\sqrt{2\pi}} =\hspace{-11.5pt}\int_{-\infty}^\infty\frac{2x}{x^2-(\mu/\sigma)^2}\;e^{-x^2/2}\,\mathrm{d}x\\ &=0\tag{3} \end{align} $$ since the integrand in $(3)$ is an odd function times an even function, hence odd, and its principal value integral over $\mathbb R$ vanishes. Therefore, $\varphi$ is an odd function.

Furthermore, since $\varphi$ is odd, for $\mu>0$, $$ \begin{align} 2\varphi(\mu) &=\varphi(\mu)-\varphi(-\mu)\\ &=\frac{\sigma}{\sqrt{2\pi}}=\hspace{-11.5pt}\int_{-\infty}^\infty\frac{1}{x}\;\left(e^{-(x-\mu/\sigma)^2/2}-e^{-(x+\mu/\sigma)^2/2}\right)\,\mathrm{d}x\\ &=\frac{\sigma}{\sqrt{2\pi}}=\hspace{-11.5pt}\int_{-\infty}^\infty\frac{2}{x}\sinh\left(\frac{x\mu}{\sigma}\right)e^{-\frac12\left(x^2+(\mu/\sigma)^2\right)}\,\mathrm{d}x \tag{4} \end{align} $$ where the second line shifts the variable of integration by $\mp\mu/\sigma$ in the two terms coming from $(2)$. Since $\frac1x\sinh\left(\frac{x\mu}{\sigma}\right)>0$ for every $x\neq0$ when $\mu>0$, the integrand in $(4)$ is positive. Therefore, $\varphi(\mu)>0$ for $\mu>0$.

Thus, your inequality is true for $\mu>0$.
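
As a numerical cross-check of the representations $(2)$ and $(4)$ (again just a sanity check, not part of the proof), the sketch below compares three quantities that should coincide: the direct value of $\varphi(\mu)$, the principal-value integral in $(2)$ computed with the `cauchy` weight of `scipy.integrate.quad` on a truncated interval, and half of the integral in $(4)$. The values of $\mu$ and $\sigma$ and the truncation points are arbitrary choices of mine.

```python
# Numerical cross-check of (2) and (4) -- a sanity check, not part of the proof.
# mu, sigma and the truncation points are arbitrary choices.
import numpy as np
from scipy import integrate
from scipy.stats import norm

mu, sigma = 1.3, 0.7
a = mu / sigma  # the quantity mu/sigma appearing in (2) and (4)

# Direct value: phi(mu) = E[(X - mu) log|X|], split at the log singularity x = 0.
f = norm(loc=mu, scale=sigma).pdf
def g(x):
    return (x - mu) * np.log(abs(x)) * f(x)
direct = (integrate.quad(g, -np.inf, 0.0, limit=200)[0]
          + integrate.quad(g, 0.0, np.inf, limit=200)[0])

# (2): Cauchy principal value of e^{-x^2/2}/(x + a).  quad's 'cauchy' weight
# computes the principal value of func(x)/(x - wvar) on a finite interval,
# so truncate at +/-15 (the Gaussian is negligible there) and set wvar = -a.
pv = integrate.quad(lambda x: np.exp(-x**2 / 2.0), -15.0, 15.0,
                    weight='cauchy', wvar=-a)[0]
pv *= sigma / np.sqrt(2.0 * np.pi)

# (4): the integrand (2/x) sinh(a x) e^{-(x^2+a^2)/2} has a removable singularity
# at x = 0; truncate at +/-40, where the Gaussian factor has long since vanished.
def h(x):
    if x == 0.0:
        return 2.0 * a * np.exp(-a**2 / 2.0)
    return 2.0 * np.sinh(a * x) / x * np.exp(-(x**2 + a**2) / 2.0)
sinh_form = sigma / np.sqrt(2.0 * np.pi) * integrate.quad(h, -40.0, 40.0)[0]

# All three should agree, and all should be positive since mu > 0.
print(f"direct      : {direct:.6f}")
print(f"via (2), PV : {pv:.6f}")
print(f"via (4) / 2 : {sinh_form / 2.0:.6f}")
```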

Solution 3:

Let $\phi$ denote the density of $X-\mu$, i.e. the $\mathcal N(0,\sigma^2)$ density. Then $$ \int x\log|x|\,f(x)\,\mathrm{d}x-\mu\int \log|x|\,f(x)\,\mathrm{d}x =\int(x-\mu)\log|x|\,f(x)\,\mathrm{d}x =\int x\log|x+\mu|\,\phi(x)\,\mathrm{d}x =\int_0^{\infty} x\log\left|\frac{\mu+x}{\mu-x}\right|\phi(x)\,\mathrm{d}x, $$ where the second equality substitutes $x\mapsto x+\mu$ and the last folds the integral over $x<0$ onto $x>0$ using the symmetry of $\phi$. For $\mu>0$ the integrand is positive, which gives the inequality. All that is used is that $\phi$ is symmetric.
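
A quick numerical check of the folding step (my own sketch; $\mu$, $\sigma$ and the helper names are arbitrary): the two-sided integral $\int x\log|x+\mu|\,\phi(x)\,\mathrm{d}x$ and the folded one-sided integral should agree, and both are positive for $\mu>0$.

```python
# Quick check of the folding step -- my own sanity check; mu, sigma are arbitrary.
import numpy as np
from scipy import integrate
from scipy.stats import norm

mu, sigma = 0.8, 1.5
phi = norm(loc=0.0, scale=sigma).pdf   # density of X - mu, symmetric about 0

def two_sided(x):
    # integrand of the two-sided form; log singularity at x = -mu
    return x * np.log(abs(x + mu)) * phi(x)

def folded(x):
    # integrand of the folded, one-sided form; log singularity at x = mu
    return x * np.log(abs((mu + x) / (mu - x))) * phi(x)

lhs = (integrate.quad(two_sided, -np.inf, -mu, limit=200)[0]
       + integrate.quad(two_sided, -mu, np.inf, limit=200)[0])
rhs = (integrate.quad(folded, 0.0, mu, limit=200)[0]
       + integrate.quad(folded, mu, np.inf, limit=200)[0])

print(lhs, rhs)  # should agree, and both are positive for mu > 0
```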