Show $\mathbb{E}[f(X)g(X)] \geq \mathbb{E}[f(X)]\mathbb{E}[g(X)]$ for $f,g$ bounded, nondecreasing

Let $X$ be a random variable and let $f,g$ be real-valued, nondecreasing, and bounded.

Show that $\mathbb{E}[f(X)g(X)]\geq \mathbb{E}[f(X)]\mathbb{E}[g(X)]$.

I'm having a hard time seeing where to start for some reason. Any hints?

I managed to write:

$\mathbb{E}[f(X)g(X)] = \frac{1}{4}\mathbb{E}[(f(X)+g(X))^2 - (f(X)-g(X))^2]$

But I'm not sure whether this is the right direction.


Solution 1:

Hint: Let $X_1,X_2$ be independent copies of $X$, and note that $$(g(X_1)-g(X_2))(f(X_1)-f(X_2))\geq 0.$$
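
The product is nonnegative because $f$ and $g$ are both nondecreasing, so the two factors always have the same sign. For completeness, here is how the hint finishes; taking expectations and expanding, using independence and the fact that $X_1,X_2$ are each distributed as $X$:
$$0 \le \mathbb{E}\big[(f(X_1)-f(X_2))(g(X_1)-g(X_2))\big] = 2\,\mathbb{E}[f(X)g(X)] - 2\,\mathbb{E}[f(X)]\,\mathbb{E}[g(X)],$$
since $\mathbb{E}[f(X_1)g(X_2)] = \mathbb{E}[f(X_1)]\,\mathbb{E}[g(X_2)] = \mathbb{E}[f(X)]\,\mathbb{E}[g(X)]$ by independence. Dividing by $2$ gives the claim; boundedness of $f$ and $g$ ensures all expectations are finite.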

Solution 2:

The fact itself is pretty intuitive: Two increasing transformations of a random variable are positively correlated.

First note that $X$ may be assumed to have a uniform distribution on $[0,1]$ (via the quantile transformation).
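
To spell out that reduction: if $F$ is the CDF of $X$ and $F^{-1}(u) = \inf\{x : F(x)\ge u\}$ its quantile function, then for $U$ uniform on $[0,1]$,
$$F^{-1}(U) \overset{d}{=} X,$$
and $f\circ F^{-1}$, $g\circ F^{-1}$ are again bounded and nondecreasing, so it suffices to prove the inequality with $U$ in place of $X$.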

Now let $f(x) = \mathbf{1}_{[x_0,1]}(x)$ for some $x_0\in[0,1)$. Since $$ h(x) = \frac{x-x_0}{1-x_0}\le x,\quad x\in[x_0,1], $$ and $g$ is nondecreasing, we get $$ \begin{aligned} E[f(X) g(X)] = E[\mathbf{1}_{X\ge x_0}\, g(X)] &\ge E[\mathbf{1}_{X\ge x_0}\, g(h(X))] \\ &= \int_{x_0}^1 g(h(x))\,dx = (1-x_0)\int_0^1 g(z)\, dz \\ &= P(X\ge x_0)\, E[g(X)] = E[f(X)]\, E[g(X)], \end{aligned} $$ where the second line uses the substitution $z = h(x)$, $dz = dx/(1-x_0)$.

Since $E[f(X)g(X)] - E[f(X)]E[g(X)]$ is linear in $f$ for fixed $g$, and every non-negative non-decreasing step function is a non-negative combination of indicators $\mathbf{1}_{[x_0,1]}$, the desired inequality holds for all such step functions $f$. For a constant $f$ we have equality, so adding a constant shows the inequality holds for non-decreasing step functions $f$ of any sign. The proof is completed by approximating an arbitrary bounded non-decreasing $f$ uniformly by such step functions (discretize the range, which is possible since $f$ is bounded and monotone) and passing to the limit.
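
Not part of the proof, but a quick Monte Carlo sanity check is easy to run. The particular $f$, $g$, and distribution of $X$ below are arbitrary choices for illustration; any bounded nondecreasing pair would do:

```python
import numpy as np

rng = np.random.default_rng(0)

# Arbitrary bounded, nondecreasing test functions.
f = np.tanh                          # values in (-1, 1), nondecreasing
g = lambda x: np.clip(x, -1.0, 1.0)  # values in [-1, 1], nondecreasing

x = rng.normal(size=1_000_000)       # X ~ N(0, 1), an arbitrary choice

lhs = np.mean(f(x) * g(x))           # estimates E[f(X) g(X)]
rhs = np.mean(f(x)) * np.mean(g(x))  # estimates E[f(X)] E[g(X)]

print(f"E[fg] ~ {lhs:.4f}, E[f]E[g] ~ {rhs:.4f}, inequality holds: {lhs >= rhs}")
```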