Why does $\int_a^{b} xf(x)\,dx < \frac{1}{b-a} \int_a^{b} x\,dx$ hold? [closed]
I came across $\displaystyle \int_a^{b} xf(x)\,dx < \frac{1}{b-a} \int_a^{b} x\,dx$ in a solution for finding the variance of a random variable, and I don't see why it holds. Is it even correct? Can you help me understand this?
It seems to be incorrect. Consider the following counterexample. Let $a=0$, $b=1$. Given an arbitrarily small $\varepsilon>0$, choose a continuous function $f:[0,1]\rightarrow[0,\infty)$ such that $\int_{0}^{1}f(x)\,dx=1$ and $f$ vanishes outside $(1-\varepsilon,1]$. In short, $f(x)\,dx$ is approximately the Dirac measure $\delta_{1}$ at $1$. Computing directly, we get $\int_{0}^{1}xf(x)\,dx\geq(1-\varepsilon)\int_{0}^{1}f(x)\,dx=1-\varepsilon\approx1$, since all the mass of $f$ sits where $x\geq1-\varepsilon$, while $\frac{1}{1-0}\int_{0}^{1}x\,dx=\frac{1}{2}$. Hence it is false in general that $\int_{a}^{b}xf(x)\,dx<\frac{1}{b-a}\int_{a}^{b}x\,dx$.
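To see the failure with a fully explicit $f$, here is a small sympy check. The density $f(x)=n x^{n-1}$ and the value $n=50$ are illustrative choices (not the compactly supported bump described above), but they concentrate mass near $x=1$ in the same way:

```python
import sympy as sp

x = sp.symbols('x')
n = 50  # illustrative choice; larger n pushes the mass of f closer to x = 1

# Illustrative density on [0, 1]: f(x) = n*x**(n - 1). It is continuous,
# integrates to 1, and for large n puts almost all of its mass near x = 1,
# a smooth stand-in for a bump supported on (1 - eps, 1].
f = n * x**(n - 1)

total_mass = sp.integrate(f, (x, 0, 1))        # 1, so f is a valid density
lhs = sp.integrate(x * f, (x, 0, 1))           # n/(n + 1) = 50/51, close to 1
rhs = sp.integrate(x, (x, 0, 1)) / (1 - 0)     # 1/2

print(total_mass, lhs, rhs, lhs < rhs)         # 1 50/51 1/2 False
```

So the left-hand side can be made as close to $1$ as we like while the right-hand side stays at $\frac{1}{2}$, and the inequality fails.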