Prove or disprove that if $\lim\limits_{x\to0^+}f(x)=0$ and $|x^2f''(x)|\leq c$ then $\lim\limits_{x\to0^+}xf'(x)=0$

A function $f$ defined on the interval $(0,1)$ with a continuous second derivative ($f\in C^2(0,1)$) satisfies $\lim_{x\to0^+}f(x)=0$ and $|x^2f''(x)|\leq C$, where $C$ is a fixed positive real number.

Prove $\lim_{x\to0^+}xf'(x)=0$ (or disprove it!)

I've tried several approaches, such as writing $yf'(y)-xf'(x)=f(y)-f(x)+\int_{x}^{y}tf''(t)\,dt$ to show that the limit exists, but without success.

I also thought of something similar to L'Hôpital's rule, $\lim_{x\to0^+}\frac{f'(x)}{\frac{1}{x}}=\lim_{x\to0^+}\frac{f''(x)}{-\frac{1}{x^2}}$, but that only gives a sufficient condition and is probably the wrong approach.
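For what it's worth, here is a quick numerical sanity check with an arbitrarily chosen test function, $f(x)=-1/\ln x$, which satisfies both hypotheses for $x$ near $0$ while $f'(x)$ itself blows up. It only illustrates one example and proves nothing, but the values are consistent with the claim.

```python
import math

# Sanity check with one hand-picked test function (not part of the question):
# f(x) = -1/ln(x).  Near 0+ it satisfies f(x) -> 0 and x^2 f''(x) bounded,
# while f'(x) itself blows up, so the claim x*f'(x) -> 0 is not vacuous here.
def f(x):
    return -1.0 / math.log(x)

def fp(x):                      # f'(x) = 1 / (x (ln x)^2)
    return 1.0 / (x * math.log(x) ** 2)

def fpp(x):                     # f''(x) = -1/(x^2 (ln x)^2) - 2/(x^2 (ln x)^3)
    L = math.log(x)
    return -1.0 / (x ** 2 * L ** 2) - 2.0 / (x ** 2 * L ** 3)

for k in range(1, 8):
    x = 10.0 ** (-2 * k)        # x -> 0+
    print(f"x = {x:.0e}:  f(x) = {f(x):.3e},  "
          f"x*f'(x) = {x * fp(x):.3e},  x^2*f''(x) = {x * x * fpp(x):.3e}")
```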


For $0 < y < x < 1$, by Taylor's theorem there exists $\theta \in (0,1)$ such that

$$f(y) = f(x) + f'(x)(y-x) + \frac{1}{2} f''(x - \theta(x-y)) (y-x)^2$$

Taking $y = (1-\eta) x$, where $0 < \eta < 1/2$, so that $y - x = -\eta x$ and $x - \theta(x-y) = x(1-\theta\eta)$, we have

$$f(y) - f(x) = -\eta xf'(x) +\frac{\eta^2}{2}x^2f''(x(1 -\theta\eta)) $$

and since $y \to 0+$ as $x \to 0+$,

$$0 = \lim_{x \to 0+}\frac{f(y) - f(x)}{\eta} = \lim_{x \to 0+}\left(-xf'(x)+ \frac{\eta}{2}\frac{1}{(1 - \theta\eta)^2} [x(1-\theta\eta)]^2 f''(x(1 -\theta\eta)) \right) $$

Since the limit on the RHS is $0$, for any $\epsilon > 0$ there is a $\delta > 0$ such that for $0 < x < \delta$ the expression in parentheses has absolute value less than $\epsilon$, and therefore

$$\tag{*}|x f'(x)| \leqslant \epsilon + \frac{\eta}{2(1 - \theta\eta)^2} [x(1-\theta\eta)]^2 |f''(x(1 -\theta\eta))| \leqslant \epsilon + \frac{\eta}{2(1 - \theta\eta)^2}C.$$

Since $0 < \eta < 1/2$, we have

$$\frac{\eta}{(1- \theta \eta)^2} < \frac{\eta}{(1 - \theta/2)^2} < 4\eta,$$

and the last term on the RHS of (*) can be made arbitrarily small by choosing $\eta$ small enough.

It follows that

$$\lim_{x \to 0+} x f'(x) = 0$$
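A numerical sketch of the mechanism (using the same arbitrary test function $f(x)=-1/\ln x$ as above, and a crude constant $C$ that bounds $|t^2f''(t)|$ only for small $t$): dividing the Taylor identity by $\eta$ and using $1/(1-\theta\eta)^2 < 4$ gives the simplified bound $|xf'(x)|\leq |f((1-\eta)x)-f(x)|/\eta + 2\eta C$, whose first term vanishes as $x\to0^+$ for fixed $\eta$ and whose second term is small when $\eta$ is small.

```python
import math

# Illustration of the bound behind (*) in the simplified form
#   |x f'(x)| <= |f((1-eta)x) - f(x)| / eta + 2*eta*C,
# which follows from the Taylor identity together with 1/(1 - theta*eta)^2 < 4.
# The test function f(x) = -1/ln(x) and the constant C are my own choices.
def f(x):
    return -1.0 / math.log(x)

def fp(x):
    return 1.0 / (x * math.log(x) ** 2)

C = 0.1   # crude bound for |t^2 f''(t)| valid for t <= 1e-2 with this f

for x in (1e-3, 1e-6, 1e-9):
    for eta in (0.4, 0.1, 0.01):
        lhs = abs(x * fp(x))
        rhs = abs(f((1 - eta) * x) - f(x)) / eta + 2 * eta * C
        print(f"x={x:.0e}  eta={eta:<5}  |x f'(x)|={lhs:.3e}  "
              f"bound={rhs:.3e}  ok={lhs <= rhs}")
```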


Let $g(x)=f(1/x)$ for $x\in (1,\infty)$. The hypotheses on $f$ imply that $$\lim_{x \rightarrow \infty} g(x)=0~\mbox{and}~|(x^2g'(x))'|\leq C,$$ since $(x^2g'(x))'=x^2 g''(x)+2x g'(x)=\frac{1}{x^2}f''(1/x)$, which is bounded in absolute value by $C$ by assumption.
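A numerical illustration of this substitution (a sketch only; the test function $f(u)=-1/\ln u$ used earlier is my own choice and satisfies the hypotheses only for small $u$, i.e. for large $x$): here $g(x)$, $xg'(x)$ and $(x^2g'(x))'$ have closed forms and can be tabulated directly.

```python
import math

# Substitution view for the test function f(u) = -1/ln(u) (my choice, not from
# the answer).  Writing L = ln(x):
#   g(x)          = f(1/x)       = 1/L
#   x g'(x)       (called h below) = -1/L^2
#   (x^2 g'(x))'  = f''(1/x)/x^2 = -1/L^2 + 2/L^3
# The last quantity stays bounded for large x, and x g'(x) -> 0, as claimed.
def g(x):
    return 1.0 / math.log(x)

def h(x):                          # x * g'(x)
    return -1.0 / math.log(x) ** 2

def d_x2gp(x):                     # (x^2 g'(x))'
    L = math.log(x)
    return -1.0 / L ** 2 + 2.0 / L ** 3

for x in (1e2, 1e4, 1e8, 1e16):
    print(f"x = {x:.0e}:  g(x) = {g(x):+.3e},  x*g'(x) = {h(x):+.3e},  "
          f"(x^2 g')'(x) = {d_x2gp(x):+.3e}")
```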

Define $h(x)=xg'(x)$; since $\frac{1}{x}f'(1/x)=-xg'(x)=-h(x)$, it suffices to show that $\lim_{x \rightarrow \infty}h(x)=0$. We first show that $h(x)$ is bounded on $[2,\infty)$. Suppose, on the contrary, that $h(x)$ is unbounded, so there exists a sequence $x_i \rightarrow \infty$ such that $|h(x_i)| >i$ for all $i$. Passing to a subsequence, we may assume the $h(x_i)$ all have the same sign; suppose $h(x_i)>0$ for all $i$, since the proof is similar in the other case. For each $i$ large enough, choose $y_i \in [2,x_i]$ such that $h(y_i)>i$ and $h'(y_i)\geq 0$. This is possible: for $i$ large enough $h(x_i)>h(2)$, so $y_i=\inf\{t\in [2,x_i] : h(t)=h(x_i)\}$ works, because $h(y_i)=h(x_i)>i$ and $h(t)<h(y_i)$ for $t\in[2,y_i)$. But one has $$|h(x)+xh'(x)|=|(xh(x))'|=|(x^2g'(x))' |\leq C,$$ while $h(y_i)+y_ih'(y_i)\geq h(y_i)>i$, which is a contradiction for $i$ large. So $h(x)$ is bounded on $[2,\infty)$.

Next, we show that $\lim_{x\rightarrow \infty}h(x)$ exists. Let $\alpha=\limsup_{x\rightarrow \infty} h(x)$ and $\beta=\liminf_{x \rightarrow \infty}h(x)$. Since $h$ is bounded, $\alpha, \beta \in \mathbb{R}$, and we need to show that $\alpha=\beta$. Suppose, on the contrary, that $\alpha>\beta$, and assume first that $\alpha > 0$ (if $\alpha\leq 0$, then $\beta<0$, which is the case treated at the end). Let $x_i \rightarrow \infty$ be such that $h(x_i) \rightarrow \alpha$, and choose $\theta\in(0,1)$ such that $\beta<\theta \alpha<\alpha$. Since $\liminf_{x\rightarrow\infty}h(x)=\beta<\theta\alpha$, by the intermediate-value theorem, for $i$ large enough there exists $t_i>0$ such that $h(x_i+t_i)=\theta h(x_i)$. Moreover, taking $t_i$ minimal, we have $h(x)\geq\theta h(x_i)>\theta \alpha/2$ for all $x\in [x_i, x_i+t_i]$ and all $i$ large enough.

From $|(xh(x))'|\leq C$ and the mean-value theorem, we have $$|x(h(x+t)-h(x))+th(x+t)|=|(x+t)h(x+t)-xh(x)|\leq Ct$$ for all $t\geq 0$ and $x\geq 2$. Since $h$ is bounded on $[2,\infty)$, it follows that $$x|h(x+t)-h(x)|\leq At,$$ for some $A>0$ and all $t\geq 0$ and all $x\geq 2$. Replacing $x$ with $x_i$ and $t$ with $t_i$ gives $$\frac{t_i}{x_i}\geq\frac{1}{A}|h(x_i+t_i)-h(x_i)|=\frac{1-\theta}{A}h(x_i)$$ for $i$ large enough. Since $h(x_i)\to\alpha>0$, it follows that $$\frac{x_i+t_i}{x_i}=1+\frac{t_i}{x_i}\geq D,$$ for some $D>1$ and all $i$ large enough. Now, one has

$$g'(x)= \frac{h(x)}{x} >\frac{\theta \alpha}{2x}$$ for all $x\in [x_i,x_i+t_i]$. It follows that $$|g(x_i+t_i)-g(x_i)|\geq \int_{x_i}^{x_i+t_i} \frac{\theta \alpha}{2x}\,dx= \frac{\theta \alpha}{2} \log \left (\frac{x_i+t_i}{x_i} \right) \geq \frac{\theta \alpha}{2}\log D>0,$$ for all $i$ large enough. This is a contradiction, since $\lim_{x \rightarrow \infty}g(x)=0$ forces $g(x_i+t_i)-g(x_i)\to 0$. The proof in the case of $\beta<0$ is similar. Therefore, $\alpha=\beta$, and so $L=\lim_{x\rightarrow \infty}h(x)$ exists.
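As a numerical aside (a sketch with the same test function $f(u)=-1/\ln u$ as earlier; the constant $C=0.03$ is just a crude bound for $|(xh(x))'|$ when $x\geq 100$ for this particular $f$), the mean-value estimate $|(x+t)h(x+t)-xh(x)|\leq Ct$ used above can be checked directly:

```python
import math

# Check of the mean-value estimate |(x+t)h(x+t) - x h(x)| <= C*t for the test
# function f(u) = -1/ln(u), i.e. h(x) = -1/(ln x)^2.  For this f the constant
# C = 0.03 bounds |(x h(x))'| = |-1/(ln x)^2 + 2/(ln x)^3| once x >= 100, so the
# check is restricted to that range (my own choices throughout).
def h(x):
    return -1.0 / math.log(x) ** 2

C = 0.03
worst = 0.0
for x in (1e2, 1e3, 1e5, 1e9):
    for t in (0.5, 10.0, 1e3, 1e6):
        ratio = abs((x + t) * h(x + t) - x * h(x)) / t
        worst = max(worst, ratio)
        assert ratio <= C, (x, t, ratio)
print(f"largest |(x+t)h(x+t) - x h(x)| / t observed: {worst:.4f}  (<= C = {C})")
```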

Finally, we show that $L=0$. Suppose $L>0$. Then for $x$ large enough $g'(x)=h(x)/x>L/(2x)$, which gives $g(x)>g(x_0)+ \frac{L}{2} \log(x/x_0)$ for some fixed $x_0$ and all $x$ large enough, contradicting $\lim_{x\rightarrow\infty}g(x)=0$. The case $L<0$ is similar. Therefore, $L=0$, and the proof is complete.