Rolle's theorem $\beta \cdot f(x)+f'(x)=0$

Prove that if $f$ is differentiable on $[a,b]$ and $f(a)=f(b)=0$, then for any real $\beta$ there is an $x \in (a,b)$ such that $\beta \cdot f(x)+f'(x)=0$. (Using Rolle's theorem.)

My attempt:

Using Rolle's theorem we can say that there exists some $c \in (a,b)$ with $f'(c)=0$. Therefore one term in the expression $\beta \cdot f(x)+f'(x)$ is $0$ at $c$, but I am unable to prove that the other term will simultaneously be $0$ at $c$.


Solution 1:

Apply Rolle's Theorem to $g(x) = e^{\beta x} f(x)$. Then $g(a) = g(b) = 0$, so that $$ 0 = g'(x) = e^{\beta x}(\beta f(x) + f'(x)) $$ for some $x \in (a, b)$. The conclusion follows since $e^{\beta x}$ is never zero.
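
To see the mechanism on a concrete example (the choice of $f$ here is just an illustration, not part of the problem): take $f(x) = x(1-x)$ on $[0,1]$, so $f(0) = f(1) = 0$. Then $g(x) = e^{\beta x}\,x(1-x)$ also vanishes at both endpoints, and Rolle's theorem produces an $x \in (0,1)$ with $$ 0 = g'(x) = e^{\beta x}\bigl(\beta x(1-x) + 1 - 2x\bigr), $$ so $\beta x(1-x) + (1 - 2x) = 0$ there, which is exactly $\beta f(x) + f'(x) = 0$ for this $f$.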

How could one come up with that solution?

Recognize that Rolle's theorem can be applied not only to the function $f$, but also to $g = h\cdot f$, where the function $h$ is continuous on $[a, b]$ and differentiable on $(a, b)$. Then $$ g'(c) = h'(c) f(c) + h(c) f'(c) = 0 $$ for some $c \in (a, b)$. This leads to the desired conclusion if $h'(c) = \beta h(c)$ and $h(c) \neq 0$. But $c$ is not known in advance, so the next idea is to require that $h'(x) = \beta h(x)$ for all $x \in (a, b)$. That is a well-known differential equation with solutions $h(x) = C e^{\beta x}$.
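
For completeness, that differential equation can be solved by separation of variables: $$ \frac{h'(x)}{h(x)} = \beta \;\Longrightarrow\; \ln|h(x)| = \beta x + c \;\Longrightarrow\; h(x) = C e^{\beta x}. $$ Taking $C = 1$ gives $h(x) = e^{\beta x}$, which is nowhere zero, so $h(c)$ can be divided out of $g'(c) = h(c)\bigl(\beta f(c) + f'(c)\bigr) = 0$.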

Another option is to start with a proof by contradiction. If $f \equiv 0$ the claim is trivial, so assume without loss of generality (replacing $f$ by $-f$ if necessary) that $f(x_0) > 0$ for some $x_0 \in (a, b)$, and let $I \subset (a, b)$ be the largest interval containing $x_0$ on which $f$ is strictly positive. If $\beta f(x) + f'(x) > 0$ for all $x \in (a, b)$, then $$ \frac{f'(x)}{f(x)} +\beta > 0 $$ for all $x \in I$. It follows by integration (spelled out below) that $$ e^{\beta x} f(x) \ge e^{\beta x_0} f(x_0) > 0 $$ for all $x \in I$ with $x \ge x_0$. This shows that the right endpoint of the maximal interval $I$ must be equal to $b$, and then letting $x \to b^-$ contradicts $f(b) = 0$. This suggests investigating the function $e^{\beta x} f(x)$ in the first place.
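
In detail, the integration step runs as follows: for $x \in I$ with $x \ge x_0$, integrating the inequality over $[x_0, x]$ gives $$ \ln f(x) - \ln f(x_0) + \beta (x - x_0) \ge 0 $$ (note $f > 0$ on $I$, so the logarithms are defined), and exponentiating yields $e^{\beta x} f(x) \ge e^{\beta x_0} f(x_0)$.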