If $A$ and $B$ are positive constants, show that $\frac{A}{x-1} + \frac{B}{x-2} = 0$ has a solution on $(1,2)$
I have a problem which I couldn't figure out how to solve: if $A$ and $B$ are positive constants, show that $$0=\frac{A}{x-1} + \frac{B}{x-2}$$ has a solution on the open interval $(1,2)$.
I would appreciate it if you could support your answers with a rigorous proof.
What I thought of was taking an interval lying just inside the endpoints, e.g. $[1.1,1.9]$, but that wouldn't be a rigorous solution to this problem. After that I was totally stuck, since I couldn't determine a suitable closed interval, which prevented me from using any useful theorem.
Note: the problem is taken from G. Simmons, *Calculus with Analytic Geometry*, 2nd edition.
Let $$f(x) = \frac{A}{x - 1} + \frac{B}{x - 2}.$$
Observe that $f$ is defined on $\mathbb R \setminus \{1,2\}$ and is continuous there, being a sum of continuous functions.
Now,
- $\lim\limits_{x \to 1^+} f(x) = +\infty$, since $\frac{A}{x-1} \to +\infty$ (here $A > 0$ matters) while $\frac{B}{x-2} \to -B$;
- $\lim\limits_{x \to 2^-} f(x) = -\infty$, since $\frac{B}{x-2} \to -\infty$ while $\frac{A}{x-1} \to A$.
Therefore, from the definition of limit and the intermediate value theorem, it follows that $f$ has a root on $(1, 2)$.
It is important to note that the definition of the limit is needed here as well, because the intermediate value theorem requires $f$ to be continuous on a compact interval $[a, b]$. Indeed, from the first limit, $$\forall \varepsilon > 0,\ \exists \delta > 0 \text{ such that } 0 < x - 1 < \delta \implies f(x) > \varepsilon,$$ and similarly, from the second limit, $$\forall \varepsilon' > 0,\ \exists \delta' > 0 \text{ such that } 0 < 2 - x < \delta' \implies f(x) < -\varepsilon'.$$ So it is possible to pick points $a \in (1, 1 + \delta)$ and $b \in (2 - \delta', 2)$ with $a < b$; then $f$ is continuous on the compact interval $[a, b] \subset (1, 2)$ with $f(a) > 0$ and $f(b) < 0$, and the intermediate value theorem yields a root in $(a, b)$.
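For concreteness, here is the estimate on the asker's interval $[1.1, 1.9]$ in the case $A = 2$, $B = 1$ (the values used for the graph below): $$f(1.1) = \frac{2}{0.1} + \frac{1}{-0.9} = 20 - \frac{10}{9} > 0, \qquad f(1.9) = \frac{2}{0.9} + \frac{1}{-0.1} = \frac{20}{9} - 10 < 0,$$ so the intermediate value theorem on $[1.1, 1.9]$ already produces a root (it turns out to be $x = 5/3$). For other values of $A$ and $B$, however, the sign change need not happen inside $[1.1, 1.9]$ (try $A = 1$, $B = 100$), which is exactly why the $\delta$-argument above is needed.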
The graph of $f$ for $A = 2$ and $B = 1$ may make the reasoning clearer.
You may also be interested in this very similar question, which has a comparable (albeit a bit more involved) solution.
On the interval $(1,2)$ the factor $(x-1)(x-2)$ is nonzero, so you may freely multiply the equation by it to get
$$A(x-2)+B(x-1)=0$$ or
$$(A+B)x=2A+B,$$ $$x=\frac{2A+B}{A+B}=1+\frac A{A+B}.$$
Since $A$ and $B$ are positive, clearly
$$0<\frac A{A+B}<1,$$ so $x\in(1,2)$, which substantiates the claim.
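As a quick sanity check, substitute this $x$ back into the original equation: $x - 1 = \frac{A}{A+B}$ and $x - 2 = -\frac{B}{A+B}$, so $$\frac{A}{x-1} + \frac{B}{x-2} = (A+B) - (A+B) = 0,$$ as required.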