Let $f,g:[a,b]\to\mathbb{R}$ be smooth, with $g$ of constant sign. Then, by the first mean value theorem for integrals, there exists an $x_0\in[a,b]$ with

$$ \int_a^b f(x)g(x)\,dx=f(x_0)\cdot\int_a^b g(x)\,dx.$$

Is there any way of approximating $x_0$, without evaluating $\int_a^b f(x)g(x)dx$?

We may assume $g$ to be positive and monotone increasing, and $\int_a^b g(x)\,dx$ to be known.


Setting $g(x)\equiv1$, we obtain $$ \int_a^b f(x)\,dx=f(x_0)\cdot(b-a),$$ so finding this $x_0$ numerically could be a powerful tool for approximating integrals?
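
For a concrete example: with $g\equiv1$ and $f(x)=x^2$ on $[0,1]$, the identity reads $$\int_0^1 x^2\,dx=x_0^2\cdot(1-0),\qquad\text{i.e.}\quad x_0^2=\frac13,$$ so $x_0=1/\sqrt3\approx0.577$. Note, though, that pinning down $x_0$ here already required the value $1/3$ of the integral, so locating $x_0$ seems at least as hard as evaluating the integral itself.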


Solution 1:

Going off Alex's comment: we are looking for $x_0$ such that $$f(x_0)=\frac{\int_a^b f(x)g(x)\,dx}{\int_a^b g(x)\,dx}.$$ The right-hand side of the equality is the average of $f$ with respect to $g$ (or you could think of $g$ as the mass density of a wire starting at $a$ and ending at $b$). For example, taking $g=1$ as in the question, we get precisely the average value of $f$ on $[a,b]$: $$f(x_0)= \frac{1}{b-a}\int_a^b f(x)\,dx.$$ Unfortunately, even with such a simple $g$, the location of $x_0$ depends heavily on the choice of $f$. For example, pick any $x_0$ in $(a,b)$; you can find an $f$ depending on $x_0$ such that $f(x_0)$ equals the average of $f$ on $[a,b]$. Even assuming $f$ is monotone increasing won't be enough: the claim of the previous sentence still holds, as sketched below.
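
One way to see that last claim, at least as a sketch (taking $a=0$, $b=1$, $g\equiv1$ for concreteness): consider the strictly increasing ramp $f_k(x)=\tanh\bigl(k(x-x_0)\bigr)$. For large $k$ its average over $[0,1]$ is $1-2x_0$ up to exponentially small corrections, and $f_k$ attains this value at a point within $O(1/k)$ of $x_0$. So even among monotone increasing $f$, the mean value point can be pushed arbitrarily close to any prescribed $x_0\in(0,1)$, and a small shift of the ramp then hits it exactly.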

I think David's suggestion of a root-finding algorithm might work (depending on $f$), but there may be cases where it doesn't converge (as can happen with Newton's method). Note that there may be multiple solutions for $x_0$, and such an algorithm may oscillate between them.
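
To make the root-finding idea concrete, here is a minimal sketch (Python with NumPy/SciPy; the choices $f(x)=e^x$ and $g(x)=x$ on $[0,1]$ are mine, purely for illustration). Note that it evaluates the very integral the question hopes to avoid, so it only demonstrates the mechanics of recovering $x_0$ once the weighted average is known:

```python
import numpy as np
from scipy.integrate import quad
from scipy.optimize import brentq

# Illustrative choices (not from the question): f(x) = e^x, g(x) = x on [0, 1].
a, b = 0.0, 1.0
f = np.exp
g = lambda x: x

# The g-weighted average of f. This step evaluates the integral the
# question wants to avoid, so the sketch only illustrates the mechanics.
A = quad(lambda x: f(x) * g(x), a, b)[0] / quad(g, a, b)[0]

# x_0 solves f(x_0) = A. Brent's method needs a sign change on [a, b],
# which holds here since f is continuous and A lies between f(a) and f(b).
x0 = brentq(lambda x: f(x) - A, a, b)
print(x0, f(x0), A)  # x0 = ln 2 ≈ 0.693 for this choice of f and g
```

Brent's method returns one root inside the bracket; if $f$ is not monotone, $f(x)=A$ can have several solutions, and which one you land on depends on the bracketing, which is precisely the multiplicity issue above.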