Solution 1:

Let $p = (a + b)/2$ and $2h = b - a$, so that $a = p - h$ and $b = p + h$. Define the functions $g(t)$ and $r(t)$ for $t \in [0, h]$ by
$$g(t) = \int_{p - t}^{p + t}f(x)\,dx - t\{f(p - t) + f(p + t)\}, \qquad r(t) = g(t) - \left(\frac{t}{h}\right)^{3}g(h).$$
Differentiating (the boundary terms coming from the integral cancel against the derivative of $t\{f(p - t) + f(p + t)\}$) gives
$$g'(t) = -t\{f'(p + t) - f'(p - t)\}, \qquad r'(t) = g'(t) - \frac{3t^{2}}{h^{3}}g(h).$$
By the Mean Value Theorem applied to $f'$ on $[p - t, p + t]$ we have $f'(p + t) - f'(p - t) = 2t\,f''(t')$ for some $t' \in (p - t, p + t)$ (depending on $t$), so that
$$g'(t) = -2t^{2}f''(t')$$
and therefore
$$r'(t) = -t^{2}\left(2f''(t') + \frac{3}{h^{3}}g(h)\right).$$
Clearly $r(0) = r(h) = 0$, so by Rolle's Theorem there is a point $t_{0} \in (0, h)$ such that $r'(t_{0}) = 0$, i.e.
$$-t_{0}^{2}\left(2f''(t') + \frac{3}{h^{3}}g(h)\right) = 0,$$
where now $t' \in (p - t_{0}, p + t_{0})$. Since $t_{0} \neq 0$, we may divide by $t_{0}^{2}$ and obtain
$$g(h) = -\frac{2h^{3}}{3}f''(t'),$$
where $t' \in (p - t_{0}, p + t_{0}) \subset (p - h, p + h) = (a, b)$. Substituting $h = (b - a)/2$, $p - h = a$, $p + h = b$ into the definition of $g(t)$, we finally arrive at
$$\int_{a}^{b}f(x)\,dx = \frac{b - a}{2}\{f(a) + f(b)\} - \frac{(b - a)^{3}}{12}f''(t'),$$
where $t' \in (a, b)$.
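As a quick sanity check (not part of Hardy's argument), here is a small Python sketch that recovers $t'$ numerically; the choice $f(x) = e^{x}$ on $[0, 1]$ and the variable names are mine for illustration. Since $f'' = e^{x}$ is invertible, we can solve the error formula for $t'$ and confirm it lies in $(a, b)$.

```python
import math

# Verify the error formula: trapezoid - exact = (b - a)^3 / 12 * f''(t')
# for f(x) = exp(x) on [a, b], where f = f' = f'' = exp.
a, b = 0.0, 1.0
f = math.exp
exact = math.exp(b) - math.exp(a)          # integral of exp over [a, b]
trapezoid = (b - a) / 2 * (f(a) + f(b))    # trapezoidal approximation

# Solve the formula for f''(t'), then invert f''(x) = exp(x) to get t'.
fpp_at_tprime = 12 * (trapezoid - exact) / (b - a) ** 3
t_prime = math.log(fpp_at_tprime)

print(f"t' = {t_prime:.6f}")               # approximately 0.5249
assert a < t_prime < b                     # t' indeed lies in (a, b)
```

The same check can be done by hand with $f(x) = x^{3}$ on $[0, 1]$: the trapezoidal value is $\frac{1}{2}$, the exact integral is $\frac{1}{4}$, so the formula forces $f''(t') = 6t' = 3$, i.e. $t' = \frac{1}{2} \in (0, 1)$.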

Note: This is based on an exercise in G. H. Hardy's "A Course of Pure Mathematics". Compared with the usual proofs given in numerical analysis books (primarily based on various interpolation formulas and Taylor series), I find this proof by Hardy to be the simplest.