Wanted: A simple and didactically optimized integration theory
For those interested, I may now have found an answer, which is quite satisfactory in the above sense and perhaps of some value in the teaching of elementary analysis:
Start with Weierstrass' Approximation Theorem: a continuous function $f:[0, 1]\longrightarrow\mathbb{R}$ is the uniform limit of a sequence of polynomials $(p_n)$. Admittedly, this is not trivial to prove, but it can be convincingly used without proof, or proved nicely by writing down the concrete sequence of Bernstein polynomials; that proof also isolates the uniform continuity argument, so it need not be invoked again in the rest of the construction.
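For concreteness (this is the standard formula the Bernstein approach refers to), the Bernstein polynomials of $f$ are
$$B_n(f)(x)=\sum_{k=0}^{n}f\!\left(\tfrac{k}{n}\right)\binom{n}{k}x^k(1-x)^{n-k},\qquad 0\le x\le 1,$$
and $B_n(f)\longrightarrow f$ uniformly on $[0, 1]$ whenever $f$ is continuous.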
The obvious idea of setting $\int f :=\lim_n\int p_n$ is no good, as it requires us to evaluate the integrals of the powers $x^m$ via Riemann sums (see the formula above), which isn't easy at all. But there is a nice way round it via the exponential function: note that if $p_n$ converges uniformly to $f$, then $p_n\circ\exp$ converges uniformly to $f\circ\exp$. Moreover, since a continuous bijection on a compact interval has a continuous inverse, every continuous $g$ on $[0, 1]$ arises in this way: take $f=g\circ\ln$, which is continuous on $[1, e]$. So continuous functions on $[0, 1]$ are uniform limits of polynomials composed with $\exp$, or equivalently, of linear combinations of exp-functions.
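To spell out the last step: a polynomial composed with $\exp$ is just a linear combination of exponentials, since for $p(t)=\sum_{m=0}^{N}a_m t^m$ one has
$$(p\circ\exp)(x)=p(e^x)=\sum_{m=0}^{N}a_m e^{mx}.$$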
The above boils it all down to the computation of integrals of $e^{cx}$ (some $c>0$) via Riemann sums, but this is easy since the Riemann sums are geometric sums, $\sum_{k=0}^{n-1} q^k=\frac{1-q^n}{1-q}$ by telescoping, here with $q=e^{c/n}$, and all one needs beyond that is the knowledge that $\lim_{x\longrightarrow 0}\frac{e^x-1}{x}=1$, which is provable by simple algebraic manipulation.
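Carrying out the computation this alludes to, with left endpoints $k/n$:
$$\frac{1}{n}\sum_{k=0}^{n-1}e^{ck/n}=\frac{1}{n}\cdot\frac{e^{c}-1}{e^{c/n}-1}=\frac{e^{c}-1}{c}\cdot\frac{c/n}{e^{c/n}-1}\ \longrightarrow\ \frac{e^{c}-1}{c}\quad(n\longrightarrow\infty),$$
where the last step uses $\lim_{x\longrightarrow 0}\frac{e^x-1}{x}=1$; hence $\int_0^1 e^{cx}\,dx=\frac{e^c-1}{c}$.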
Extension to $[a, b]$ from $[0, 1]$ is easily done by linear substitution.
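Explicitly, with the substitution $x=a+(b-a)t$ one simply defines
$$\int_a^b f(x)\,dx:=(b-a)\int_0^1 f\bigl(a+(b-a)t\bigr)\,dt.$$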
This notion of integral (which so far works for continuous functions only, but is good enough for a primitive version of the Fundamental Theorem of Calculus) can then be extended in the obvious way to piecewise continuous functions, and subsequently to uniform limits of such. The latter in fact turn out to be precisely the regulated functions, i.e., the functions having one-sided limits at every point (but that's where a Heine-Borel type compactness argument is required, at the latest).
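The uniform-limit step is the usual one: if $f_n\longrightarrow f$ uniformly on $[a, b]$ with each $f_n$ piecewise continuous, then
$$\Bigl|\int_a^b f_n-\int_a^b f_m\Bigr|\le (b-a)\,\|f_n-f_m\|_\infty,$$
a bound the integral constructed above inherits from its Riemann sums, so the values $\int_a^b f_n$ form a Cauchy sequence and one may define $\int_a^b f:=\lim_n\int_a^b f_n$, independently of the approximating sequence.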
This question just popped up again from an edit and I was rather surprised that, considering the amount of interest, no one has suggested a rather obvious integration theory that also answers the question.
We want a "teaching integral" that is adequate to get students started on a study of integration theory but does not overwhelm them with the correct modern theories. Some do suggest the Henstock-Kurzweil integral. This makes sense only if the students have already studied the rather pathetic Riemann integral. We say, then, "Here is a better theory, no harder than what you just learned, and much more powerful if you care to examine it further." But the Henstock-Kurzweil integral is a solution to a different problem. It is not "integration with training wheels"; it is a full theory of integration that includes the Riemann integral, the Lebesgue integral, the improper Riemann integral, and the Denjoy-Perron integral.
Some of us feel that a return to the Newton integral is a better introduction to integration theory as a first step. After all, this is the way that every 18th century mathematician viewed integration theory.
Definition A function $f:[a,b]\to\mathbb R$ is Newton integrable on $[a,b]$ if it has an antiderivative $F$ (i.e., $F'(x)=f(x)$ for all $a\leq x \leq b$), in which case one defines $$\int_a^b f(x)\,dx=F(b)-F(a).$$
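A standard illustration of how this definition goes beyond the Riemann integral: let $F(x)=x^2\sin(1/x^2)$ for $x\neq 0$ and $F(0)=0$ on $[0,1]$. Then $F$ is differentiable everywhere, so $f=F'$ is Newton integrable with
$$\int_0^1 f(x)\,dx=F(1)-F(0)=\sin 1,$$
even though $f(x)=2x\sin(1/x^2)-\tfrac{2}{x}\cos(1/x^2)$ for $x\neq 0$ is unbounded near $0$ and hence not Riemann integrable.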
The justification for the integral is simply the mean-value theorem of the calculus. The basic properties of the integral are all deduced from properties of derivatives. Riemann sums come in by way of the mean-value theorem too.
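Indeed, for any partition $a=x_0<x_1<\dots<x_n=b$, the mean-value theorem supplies points $\xi_i\in(x_{i-1},x_i)$ with
$$F(b)-F(a)=\sum_{i=1}^{n}\bigl(F(x_i)-F(x_{i-1})\bigr)=\sum_{i=1}^{n}f(\xi_i)(x_i-x_{i-1}),$$
so the Newton integral is literally a Riemann sum for every partition, with suitably chosen tags.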
I think if you did a poll of calculus students who have been force-fed the Riemann integral, nearly all of them would say that this is indeed all that they consider integration to be. It is how they all calculate integrals, although they vaguely remember some unpleasant time spent on Riemann sums; mercifully, no one makes them do that stuff any more.
Does this integral handle all continuous functions? You can do exactly as Cauchy did and show how to construct the primitive of a continuous function using Riemann sums. I myself prefer to prove this lemma:
Lemma Let $f:[a,b]\to\mathbb R$ be a bounded function. Then there is a Lipschitz function $F:[a,b]\to\mathbb R$ so that $F'(x)=f(x)$ at every point $x$ at which $f$ is continuous.
It is not that hard, and it can be presented at an elementary level if you don't demand that the students follow every step in detail. You evidently don't need uniform continuity, but you do need to construct a sequence of functions that converges to the function you want.
For more information about how one might take this point of view in an introductory course on integration theory, you can consult this experimental textbook.