Good examples of double induction
I'm looking for good examples where double induction is necessary. What I mean by double induction is induction on $\omega^2$. These are intended as examples in an "Automata and Formal Languages" course.
One standard example is the following: in order to cut an $n\times m$ chocolate bar into its $nm$ constituent squares, we need $nm-1$ cuts. However, there is a much better proof without induction: each cut increases the number of pieces by exactly one, so going from $1$ piece to $nm$ pieces takes exactly $nm-1$ cuts.
Another example: the upper bound $\binom{a+b}{a}$ on Ramsey numbers. The problem with this example is that it can be recast as induction on $a+b$, while I want something which is inherently an induction on $\omega^2$.
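(For reference, the recasting works because, with the appropriate convention for $R$, the standard argument gives the recurrence
$$R(a,b) \le R(a-1,b) + R(a,b-1),$$
and Pascal's rule $\binom{a+b}{a} = \binom{a+b-1}{a-1} + \binom{a+b-1}{a}$ then closes a single induction on $a+b$.)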
Lukewarm example: Ackermann's function, which seems to be pulled out of a hat (unless we know about the primitive recursive hierarchy).
Better examples: the proofs of other theorems in Ramsey theory (e.g. van der Waerden or Hales-Jewett). While these can possibly be recast as induction on $\omega$, doing so is less obvious, and so intuitively we really think of these proofs as double induction.
Another example: cut elimination in the sequent calculus. In this case induction on $\omega^2$ might actually be necessary (although I'm not sure about that).
The problem with my positive examples is that they are all quite technical and complicated. So I'm looking for a simple, non-contrived example where induction on $\omega^2$ cannot be easily replaced with regular induction (or with an altogether simpler argument). Any suggestions?
A nice example arises by relativizing Goodstein's Theorem from $\rm\ \epsilon_0 = \omega^{\omega^{\omega^{\cdot^{\cdot^{\cdot}}}}}$ down to $\rm\ \omega^2\:.\ $
$\rm\ \omega^2\ $ Goodstein's Theorem $\ $ Given naturals $\rm\ a,\:b,\:c\ $ and an arbitrary increasing "base-bumping" function $\rm\ g(n)\ $ on $\:\mathbb N\:$, the following iteration eventually reaches $0\ $ (i.e. $\rm\ a = c = 0\:$).
$$\rm\quad a\:b + c \ \ \to\ \ \begin{cases}\rm\ a\ g(b) + (c - 1) & \rm if\ \ c > 0\\ \rm\ (a-1)\ g(b) + (g(b) - 1) & \rm if\ \ c = 0\end{cases}$$
Note: $\ $ The above iteration is really on triples $\rm\ (a,b,c)\:$, but I chose the above notation in order to emphasize the relationship with radix notation and with Cantor normal form for ordinals $< \epsilon_0$. $\ \ $ For more on Goodstein's Theorem see the link in Andrés's post or see my 1995/12/11 sci.math post.
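To see why the iteration halts, observe that the ordinal $\rm\ \omega\cdot a + c\ $ strictly decreases at every step, no matter how fast $\rm\:g\:$ grows, and that descent is precisely an induction on $\omega^2$. Here is a minimal Python sketch of the iteration (the function name and the sample $\rm\:g\:$ are illustrative only):

```python
def goodstein_omega2(a, b, c, g):
    """Run the omega^2 Goodstein iteration on the triple (a, b, c),
    representing a*b + c; return the number of steps until a = c = 0."""
    steps = 0
    while a > 0 or c > 0:
        if c > 0:
            b, c = g(b), c - 1               # w*a + c  ->  w*a + (c-1)
        else:
            a, b, c = a - 1, g(b), g(b) - 1  # w*a      ->  w*(a-1) + (g(b)-1)
        steps += 1
    return steps

# Example: base-bumping function g(n) = n + 1, starting from 2*2 + 1.
print(goodstein_omega2(2, 2, 1, lambda n: n + 1))   # -> 13
```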
Let me begin with an example of an induction of length $\epsilon_0$: the proof that Goodstein sequences terminate. I mention this because when I decided to understand this result, I began to compute the lengths of these sequences and eventually came to a conjecture for a general formula (!) for them. It turned out that proving the conjecture was easy, because the proof organized itself as an induction of length $\epsilon_0$. I was both very amused and very intrigued by this. The little paper that came out of this adventure is here.
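For readers who want to experiment, here is a short Python sketch of the standard base-bumping step behind Goodstein sequences (the names are mine, and this is just the usual construction, not anything specific to the paper): write $m$ in hereditary base-$b$ notation, replace every occurrence of $b$ by $b+1$, then subtract $1$.

```python
def bump(n, b):
    """Write n in hereditary base-b notation and replace b by b + 1."""
    if n == 0:
        return 0
    total, power = 0, 0
    while n:
        n, digit = divmod(n, b)
        total += digit * (b + 1) ** bump(power, b)  # exponents bump too
        power += 1
    return total

def goodstein(m):
    """Yield the Goodstein sequence starting at m (bases 2, 3, 4, ...)."""
    b = 2
    while m:
        yield m
        m, b = bump(m, b) - 1, b + 1
    yield 0

print(list(goodstein(3)))  # -> [3, 3, 3, 2, 1, 0]
```

Even $m = 4$ already produces a sequence of astronomical length, which gives some feeling for why the termination proof needs ordinals all the way up to $\epsilon_0$.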
Now, I also once found a natural example of an induction of length $\omega^2$ while studying a "Ramsey type" problem: the size of min-homogeneous sets for regressive functions on pairs. What I liked about this example is that Ackermann's function injected itself into the picture and ended up providing me with the right rates of growth. The details are in a paper here.
Though you've already dismissed it as 'lukewarm', Ackermann's function is your most accessible option (proving its totality, I think, is what is wanted), and I think it is a great one.
It's not contrived or unnatural, because it is motivated by very different concepts. If you want to construct another example (prove $f(x,y) = g(x,y)$), you'd probably want $x$ and $y$ to be very much asymmetric, in the sense that they should be used in syntactically very different ways in the computations. And Ackermann's function does just that; see the sketch below.
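To make the asymmetry concrete, here is a minimal Python sketch (my own, just for illustration): every recursive call strictly decreases the pair $(x, y)$ in the lexicographic order on $\omega^2$, and that descent is exactly what the totality proof inducts on.

```python
def ackermann(x, y):
    """Each recursive call strictly decreases (x, y) in the
    lexicographic order, so totality is an induction on omega^2."""
    if x == 0:
        return y + 1                                # no recursion
    if y == 0:
        return ackermann(x - 1, 1)                  # x drops
    return ackermann(x - 1, ackermann(x, y - 1))    # inner: y drops; outer: x drops

print(ackermann(2, 3), ackermann(3, 3))  # -> 9 61  (x = 4 already blows the stack)
```

Note that a single induction on $x + y$ does not work off the shelf: in the outer call the second argument, $\operatorname{ackermann}(x, y-1)$, can be vastly larger than $y$. Only the lexicographic ordering decreases, and its order type is $\omega^2$.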