history of the contraction mapping technique
The answer depends on what you mean by the "scheme". In your step 4 you mentioned "the theorem". If by that you mean the general form of the theorem due to Banach, then by definition no one earlier than Banach could have implemented the scheme. The Banach fixed point theorem is first stated as Theorem 6 on p. 330 of this printing of his doctoral dissertation. Unfortunately, despite its suggestive title, the dissertation does not in fact contain an application of the fixed point theorem to solving integral equations.
But if you relax the meaning of "the theorem", and refer not so much to the abstract version stated and proved by Banach as to the general idea that "contraction mappings admit fixed points", then Banach is not even close to being the first person to make use of the scheme you outlined.
The scheme you outlined, in its general form (even more general than the exact four-step scheme you described), is called the method of successive approximations. The earliest of its uses in the Western world is likely (though I am not entirely certain about this) due to I. Newton, in his namesake method for finding roots of functions. The convergence in the case of a contraction mapping (since we are now talking about a function $f:\mathbb{R}\to\mathbb{R}$) follows easily from the convergence of the geometric series and the comparison test for series, and is nothing special to remark on. One of the better-known uses of the method of successive approximations is Picard's original proof of the existence and uniqueness of solutions to ordinary differential equations. Despite modern presentations, which almost always pass through Banach's fixed point theorem, Picard "brute-forced" the construction. (Lindelöf's improvement on Picard's original theorem was published in 1894, more than 25 years before Banach even formulated his fixed point theorem.)
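For concreteness, here is a minimal sketch of the method of successive approximations for a contraction on $\mathbb{R}$ (the function and tolerance here are my own illustrative choices, not anything from the historical sources): iterate $x_{n+1} = f(x_n)$ and stop once consecutive iterates are close, with the geometric-series bound guaranteeing convergence when $f$ has Lipschitz constant $q < 1$.

```python
def fixed_point_iterate(f, x0, tol=1e-12, max_iter=1000):
    """Method of successive approximations: iterate x_{n+1} = f(x_n).

    If f is a contraction with constant q < 1, the error after n steps
    is bounded by q**n * |x0 - x*|, so the iterates converge geometrically.
    """
    x = x0
    for _ in range(max_iter):
        x_next = f(x)
        if abs(x_next - x) < tol:
            return x_next
        x = x_next
    raise RuntimeError("iteration did not converge")

# Example: f(x) = x/2 + 1 is a contraction on R with constant q = 1/2;
# its unique fixed point is x* = 2.
root = fixed_point_iterate(lambda x: x / 2 + 1, x0=0.0)
```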
But I said "Western world" before. In fact, a form of the method of successive approximations was known to the Babylonians for finding the square root of a positive number, which is very similar to applying Newton's method to the quadratic equation $x^2 - a = 0$. There's a nice blog post by John Baez and Richard Elwes about that.
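The Babylonian (or Heron's) iteration mentioned above can be written in a few lines; averaging $x$ with $a/x$ is exactly the update Newton's method produces for $x^2 - a = 0$ (the fixed iteration count below is just a convenient illustrative choice):

```python
def babylonian_sqrt(a, x0=1.0, n=20):
    """Babylonian/Heron iteration for sqrt(a): repeatedly average x with a/x.

    This is the same update as Newton's method applied to x**2 - a = 0:
        x_{n+1} = x_n - (x_n**2 - a) / (2 * x_n) = (x_n + a / x_n) / 2
    """
    x = x0
    for _ in range(n):
        x = 0.5 * (x + a / x)
    return x
```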
But back to the method of successive approximations. Using the method to solve problems certainly has a long history, and it was recognized early on that a necessary condition for convergence is that the error term decreases to zero. The problem is that if we don't already know about convergence, we don't have a candidate "limit object" against which to measure the error! So one of the big steps in the development of this method is surely due to Bolzano and Cauchy, who recognized the importance of Cauchy sequences. Setting aside the issue of whether Cauchy sequences converge (which is handled by the completeness of the metric space), the method of successive approximations can be applied with success provided that the iterates form a Cauchy sequence.
Cast in this historical light, perhaps one should think of the Banach fixed point theorem not as inspiring others to solve problems using the method of successive approximations, but as providing an important, sufficiently general criterion for when that method can be applied.