Is it possible to solve any Euclidean geometry problem using a computer?
By "problem", I mean a high-school type geometry problem.
If not, is there another set of axioms that allows it? If so, is there any software that does that?
I did a search, but was not able to find a single program that does this. That is strange, because even if it is impossible to solve every problem, most natural problems should still be solvable.
There is a method to decide (true or not) any theorem in Euclid's Elements, by translating it into analytic geometry, or multivariate polynomials over the reals.
Of the philosophical controversies discussed by the Greeks that inspired Euclid to set up his axioms, common notions and definitions (like what an angle is, or what the action of a 'compass' really is), most are de facto resolved by how operations and variables work in real 2D analytic geometry together with the Pythagorean theorem (which has been shown to be equivalent in proof power to Euclid's fifth postulate).
The conversion works as follows (a small symbolic sketch follows the list):
- An undefined point is given two variables, an $x$ and a $y$ coordinate: $(x_1,y_1)$. Any other, possibly distinct, point is named by some other pair $(x_2,y_2)$ (a proof that the two points coincide would show that $x_1=x_2$ and $y_1=y_2$).
- a circle is defined by a center $(x_3,y_3)$ and a radius $d_1$, via the equation $(x-x_3)^2 + (y-y_3)^2 = d_1^2$. If you want a point on that circle, you instantiate this equation with the appropriate $x$ and $y$.
- a line is defined by, well, two points: just by stating two points $(x_1,y_1)$ and $(x_2,y_2)$, you have a line, and a third point $(x,y)$ lies on it exactly when $(x-x_1)(y_2-y_1)-(y-y_1)(x_2-x_1)=0$.
- Frankly, I'm not sure how to do angles off the top of my head, but my memory tells me that it is possible to 'deal' with them (via dot products of direction vectors, for instance).
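To make the translation concrete, here is a minimal sketch in Python with sympy (my own choice of tool, not anything prescribed above); the names mirror the list: a circle with center $(x_3,y_3)$ and radius $d_1$, and the line through $(x_1,y_1)$ and $(x_2,y_2)$, each as a polynomial required to vanish.

```python
# A small sketch of the translation using sympy (my choice; any computer
# algebra system would do). Geometric objects become polynomials set to zero.
from sympy import symbols, expand

x1, y1, x2, y2, x3, y3, d1, x, y = symbols('x1 y1 x2 y2 x3 y3 d1 x y')

# Circle with center (x3, y3) and radius d1: (x - x3)^2 + (y - y3)^2 - d1^2 = 0
circle = (x - x3)**2 + (y - y3)**2 - d1**2

# Line through (x1, y1) and (x2, y2): the point (x, y) is collinear with them
# exactly when this polynomial vanishes.
line = (x - x1)*(y2 - y1) - (y - y1)*(x2 - x1)

# "P = (x, y) lies on both the circle and the line" is now simply the
# polynomial system {circle = 0, line = 0}.
print(expand(circle), expand(line), sep='\n')
```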
So now you have a set of polynomial equations in many variables, and all you need is a procedure to decide whether they have a solution (some set of satisfying numbers, or really a nontrivial, consistent set of equations).
Suppose you only had points and lines. Then all you'd need is Gaussian elimination. But with circles, you can have arbitrary degree. Groebner basis completion (via Buchberger's algorithm, from the mid 70's) is what you can use. Essentially, it does just what you'd expect (but didn't realize you could) on higher-degree multivariate polynomials. In analogy with eliminating columns in a matrix, you try to eliminate 'maximal' terms one by one. It is easier if you multiply everything out so your equations are all of the form 'a sum of terms' = 0, where a term is a coefficient times a product of variables (possibly none), e.g. $17x_1^3y_{41}^2d_1^{1729}$. The algorithm assumes an order on these terms, and then follows an analog of the numerical GCD algorithm (which, coincidentally or not, is also in Euclid).
So the algorithm 'reduces' as much as possible (eliminating leading terms where it can). To prove a theorem of the form 'hypotheses imply conclusion', the usual move is to run it on the hypothesis polynomials together with the negation of the conclusion (the Rabinowitsch trick encodes 'the conclusion polynomial $g$ is nonzero' as the equation $1 - z\,g = 0$ for a fresh variable $z$). If the reduction gets all the way down to an equation $0=1$, then you know you have a contradiction: the hypotheses and the negated conclusion have no way of being satisfied together, so the theorem -does- hold. Otherwise (and this is the beauty of it, the procedure still terminates), the conclusion does not follow from the hypotheses by this route (often because some degenerate configurations need to be excluded).
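As a toy illustration of this 'reduce to $0=1$' style of proof, here is a sketch in sympy (again my own choice of tool and example) for Thales' theorem: an angle inscribed in a semicircle is a right angle. The hypothesis is that $P=(x,y)$ lies on the circle of radius $r$ about the origin; the conclusion is that $PA\perp PB$ for the diameter endpoints $A=(-r,0)$ and $B=(r,0)$; the negated conclusion is encoded with a fresh variable $z$.

```python
# Refutation-style check with a Groebner basis (sympy). If the basis of
# {hypotheses, negated conclusion} collapses to {1}, i.e. 0 = 1 is derivable,
# then the theorem holds.
from sympy import symbols, groebner, expand

x, y, r, z = symbols('x y r z')

hypothesis = x**2 + y**2 - r**2                 # P = (x, y) lies on the circle
# PA . PB = (-r - x)*(r - x) + (0 - y)*(0 - y); the theorem says this vanishes
conclusion = expand((-r - x)*(r - x) + y**2)
negated_conclusion = 1 - z*conclusion           # Rabinowitsch trick

G = groebner([hypothesis, negated_conclusion], x, y, r, z, order='lex')
print(G)  # GroebnerBasis([1], ...): a contradiction, so the theorem holds
```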
In some sense you could do this by hand for some very small, trivial problems (it lets you find the intersection of a given circle and line, for example; see the sketch below). Anyway, the details of the algorithm make it a decision procedure: it will stop with an answer of 'yes' or 'no'.
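For instance, the circle-and-line intersection just mentioned can be done by computing a Gröbner basis with a lexicographic term order, which puts the system in triangular form much as Gaussian elimination would; here is a sympy sketch with a concrete unit circle and a concrete line (my own choice of example).

```python
# Intersecting the unit circle with the line x + y = 1 by elimination:
# a lexicographic Groebner basis triangularizes the system.
from sympy import symbols, groebner, solve

x, y = symbols('x y')

circle = x**2 + y**2 - 1
line = x + y - 1

G = groebner([circle, line], x, y, order='lex')
print(list(G))                 # [x + y - 1, y**2 - y]: solve for y first, then x
print(solve(list(G), [x, y]))  # the two intersection points (1, 0) and (0, 1)
```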
As a practical matter, you probably don't want to do this by hand. Groebner basis completion is very expensive in the worst case (the closely related polynomial ideal membership problem is EXPSPACE-complete, and the degrees appearing in a Groebner basis can be doubly exponential in the number of variables).
As to Tarski's result that elementary plane geometry is decidable: yes, that is a classic result (from the 50's). The theorem is that the first-order theory of the reals, which includes the '$\lt$' relation, is decidable. This includes Euclidean geometry, and is a bit more general. Tarski's algorithm is ingenious in that it solved a long-outstanding problem, but it is not particularly efficient. Collins' 'cylindrical algebraic decomposition' algorithm (also from the mid 70's) is much more efficient, if you consider going from a tower of exponentials down to merely doubly exponential an increase in efficiency; the second is only astronomical, the first terribly so. It is of course still less efficient than Buchberger's algorithm. Because of the different fields these came out of, I'm not aware of any research that explicitly compares the two.
Whatever the efficiency, yes, there are a number of actual theorems proved mechanically in this style. A very nontrivial example is Morley's theorem (the adjacent angle trisectors of a triangle meet at the vertices of an equilateral triangle), of which Zeilberger has given an automated proof using Maple. And Wu (of the Wu method mentioned elsewhere here, which is similar in spirit to Buchberger's algorithm) and his followers have more comprehensively proven whole sets of Euclidean theorems.
An interesting problem would be how to prove Euclidean theorems using Euclid's axioms as is (instead of translating). There has been some recent research by Avigad into that. And I've heard that one can do a translation of Euclid into Clifford algebras that is a bit more 'like' the original Euclid than the analytic approach.
(for anything unexplained, you might try googling or wikipedia with varying degrees of success)
So maybe that's not satisfying, for whatever reason. Maybe there are other axiom systems for which one can natively (that is, synthetically, uninterpreted) have a decision procedure (which means being able to write a computer program to do it). Hilbert's axioms are really a 'fix' of Euclid's; that is, they add betweenness and some other axioms that Euclid, for all his careful skepticism, left out. So I consider them essentially the same. Then there are Tarski's axioms. These are only mildly different, but still translate to the analytic version. (I am of the opinion that the connection between these axioms and Tarski's decision procedure for real closed fields was not deliberate, but more of an intellectual coincidence.)
As to computer programs, well, there's Maple or Mathematica (or pretty much any computer algebra package that implements Groebner basis calculation); you still have to do the translation into polynomials yourself.
On quite the other hand, there are a number of geometry 'editors' that let you 'draw' a theorem and drag the free parameters around as objects: for example, showing that the bisectors of a triangle meet at a point, they let you move all the vertices while the pairwise intersections of the bisectors visibly stay at one common point. Some of these will let you 'prove' your construction (that certain points always coincide, for instance; Geometer's Sketchpad?). As far as I know, these 'proofs' are not symbolic (like a GB calculation) but numeric: they notice that the difference among computed values is always less than some very small epsilon.
I believe it is a theorem of Tarski that elementary plane geometry is decidable.
You can express the problem using coordinates. You can then use Gröbner basis techniques to try to prove that the expression representing the conclusion is in the ideal generated by the expressions representing the hypotheses. See also Wu's method.
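Here is a tiny sketch of that ideal-membership check in sympy (my own choice of tool and example, with a deliberately simple theorem: the two diagonals of a rectangle have equal length). The hypotheses define the squared diagonal lengths, and the conclusion polynomial should reduce to zero modulo a Gröbner basis of the hypotheses.

```python
# Ideal membership via reduction (sympy): a zero remainder modulo a Groebner
# basis of the hypotheses means the conclusion lies in the generated ideal.
from sympy import symbols, groebner, reduced

a, b, p, q = symbols('a b p q')    # sides a, b; p, q = squared diagonal lengths

hypotheses = [p - (a**2 + b**2),   # diagonal from (0,0) to (a,b)
              q - (a**2 + b**2)]   # diagonal from (a,0) to (0,b)
conclusion = p - q                 # the diagonals have equal (squared) length

G = groebner(hypotheses, a, b, p, q, order='lex')
quotients, remainder = reduced(conclusion, list(G), a, b, p, q, order='lex')
print(remainder)  # 0, so the conclusion follows from the hypotheses
```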
I just want to comment on one aspect of Mitch's excellent answer (above).
He wrote: "Groebner basis completion (via Buchberger's algorithm from the mid 70's) is what you can use. "
As a practical matter, Groebner bases do not usually work well on geometrical problems of realistic size. The Dixon resultant is far more effective. (I am speaking of symbolic solution.)
Many examples can be found on my web site. For example, the paper on Apollonius problems at
http://fordham.academia.edu/RobertLewis/Papers/82784/Apollonius_Problems_in_Biochemistry