Why use make over a shell script?

Solution 1:

The general idea is that make supports (reasonably) minimal rebuilds -- i.e., you tell it which parts of your program depend on which other parts. When you update some part of the program, it rebuilds only the parts that depend on it. While you could do this with a shell script, it would be a lot more work (explicitly checking the last-modified dates on all the files, etc.). The only obvious alternative with a shell script is to rebuild everything every time. For tiny projects that is a perfectly reasonable approach, but for a big project a complete rebuild could easily take an hour or more -- with make, the same change might rebuild in a minute or two...
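As a minimal sketch of the idea (the file names and compiler flags here are hypothetical, not from any particular project): each rule lists a target, the files it depends on, and the command to rebuild it. make compares timestamps and reruns only the commands whose prerequisites are newer than their targets.

```make
# Hypothetical three-file C project: main.c and util.c both include util.h.
CC     = cc
CFLAGS = -Wall

app: main.o util.o            # relinked only if an object file changed
	$(CC) -o app main.o util.o

main.o: main.c util.h         # recompiled only when main.c or util.h changes
	$(CC) $(CFLAGS) -c main.c

util.o: util.c util.h
	$(CC) $(CFLAGS) -c util.c
```

Touch only main.c and `make` recompiles main.o and relinks app, but leaves util.o alone -- that is the minimal rebuild a shell script would have to implement by hand.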

I should probably also add that there are quite a few alternatives to make with at least broadly similar capabilities. Some of them (e.g., Ninja) are often considerably faster than make, especially when only a few files in a large project need to be rebuilt.

Solution 2:

Make is an expert system

There are various things make does that are hard to do with shell scripts...

  • It checks file timestamps to see what is out of date, so it builds only what it needs to build.
  • It performs a topological sort of the dependency graph, determining what depends on what and in what order to build the out-of-date targets, so that every prerequisite is built before the targets that depend on it, and each is built only once.
  • It's a language for declarative programming. New elements can be added without needing to merge them into an imperative control flow.
  • It contains an inference engine to process rules, patterns, and dates, and this, when combined with the rules in your particular Makefile, is what turns make into an expert system.
  • It has a macro processor.
  • See also: an earlier summary of make.
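The declarative and macro points above can be sketched in a few lines (the file names are hypothetical): one pattern rule covers every .c file, so adding a new source file to the project means extending a macro rather than merging a new step into an imperative script.

```make
# Hypothetical fragment: a macro plus one pattern rule handles all sources.
CC     = cc
CFLAGS = -O2 -Wall
OBJS   = main.o parse.o eval.o    # add a new .o here; no new rule needed

calc: $(OBJS)
	$(CC) -o $@ $(OBJS)

# Inference rule: for any target x.o, make infers the prerequisite x.c,
# checks dates, orders the builds, and compiles only what is out of date.
%.o: %.c
	$(CC) $(CFLAGS) -c $< -o $@
```

Here `$@` (the target) and `$<` (the first prerequisite) are make's automatic variables; the `%.o: %.c` pattern is the rule the inference engine matches against each out-of-date object file.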