
Make...or Break?

Overload Journal #1 - Apr 1993 + Programming Topics   Author: Peter Arnold

I remember once reading the claim that writing makefiles for C++ developments is more difficult than writing C++ itself!

An interesting claim which, although made in an article criticising the world at large for making the mistake of thinking there was any value in object-oriented programming in the first place, nevertheless capitalized on the fact that MAKE is all too often regarded as something of a black art.

"I never use it", declared one of my colleagues when I mentioned MAKE:"I always use the Project Manager within the IDE."

In terms of keeping track of dependencies, MAKE does everything Project Manager does, but with greater flexibility and efficiency; and it does more. It has also been "frigged" in numerous ways over the years by various people (presumably including the originator of the above claim!) who try to make it do more still. In this article I look briefly at three examples of flexibility (implicit rules, macros and directives) and one example of improved efficiency (batching).

Suppose we have a general-purpose object library, utils.lib, which contains 50 commonly-used modules for file i/o, screen handling etc. The first entry in a makefile would look something like this:

  utils.lib:  prog1.obj  prog2.obj  prog3.obj  prog4.obj \
              ...                                        \
              prog49.obj prog50.obj

This type of entry is called an explicit rule with no commands. The dependencies of utils.lib (the target) are explicitly defined to be prog1.obj, prog2.obj, etc. (the sources), but no commands follow the list of dependencies: the real work is done further down the makefile.

Having told MAKE which files the library depends on, we now need to go on to make each .obj file the target in another rule, to define what determines whether the file is up to date, and what to do, if necessary, to bring it up to date:

  prog1.obj:   prog1.cpp utils.h
      tcc -c -mc -f- prog1.cpp
      tlib utils.lib -+prog1.obj

  prog2.obj:   prog2.cpp utils.h
      tcc -c -mc -f- prog2.cpp
      tlib utils.lib -+prog2.obj

  prog50.obj:  prog50.cpp utils.h
      tcc -c -mc -f- prog50.cpp
      tlib utils.lib -+prog50.obj

Each of these is an explicit rule with commands: if any source is newer than the corresponding target, the TCC command will be executed, bringing the .obj file up to date, and the TLIB command will then be executed, adding the new version to the library.

What if we decide to use the small memory model instead of the compact one? Or add other compiler options? All 50 TCC commands will need to be changed. This is where macros can be used very effectively. Where the same text is repeated a number of times in a makefile, a change can be time-consuming and could introduce inconsistencies, so a macro definition can be added at the start of the makefile:

  COMPILE  =  tcc  -c  -mc  -f-

and each of the commands changed to use the macro. For example, the command to compile prog50.cpp becomes

  $(COMPILE)   prog50.cpp

Now, to change the compiler options, only the macro definition needs to be changed, and consistency is assured.
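For example, the start of the makefile and the rule for prog1.obj would then read as follows (a sketch; the remaining 49 rules follow the same pattern):

  COMPILE = tcc -c -mc -f-

  prog1.obj:   prog1.cpp utils.h
      $(COMPILE) prog1.cpp
      tlib utils.lib -+prog1.obj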

From time to time, new modules will be created and added to the library. Each new module must be added as a source in the first rule in the makefile, to make the library dependent on it. In addition, a new rule must be added, to tell MAKE what to do to bring the module up to date. So the new module's name must be entered five times: once in the first rule, twice in its own rule, and once in each of the TCC and TLIB commands.

Once again, this repetition can be time-consuming, and risks introducing inconsistencies; also, the makefile will grow and grow.

Fortunately, where rules follow the same simple pattern in this way, MAKE allows implicit rules to be defined. Apart from the first rule (the one for utils.lib), all the rules in our makefile can be replaced by a single implicit rule:

  .cpp.obj:
  $(COMPILE)   $<

This acts as a rule for any .obj file listed as a source in the utils.lib rule, and makes each .obj file dependent on a file of the same name with a .cpp extension. The predefined macro $< expands to the full filename of the .cpp file.
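Putting the pieces together, the whole makefile now reduces to something like this sketch (with the .obj list abbreviated):

  COMPILE = tcc -c -mc -f-

  utils.lib:  prog1.obj prog2.obj prog3.obj prog4.obj \
              prog49.obj prog50.obj

  .cpp.obj:
      $(COMPILE) $<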

Now, not only has our makefile just shrunk by 150 lines or so, but each module is also defined in only one place. When we add a new module matching the implicit rule, it only needs to be added to the first rule.

There doesn't appear to be as much use for our user-defined macro COMPILE any more; as the TCC command is now only used in one place, we might as well revert to using the command itself.

Additionally, we can add the TLIB command to put the .obj file in utils.lib using another predefined macro, $&, which expands to the base filename (i.e. the filename minus extension):

  .cpp.obj:
      tcc -c -mc -f- $<
      tlib utils.lib -+$&
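For prog1.obj, for instance, MAKE expands $< and $& to give exactly the two commands we wrote by hand earlier:

  tcc -c -mc -f- prog1.cpp
  tlib utils.lib -+prog1.obj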

But what about utils.h, which the .obj files depend on? An implicit rule cannot (by definition) have explicit dependencies, so how do we tell MAKE that it must compile prog1.obj if utils.h changes?

The answer is to use the .autodepend dot directive. This causes MAKE to detect the dependency on utils.h for itself: it does this by looking at information put in the .obj files by TC and TCC recording which include files were used in the compilation.
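In Borland's MAKE, the directive simply appears near the top of the makefile; the rest of the file is unchanged. A minimal sketch:

  .autodepend

  .cpp.obj:
      tcc -c -mc -f- $<
      tlib utils.lib -+$&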

When MAKE executes commands, a feature known as batching can be used to increase efficiency by executing a command with a list of filenames. If we add braces around the command in our implicit rule:

  .cpp.obj:
      tcc -c -mc -f- {$< }

the braces tell MAKE to delay execution of the command until it has determined whether the next command will be the same, and combine them if so. This takes advantage of the fact that TCC, like many programs, can take a list of filenames on the command line:

  tcc prog1.cpp prog2.cpp prog3.cpp

which is more efficient than:

  tcc prog1.cpp
  tcc prog2.cpp
  tcc prog3.cpp

because the compiler does not have to be repeatedly loaded from disk, but remains in memory to compile all three modules.
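So if, say, prog1.obj, prog2.obj and prog3.obj are all out of date, the braced rule causes MAKE to batch the three identical compile commands into one invocation (a sketch of the effect):

  tcc -c -mc -f- prog1.cpp prog2.cpp prog3.cpp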

But what if we modify our makefile to use batching?

  .cpp.obj:
      tcc -c -mc -f- {$< }
      tlib utils.lib {-+$& }

The two commands (TCC and TLIB) are always invoked one after the other: MAKE will be unable to batch any commands, as no two consecutive commands are the same.

So, having improved our makefile with a combination of an implicit rule, a dot directive and two predefined macros, how can we take advantage of batching when there is more than one command to execute to bring each .obj file up to date?

Solutions are invited, and a selection will be included in a later issue, along with the author's own solution.

Other aspects of MAKE to be covered in future articles include command input redirection, temporary file passing, and more predefined macros and directives. Also, since MAKE's usefulness extends far beyond the realms of compiling and linking, some unusual uses will be examined. Suggestions (small or large) for (ab)using MAKE are invited, whether genuinely useful or merely as an amusing contribution to the world of "frigged" makefiles (which includes such things as adding the makefile's own name to the list of dependencies to force a build when the makefile is edited!).
