I noticed that when I make changes to some files and then I type make, it will run certain commands related to those files. If I don't change anything, then make doesn't do anything, saying that the program is up to date. This tells me that make has a way of knowing which files were changed since it was last run. How does it know? It doesn't seem to put anything in the directory where it is run, so it must be storing this information somewhere else.
It inspects the file system's modification-date meta-information. See, for instance, the stat() man page and the st_mtime member of struct stat.
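For instance, here is a minimal sketch of reading that timestamp yourself (the file name foo.c is just an illustration):

```c
#include <stdio.h>
#include <sys/stat.h>
#include <time.h>

int main(void)
{
    struct stat st;

    /* stat() fills 'st' with the file's metadata; st_mtime is the
     * last-modification time, in seconds since the epoch. */
    if (stat("foo.c", &st) != 0) {
        perror("foo.c");
        return 1;
    }

    printf("foo.c last modified: %s", ctime(&st.st_mtime));
    return 0;
}
```

This is the same timestamp make reads for every target and prerequisite before deciding what to rebuild.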
It has built-in rules that tell it that (for instance) a .o file needs to be re-generated if the corresponding .c file has changed; the manual section on rule syntax says:
The criterion for being out of date is specified in terms of the prerequisites, which consist of file names separated by spaces. (Wildcards and archive members (see Archives) are allowed here too.) A target is out of date if it does not exist or if it is older than any of the prerequisites (by comparison of last-modification times).
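That criterion is easy to sketch in code. This is not make's actual implementation, just an illustration of the rule quoted above:

```c
#include <stdbool.h>
#include <sys/stat.h>

/* True if 'target' must be rebuilt: it either does not exist or is
 * older than at least one of its prerequisites. */
bool out_of_date(const char *target, const char *prereqs[], int n)
{
    struct stat t;
    if (stat(target, &t) != 0)
        return true;                     /* target missing */

    for (int i = 0; i < n; i++) {
        struct stat p;
        if (stat(prereqs[i], &p) == 0 && p.st_mtime > t.st_mtime)
            return true;                 /* a prerequisite is newer */
    }
    return false;                        /* target is up to date */
}
```

For a rule such as foo.o: foo.c foo.h common.h, make is effectively evaluating out_of_date("foo.o", ...) over those three prerequisites.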
What it checks is whether the date/time stamp on the source file is later than that on the corresponding intermediate file (or perhaps the output file - it's been a while since I dealt with makefiles). If it is, then the file needs to be compiled, and that in turn triggers the re-linking of the final executable.
make determines whether target file X needs to be rebuilt by checking whether its modification time (as recorded in the filesystem) is older than that of any of its dependencies. For example, if you had a make rule like:
foo.o: foo.c foo.h common.h
then it knows it needs to rebuild foo.o if any of foo.c, foo.h or common.h have a newer modification time than foo.o. This means that executing touch common.h would force foo.o to be rebuilt at the next make.
This means that make can be confused if the modification times are unreliable - for example, if your system clock has jumped backwards, or if you are storing your files on certain network filesystems and have multiple clients accessing them (particularly if the clocks on the various machines on your network are not in sync). If you're using make with files distributed over a network, it's generally a good idea to run NTP to keep your clock set correctly.
Make is a lot more sophisticated than it appears on the surface. It has an inference engine with a fairly elaborate algorithm behind it; when combined with your configuration data, make is a type of program-plus-database known as an expert system.

Anyway, to answer your question: the check make does is a little bit tricky, because there can be multiple chains of prerequisites, and those prerequisite chains can themselves be cross-linked in arbitrary ways.
So, the answer to "how does it know?" is:
- make stat(2)'s the files to get their modification times
- make then does a topological sort on the graph it made out of the Makefile (a rough sketch of that step follows below)
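Here is that second step sketched in C, using the hypothetical foo rules from the answer above; real make builds the graph by parsing your Makefile and applies the timestamp comparison at each node, but the ordering idea is the same:

```c
#include <stdio.h>

/* A tiny hard-coded dependency graph for the hypothetical rules
 *   foo:   foo.o
 *   foo.o: foo.c foo.h common.h
 * Real make constructs this graph by parsing the Makefile. */
#define N 5

const char *name[N] = { "foo", "foo.o", "foo.c", "foo.h", "common.h" };

/* deps[i][j] is 1 if node i depends on node j. */
int deps[N][N] = {
    /* foo      */ { 0, 1, 0, 0, 0 },
    /* foo.o    */ { 0, 0, 1, 1, 1 },
    /* foo.c    */ { 0 },
    /* foo.h    */ { 0 },
    /* common.h */ { 0 },
};

int visited[N];

/* Depth-first post-order: a node is emitted only after all of its
 * prerequisites have been emitted, i.e. a topological sort. */
void visit(int i)
{
    if (visited[i])
        return;
    visited[i] = 1;
    for (int j = 0; j < N; j++)
        if (deps[i][j])
            visit(j);
    printf("consider building %s\n", name[i]);
}

int main(void)
{
    visit(0);   /* start from the default goal, "foo" */
    return 0;
}
```

Walking the nodes in that order guarantees that by the time make considers a target, all of its prerequisites have already been brought up to date.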
You are right, make doesn't normally leave any kind of cookie or other state-tracking file around. (Occasionally some Makefiles do create timestamp cookies themselves, but that's rare.) But that doesn't matter; after all, the last make run might have been unsuccessful anyway. What it does is simply compare the dates on the target files it is making with the dates on the source files. It only cares whether the targets are up to date; it doesn't normally factor in when make was last run.