Background

I am just getting started with C++ programming on Linux. In my last question, I asked about best practices for using makefiles in a big application. SO users suggested reading Miller's paper on recursive makefiles ("Recursive Make Considered Harmful") and avoiding makefile recursion (I was using recursive makefiles).

I have followed Miller's approach and created a makefile like the one below. Here is the project structure:

root
...makefile
...main.cpp
...foo
......foo.cpp
......foo.h
......module.mk

My makefile looks like this:

#Main makefile which does the build

CFLAGS =
CC = g++
PROG = fooexe

#each module will append the source files to here
SRC :=

#including the description
include foo/module.mk

OBJ := $(patsubst %.cpp, %.o, $(filter %.cpp,$(SRC))) main.o

#linking the program
fooexe: $(OBJ)
    $(CC) -o $(PROG) $(OBJ)

%.o:
    $(CC) -c $(SRC)

main.o:
    $(CC) -c main.cpp

depend:
    makedepend -- $(CFLAGS) -- $(SRC)

.PHONY:clean
clean:
    rm -f *.o

Here is the module.mk in the foo directory:

SRC += foo/foo.cpp

When I run make -n, I get the following output.

g++ -c  foo/foo.cpp
g++ -c main.cpp
g++ -o fooexe  foo/foo.o main.o

Questions

  • Where should I create the object (.o) files? Should all object files go in a single directory, or should each object file go in its own module's directory? In other words, which is the best place to generate foo.o: the foo directory or the root? (My example generates it in the root.)
  • In the provided example, the g++ -c foo/foo.cpp command generates the .o file in the root directory, but when linking (g++ -o fooexe foo/foo.o main.o) it looks for foo/foo.o. How can I correct this?

Any help would be great.

+1  A: 

The best thing you can do for yourself is to use something better than Make. SCons is my tool of choice on POSIX systems. Boost also has a build tool that is very flexible, but I had a hard time wrapping my head around it.

Oh, and if you want to use make, go ahead and build recursive makefiles. It really isn't that big a deal. I worked on a gigantic project using tons of recursive makefiles over the last three years, and it worked just fine.
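
For reference, a recursive layout for the directory structure in the question would look roughly like this (a sketch, assuming GNU make; foo would then need its own makefile that builds foo.o):

#top-level makefile that recurses into each module
SUBDIRS = foo

.PHONY: all $(SUBDIRS)

all: $(SUBDIRS) main.o
    g++ -o fooexe main.o foo/foo.o

$(SUBDIRS):
    $(MAKE) -C $@

main.o: main.cpp
    g++ -c main.cpp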

Ben Collins
For internal development it's worth investigating other build systems, but if you plan on releasing your source code to a wider audience, make (or autoconf/automake) is the way to go, especially if you are targeting Linux. I love Boost, but their build system drives me up the wall. Not because it's bad, but because it's different. Every time I need to do something out of the ordinary, I need to dig out the docs.
KeithB
One problem with recursive makefiles is that they break parallel builds. If there isn't one process that knows all the dependencies, you can't reliably compile multiple files at once. This is going to become more of an issue as 4-core machines (and 8-, 16-core, etc. in a few years) become standard.
KeithB
Recursive makefile projects only "break" parallel builds in the sense that it takes more effort to manage dependencies. It's certainly possible (and not terribly difficult) to do recursive makefile projects and still manage dependencies correctly; make just can't do as much of the work for you in that case. I still think it's just not that big a deal, and that warnings against recursive makefile projects are overblown.
Ben Collins
@KeithB: I take your point, but I just don't think that matters much. Most systems are as likely to have Python installed as they are make, and you can wrap up your SCons setup in a shell script if you have to in order to make it work. I would never choose to use Make again, given the choice.
Ben Collins
+1 for SCons. I am using SCons, too. It scans the dependencies for you, and it only rebuilds when the source has changed, since it uses cryptographic hash sums instead of timestamps.
lothar
@Ben Collins: "Most systems are as likely to have Python installed as they are make." That may be true if you are targeting desktop systems, but not so much for what I do (scientific computing on various flavors of clusters). Add in compilers other than gcc, multiple versions of libraries, etc., and it gets to be a pain. If I have to deal with make and SCons and bjam and whatever else someone comes up with, it's almost impossible. There is a benefit to having a standard, even if it's not optimal.
KeithB
@KeithB: Funny. The big 3-year project I mentioned was also scientific computing on clusters. SCons would never have been a problem. Again, I take your point: you're right that there is *some* benefit to Make being the fallback that everyone knows and is familiar with. It just sucks.
Ben Collins
+1  A: 
  • Where should I create the object (.o) files? Should all object files go in a single directory, or should each object file go in its own module's directory? In other words, which is the best place to generate foo.o: the foo directory or the root? (My example generates it in the root.)

I find it easier, when investigating failed builds, to keep object files in a separate directory under the module-level directory.

foo
    |_ build
    |_ src

Depending on the size of the project, these object files are grouped to form a component at a higher level, and so on. All components go to a main build directory, which is where the main application can be run from (it has all dependent libraries etc.).
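
With the question's non-recursive setup, that layout could be sketched like this (assuming GNU make, and that module.mk appends foo/src/foo.cpp; the build/ and src/ names follow the tree above):

#map foo/src/foo.cpp to foo/build/foo.o
OBJ := $(subst /src/,/build/,$(SRC:.cpp=.o)) main.o

#compile each module's sources into its build directory
foo/build/%.o: foo/src/%.cpp
    @mkdir -p $(dir $@)
    $(CC) -c $< -o $@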

  • In the provided example, the g++ -c foo/foo.cpp command generates the .o file in the root directory, but when linking (g++ -o fooexe foo/foo.o main.o) it looks for foo/foo.o. How can I correct this?

Use:

 g++ -o fooexe  foo.o main.o
dirkgently
Thanks for the reply. I know "g++ -o fooexe foo.o main.o", but my question was how I can correct this in the makefile so that it produces "g++ -o fooexe foo.o main.o". Any idea?
Appu
You'd have to change the 'OBJ := $(patsubst %.cpp, %.o, ...' line to replace the path with nothing, since all object files are created at the same level as main.o. HTH.
dirkgently
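
In the question's makefile, that could look roughly like this (a sketch, assuming GNU make; these lines would replace the OBJ line and the compile rules above, and the vpath-based pattern rule is an extra tidy-up rather than strictly required):

#strip directory parts so foo/foo.cpp maps to foo.o in the root
OBJ := $(notdir $(patsubst %.cpp,%.o,$(filter %.cpp,$(SRC)))) main.o

#let make find sources that live in module directories
vpath %.cpp foo

%.o: %.cpp
    $(CC) -c $< -o $@

fooexe: $(OBJ)
    $(CC) -o $(PROG) $(OBJ)
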
A: 

+1 for SCons.
I am using SCons, too. It scans the dependencies for you, and it only rebuilds when the source has changed, since it uses cryptographic hash sums instead of timestamps.

In my SCons build, the objects live in directories parallel to the source, to enable multiple builds such as combinations of 32-bit and 64-bit, release and debug (see the make sketch after the tree):

src
.build
  linux
    i686
      debug
      release
    x86_64
      debug
      release
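
The same kind of variant layout can also be sketched in plain GNU make by folding the platform and build type into the object directory (hypothetical variable names; the setup above itself uses SCons):

#e.g. .build/linux/x86_64/debug
ARCH    ?= $(shell uname -m)
VARIANT ?= debug
BUILDDIR = .build/linux/$(ARCH)/$(VARIANT)

$(BUILDDIR)/%.o: src/%.cpp
    @mkdir -p $(BUILDDIR)
    $(CXX) -c $< -o $@
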
lothar
A: 

With regard to object and other generated interim files, I put these in a directory completely separate from the sources (i.e. under a directory that is excluded from backup and revision control). It may be slightly more bother to set up in projects or makefiles, but it saves time when packaging up sources, and it makes clean backups and revision control easier.

I create a subdirectory structure for the object files that matches the subdirectory structure for sources. Typically I have a separate subdirectory for each of my libraries and programs.

Additionally, I use multiple compilers (and versions) and multiple operating systems, so I reproduce the object file directory structure under a separate directory for each of these compilers (which have different versions of the standard and vendor libraries) to prevent object files being built against mismatched header file versions.
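
A sketch of how that per-compiler separation might look in a makefile (the paths and the version query are illustrative assumptions):

CXX ?= g++
#keep generated files outside the source tree, out of backup and revision control
OBJROOT = /var/tmp/objs
#e.g. /var/tmp/objs/g++-4.3/mylib
OBJDIR  = $(OBJROOT)/$(CXX)-$(shell $(CXX) -dumpversion)/mylib

$(OBJDIR)/%.o: %.cpp
    @mkdir -p $(OBJDIR)
    $(CXX) -c $< -o $@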

Roger Nelson