views: 554

answers: 2
I have a bunch of C files that are generated by a collection of Python programs which share a number of Python modules, and I need to account for this in my make system.

It is easy enough to enumerate which Python program needs to be run to generate each C file. What I can't find a good solution for is determining which other Python files those programs depend on. I need this so make will know what needs regenerating if one of the shared Python files changes.

Is there a good system for producing make style dependency rules from a collection of python sources?

+1  A: 

The import statements are pretty much all the dependencies there are. There are two relevant forms of import statement:

import x, y, z
from x import a, b, c

You'll also need the PYTHONPATH and site information used to build sys.path. This gives you the physical locations of the modules and packages.

That's kind of painful to process, since you have to do the transitive closure of all imports in all modules you import.
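A minimal sketch of that closure, using the standard ast module to pull out both import forms; it assumes all the shared modules live as .py files under one source directory, and the filenames involved are hypothetical:

```python
# Sketch: extract per-file imports with ast, then take the transitive
# closure over local .py files.  Assumes a flat source tree rooted at
# `srcdir`; modules resolved outside it (stdlib etc.) are ignored.
import ast
import os

def direct_imports(path):
    """Module names imported by the file at `path`, in either form."""
    with open(path) as f:
        tree = ast.parse(f.read(), path)
    names = set()
    for node in ast.walk(tree):
        if isinstance(node, ast.Import):        # import x, y, z
            names.update(alias.name for alias in node.names)
        elif isinstance(node, ast.ImportFrom):  # from x import a, b, c
            if node.module:
                names.add(node.module)
    return names

def transitive_deps(path, srcdir):
    """All local .py files `path` depends on, directly or indirectly."""
    seen, todo = set(), {path}
    while todo:
        current = todo.pop()
        for name in direct_imports(current):
            dep = os.path.join(srcdir, name.replace(".", os.sep) + ".py")
            if os.path.exists(dep) and dep not in seen:
                seen.add(dep)
                todo.add(dep)
    return seen
```

From there a make rule is just string formatting, e.g. printing the target, the generator script, and the files returned by transitive_deps as its prerequisites.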

As an alternative approach, you can run Python with the -v option to get the complete list of imports and physical files. This produces a log that you can edit into a flat list of dependencies.

For instance, when I do

>>> import math

I see this in the log

dlopen("/Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/lib-dynload/math.so", 2);
import math # dynamically loaded from /Library/Frameworks/Python.framework/Versions/2.5/lib/python2.5/lib-dynload/math.so
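Editing that log can itself be scripted. A rough sketch, with the caveat that the exact -v log format varies between Python versions, so the regex is deliberately loose:

```python
# Sketch: run a generator script under `python -v` and scrape module
# names from the import lines on stderr.  The -v log format differs
# across Python versions, so this pattern is intentionally permissive.
import re
import subprocess
import sys

def imported_modules(script_args):
    """Run `python -v <script_args>` and return imported module names."""
    proc = subprocess.run(
        [sys.executable, "-v"] + script_args,
        capture_output=True, text=True,
    )
    # Matches e.g. "import math # dynamically loaded from ..." (older
    # Pythons) as well as "import 'math' # <...>" (newer Pythons).
    return set(re.findall(r"^import '?([\w.]+)'?\s*#", proc.stderr, re.M))
```

You would then keep only the modules whose files live in your own source tree and emit those as make prerequisites.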
S.Lott
+3  A: 

modulefinder can be used to get the dependency graph.
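A sketch of turning modulefinder's graph into a make rule; the target and script names here are hypothetical, and you would probably also filter the dependency list down to files in your own source tree:

```python
# Sketch: use modulefinder to build a make-style dependency line for
# one generator script.  Target/script names are placeholders.
from modulefinder import ModuleFinder

def make_rule(target, script):
    """Return 'target: script dep1.py dep2.py ...' for a Makefile."""
    finder = ModuleFinder()
    finder.run_script(script)
    deps = sorted(
        m.__file__ for m in finder.modules.values()
        if m.__file__ and m.__name__ != "__main__"
        and m.__file__.endswith(".py")
    )
    return "%s: %s %s" % (target, script, " ".join(deps))
```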

fivebells
Note that some modules (e.g. PIL) do funky things with dynamic imports, so modulefinder sometimes needs some help to locate everything.
John Fouhy