views: 53 | answers: 2

Suppose that I am writing two applications for my employer, we'll call them App1 and App2. These applications depend on some packages containing code needed by both. Let's say that App1 depends on PackageA and PackageB. App2 depends on PackageB and PackageC. The organizing strategy that seems natural to me would be to check everything into version control like this:

repo_root
+--- App1
|    +--- App1.py
|    +--- ... and so on
+--- App2
|    +--- ... files for App2
+--- PackageA
|    +--- __init__.py
|    +--- ... and more files
+--- PackageB
|    +--- ... files for PackageB
+--- PackageC
     +--- ... files for PackageC

The problem comes with importing the packages. For example, App1 and App2 both need to import PackageB, but I can't just put "import PackageB" into the main file for each of these applications. Python doesn't search the parent directory for packages to import.

I know a couple of options for doing this, but they both seem a little ugly. One strategy that I've used before is to put the main file for App1 and App2 directly in the "repo_root" directory; then the two main files can import the packages without any problems. Another option is to use sys.path.append together with __file__ to figure out what the parent directory is and add it to the path that Python searches for modules.
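For reference, the sys.path version usually ends up looking something like this at the top of App1/App1.py (just a sketch; the two nested dirname calls assume the main file sits exactly one level below repo_root):

    import os
    import sys

    # Add repo_root (the parent of this file's directory) to the module search path
    sys.path.append(os.path.dirname(os.path.dirname(os.path.abspath(__file__))))

    import PackageB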

Is there a clean, elegant way to do something like this? Thanks for your help.

Update: While the virtualenv solution can help a great deal when it comes to dealing with packages and dependencies, it almost seems like overkill for a problem that could be solved by a relative import. Carrying out a relative import turns out to be fiendishly complicated, however. There is PEP 366, but that is quite complicated and probably wouldn't allow importing outside of a package anyway. I spent some time looking at importlib, but I'm pretty sure that doesn't allow importing outside of a package either. Many people seem to resort to munging sys.path, of which this seems to be the best example that I've found. But, as I mentioned, that is a rather hackish way to do things. I've spent nearly all day investigating this, and I don't think that there is an answer. Correct me if I'm wrong, but I now believe there is no clean, non-hackish way to do a relative import without bringing in a heavy-hitter like virtualenv and some .pth files. Anyway, thanks again for your help. I'll mark this as answered since virtualenv is the only option.
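For anyone who hits the same wall: the reason explicit relative imports don't help here is that a script run directly has no package context, so the interpreter rejects the import outright. A minimal illustration (assuming App1.py is run directly as a script; Python 2.x error wording):

    # repo_root/App1/App1.py, run with "python App1.py"
    from .. import PackageB
    # ValueError: Attempted relative import in non-package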

+1  A: 

virtualenv is a clean and elegant way to deal with such issues. Read the primer, then the summary on PyPI, then install it and give it a try!

Alex Martelli
+2  A: 

One solution you can use for this is to have a virtualenv for each of your apps, and then use a relative .pth file to point to the Packages. This gives you fine control over the environment each of the apps is being developed in and avoids the "but I've got package_x on my machine!" problems in testing.
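To make the .pth idea concrete: a .pth file is just a plain text file dropped into a site-packages directory, where each non-comment line names a directory to add to sys.path, and relative lines are resolved against the directory containing the .pth file. Something along these lines, assuming (purely for illustration) that each app's virtualenv lives at repo_root/App1/env on a Unix-style layout:

    # repo_root/App1/env/lib/python2.6/site-packages/repo_root.pth
    # Each line is a directory to add to sys.path; this relative path
    # climbs back up to repo_root so PackageA/B/C become importable.
    ../../../../..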

mavnn
I've been reading and watching some screencasts, and I think I'm getting the idea. I can create a virtualenv bootstrap script for App1 and App2 that would install the needed open source libraries as well as the '.pth' file (which has to be placed in a 'site-packages' sort of folder). After checkout, running the bootstrap would create the environment that allows the app to run. For deployment, I could use the same bootstrap to set up the production server? Is this about correct?
bnsmith
Pretty much. The big bonus of developing in a virtualenv is knowing exactly what the deployment requirements are, and deploying in a virtualenv is definitely an option to be considered.
mavnn