At our company we have a very nice system set up for tracking our external package dependencies in revision control for our current desktop application development (C++/python). We are starting to develop some python-only web applications and are looking for recommendations on best practices when only python code is involved and it all comes from easy_install'able packages.

For our desktop apps we have something like this:

app_svn_root
  - trunk
    - src
    - doc
    - deps -> [svn:external to deps repos with rev num set]

deps_svn_root
  - trunk
    - setup_env.sh/bat  [generated automatically]
    - dep_project_1 [example: boost, libxml, python, etc]
       - vendor_base [svn:external to vendor branch or project repository]
       - install_linux_gcc43
         - bin
         - include
         - lib
       - install_win32_vc90
         - ...   [whatever directory structure the project build creates]

When any developer on the team checks out the code for the application, they automatically get all the dependencies, at the correct versions for that revision of the code. [note: there are several internal management scripts, etc. that I left out, but this is the general idea]. This works great for us: no developer needs to worry any more about setting up their personal machine with every package at the correct version, it allows multiple development copies to be checked out at a time (ex: versions 1.0, 1.1, 2.0, etc), and it allows the continuous integration system to package the dependencies and run unit tests against the correct versions of the deps.
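For reference, a pinned external like the `deps` entry above can be expressed with a fixed revision in the `svn:externals` property; the repository URL and revision number here are illustrative:

    # set on app_svn_root/trunk (illustrative URL and revision)
    svn propset svn:externals '-r1234 http://svn.example.com/deps_svn_root/trunk deps' .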

Now we are starting to work on some Google App Engine based python projects and we want something like the above. We want to keep the ability for a developer to check out everything they need at once, with a guarantee that everyone is using the same dependencies. We could keep using this exact structure, but it seems too heavyweight for a pure python project.

Originally I was thinking of something like this:

- trunk
  - gae_apps
    - gae_sdk  [svn:external to the latest stable GAE code]
    - deps
      - nose
      - nosegae
      - pylint
    - app1
      - templates
      - tests
      - deps
        - webapp2
        - console

The problem I am running into is that all the python projects I want to use (nose, nosegae, etc) recommend using easy_install to download and install them. easy_install, however, installs them into the system-wide site-packages directory. What I really want is to have the code installed into a specific directory for each package. (note: I was planning to put some code in main.py that would add all the packages in deps to sys.path correctly; see the sketch below). Is there a way to do this? Is it even a good idea?
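A minimal sketch of that main.py bootstrap, assuming each dependency is checked out as its own subdirectory of a deps directory next to main.py (the layout from the tree above; names are illustrative):

    import os
    import sys

    # Directory holding one subdirectory per vendored dependency.
    DEPS_DIR = os.path.join(os.path.dirname(__file__), 'deps')

    # Prepend each package directory so the vendored copies take
    # precedence over anything installed system-wide.
    for name in sorted(os.listdir(DEPS_DIR)):
        path = os.path.join(DEPS_DIR, name)
        if os.path.isdir(path) and path not in sys.path:
            sys.path.insert(0, path)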

What are the best practices out there for tracking dependencies in pure python apps to support development with large teams?

+1  A: 

You can use virtualenv + pip. With virtualenv you can create a customized environment for your application, where all packages are installed locally with pip (an easy_install replacement) inside the environment rather than system-wide. You can then generate a list of the installed dependencies so the same set can be installed automatically elsewhere.
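A typical workflow looks like this (package names taken from the question; the environment name is arbitrary):

    virtualenv --no-site-packages env
    source env/bin/activate
    pip install nose nosegae pylint
    pip freeze > requirements.txt   # record exact versions
    # on another machine, recreate the same set:
    pip install -r requirements.txt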

Roberto Bonvallet
I thought about doing that, but I didn't see a way to get the dependencies installed into the revision-controlled directory. In effect, what I want is a virtualenv that is pulled from a shared directory (or directories) within the project's checked-out code. (In other words, I don't want the developers to have to install anything on their local machines: just check out the repository and go.)
Allen
After looking into this more, I tried using `pip -E .../deps nose` to install the packages. I had seen some references online saying this should work, but it didn't. I think pip has to target a standard Python installation or a virtualenv and can't simply target an arbitrary directory where I want to put packages. I am still searching for a solution.
Allen
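One avenue worth trying for the layout above is easy_install's own `--install-dir` flag, which installs a package into a named directory as long as that directory is on `PYTHONPATH` when the command runs (paths here are illustrative):

    # install nose into the checked-out deps directory
    PYTHONPATH=deps easy_install --install-dir deps nose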