Hi All,

Just curious how people are deploying their Django projects in combination with virtualenv.

  • More specifically, how do you keep your production virtualenvs synced correctly with your development machine?

I use git for SCM, but I don't keep my virtualenv inside the git repo. Should I, or is it best to use pip freeze and then re-create the environment on the server using the freeze output? (If you do this, could you please describe the steps; I'm finding very little good documentation on the unfreezing process. Is something like pip install -r freeze_output.txt possible?)
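
For concreteness, the workflow I'm imagining (assuming pip is installed on both machines) is something like:

    # on the development machine, inside the activated virtualenv
    pip freeze > requirements.txt

    # on the server, after creating and activating a fresh virtualenv
    pip install -r requirements.txt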

+2  A: 

I use this bootstrap.py: http://github.com/thraxil/ccnmtldjango/blob/master/ccnmtldjango/template/bootstrap.py

which expects a directory called 'requirements' that looks something like this: http://github.com/thraxil/ccnmtldjango/tree/master/ccnmtldjango/template/requirements/

There's an apps.txt, a libs.txt (which apps.txt includes--I just like to keep Django apps separate from other Python modules), and a src directory which contains the actual tarballs.
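
As a rough sketch (the directory and file names match the linked repo; the comments are my gloss, and the include presumably uses pip's -r syntax), the layout looks like:

    requirements/
        apps.txt    # Django apps; pulls in libs.txt via a "-r libs.txt" line
        libs.txt    # other Python modules
        src/        # the actual tarballs the lines above point at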

When ./bootstrap.py is run, it creates the virtualenv (wiping a previous one if it exists) and installs everything from requirements/apps.txt into it. I do not ever install anything into the virtualenv otherwise. If I want to include a new library, I put the tarball into requirements/src/, add a line to one of the text files, and re-run ./bootstrap.py.

bootstrap.py and requirements get checked into version control (along with a copy of pip.py, so I don't even have to have that installed system-wide anywhere). The virtualenv itself isn't. The scripts I have that push out to production run ./bootstrap.py on the production server each time I push. (bootstrap.py also goes to some lengths to ensure it sticks to Python 2.5, since that's what we have on the production servers (Ubuntu Hardy); my dev machine (Ubuntu Karmic) defaults to Python 2.6 if you're not careful.)
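
In outline, the script does something like the following (a simplified sketch; the real bootstrap.py linked above does more, such as pinning the Python version, and the "ve" directory name is an assumption):

    #!/usr/bin/env python
    # bootstrap.py -- sketch: wipe and rebuild the virtualenv from the
    # checked-in requirements.
    import os
    import shutil
    import subprocess

    VE_DIR = "ve"  # assumed name for the virtualenv directory

    # Always start clean so the virtualenv exactly matches requirements.
    if os.path.exists(VE_DIR):
        shutil.rmtree(VE_DIR)

    subprocess.check_call(["virtualenv", "--no-site-packages", VE_DIR])

    # Run the checked-in copy of pip.py with the new virtualenv's Python,
    # so nothing needs to be installed system-wide.
    subprocess.check_call([
        os.path.join(VE_DIR, "bin", "python"), "pip.py",
        "install", "-r", "requirements/apps.txt",
    ])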

thraxil
+1  A: 

I just set something like this up at work using pip, Fabric and git. The flow is basically like this, and borrows heavily from this script:

  1. We keep a requirements.txt file in our source tree and maintain it by hand.
  2. When we do a new release, the Fabric script creates an archive based on whatever tree-ish we pass it.
  3. Fabric unpacks the archive on the remote host(s) and asks git for the last commit that changed the requirements file: git log -1 --format=format:%h requirements.txt. This spits out the short version of the hash, like 1d02afc.
  4. The Fabric script then looks into the directory where our virtualenvs are stored on the remote host(s).
    1. If there is no directory named 1d02afc, a new virtualenv is created and set up with pip install -E /path/to/venv/1d02afc -r /path/to/requirements.txt
    2. If /path/to/venv/1d02afc already exists, nothing is done (see the sketch below)

The little magic part of this is passing whatever tree-ish you want to git and having Fabric do the packaging. By using git archive my-branch, git archive 1d02afc, or whatever else, I'm guaranteed to get the right packages installed on my remote machines.

I went this route since I really didn't want extra virtualenvs floating around if the packages hadn't changed between releases. I also don't like the idea of keeping the actual packages I depend on in my own source tree.
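
A condensed sketch of the Fabric side (the paths and task name here are made up, the copy/unpack step is elided, and the real script does more):

    # fabfile.py -- one virtualenv per requirements.txt revision.
    from fabric.api import local, run, settings

    VENV_ROOT = "/path/to/venv"                 # per-revision virtualenvs
    REQUIREMENTS = "/path/to/requirements.txt"  # as unpacked on the host

    def deploy(treeish="master"):
        # Step 2: package exactly the tree-ish we were given.
        local("git archive --format=tar %s | gzip > release.tar.gz" % treeish)
        # (copying and unpacking the archive on the remote host is omitted)

        # Step 3: short hash of the last commit touching requirements.txt.
        req_hash = local("git log -1 --format=format:%h requirements.txt",
                         capture=True).strip()

        # Step 4: only build a virtualenv if this revision is new.
        venv = "%s/%s" % (VENV_ROOT, req_hash)
        with settings(warn_only=True):
            missing = run("test -d %s" % venv).failed
        if missing:
            # -E was the old pip option for targeting a given virtualenv.
            run("pip install -E %s -r %s" % (venv, REQUIREMENTS))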

brianz
A: 

To build my Django/pip/virtualenv environment, I relied on an excellent step-by-step tutorial in a blog post written by a small shop named "Bread and Pepper."

doug