tags:

views: 454
answers: 5

I've got a number of scripts that use common definitions. How do I split them into multiple files? Furthermore, the application cannot be installed in any way in my scenario: it must be possible to have an arbitrary number of versions running concurrently, and it must work without superuser rights. The solutions I've come up with are:

  • Duplicate code in every script. Messy, and probably the worst scheme.
  • Put all scripts and common code in a single directory, and use from . import to load them. The downside of this approach is that I'd like to put my libraries in a different directory than the applications.
  • Put common code in its own directory, write an __init__.py that imports all submodules, and finally use from . import to load them. Keeps code organized, but it's a little bit of overhead to maintain __init__.py and qualify names.
  • Add the library directory to sys.path and import. I tend toward this, but I'm not sure whether fiddling with sys.path is good practice.
  • Load using execfile (exec in Python 3). Combines the advantages of the previous two approaches: only one line per module is needed, and I can use a dedicated directory. On the other hand, this bypasses the Python module concept and pollutes the global namespace.
  • Write and install a module using distutils. This installs the library for all Python scripts, needs superuser rights, and impacts other applications, so it is not applicable in my case.

What is the best method?

+4  A: 

You can set the PYTHONPATH environment variable to the directory where your library files are located. This adds that path to the library search path and you can use a normal import to import them.
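As a minimal sketch of how this works (mylib and its contents are made-up names for the demo), setting PYTHONPATH for a child interpreter is equivalent to exporting it in the shell before launching your script:

```python
import os
import subprocess
import sys
import tempfile

# Create a throwaway "library" directory containing mylib.py
# (mylib is a hypothetical module name used only for this demo).
libdir = tempfile.mkdtemp()
with open(os.path.join(libdir, "mylib.py"), "w") as f:
    f.write("GREETING = 'hello'\n")

# Passing PYTHONPATH to a child interpreter is equivalent to
# `export PYTHONPATH=/path/to/libdir` in the shell: the directory
# is prepended to the module search path, so a plain import works.
env = dict(os.environ, PYTHONPATH=libdir)
out = subprocess.check_output(
    [sys.executable, "-c", "import mylib; print(mylib.GREETING)"],
    env=env,
)
print(out.decode().strip())
```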

sth
+1  A: 

Another alternative to manually adding the path to sys.path is to use the environment variable PYTHONPATH.

Also, distutils allows you to specify a custom installation directory using

 python setup.py install --home=/my/dir

However, neither of these may be practical if you need to have multiple versions running simultaneously with the same module names. In that case you're probably best off modifying sys.path.
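A sketch of that sys.path approach, assuming a hypothetical per-version layout (the directory names are illustrative only): each script locates its own version's library directory relative to itself, so several versions can run side by side without clashing.

```python
import os
import sys

# Hypothetical layout for one installed version:
#   myapp-1.2/bin/script.py   <- the entry-point script
#   myapp-1.2/lib/common.py   <- shared code for this version
# The script computes this version's lib directory relative to its own
# location and prepends it to sys.path, so imports resolve to the copy
# of the shared code that belongs to this version.
script_dir = os.path.dirname(os.path.abspath(sys.argv[0]))
lib_dir = os.path.normpath(os.path.join(script_dir, os.pardir, "lib"))
sys.path.insert(0, lib_dir)

# import common   # would now resolve to this version's lib/common.py
```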

dF
+1  A: 

I've used the third approach (add the directories to sys.path) for more than one project, and I think it's a valid approach.

Can Berk Güder
+3  A: 

If you have multiple environments with various combinations of dependencies, a good solution is to use virtualenv to create sandboxed Python environments, each with its own set of installed packages. Each environment functions in the same way as a system-wide Python site-packages setup, but no superuser rights are required to create local environments.

Google has plenty of info, but this looks like a pretty good starting point.
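On modern Pythons, the standard-library venv module plays the same role as the third-party virtualenv tool described above; the command-line workflow is the same idea. A minimal sketch, creating an isolated environment in a temporary directory with no superuser rights needed:

```python
import os
import subprocess
import sys
import tempfile

# Create an isolated environment (--without-pip keeps the demo fast
# and avoids any network access). The resulting directory contains
# its own interpreter and site-packages, independent of the system one.
env_dir = os.path.join(tempfile.mkdtemp(), "env")
subprocess.check_call([sys.executable, "-m", "venv", "--without-pip", env_dir])

print(os.path.isdir(env_dir))
```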

Daniel
+8  A: 

Adding to sys.path (usually using site.addsitedir) is quite common and not particularly frowned upon. You will certainly want your shared code to live in modules somewhere convenient.
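A minimal sketch of site.addsitedir (the temporary directory stands in for a real shared-code directory): it behaves like appending to sys.path, but additionally processes any .pth files found in the directory.

```python
import site
import sys
import tempfile

# Stand-in for a real directory of shared modules.
libdir = tempfile.mkdtemp()

# addsitedir appends the directory to sys.path and also honours
# any .pth files it contains, just like a site-packages directory.
site.addsitedir(libdir)
print(libdir in sys.path)
```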

If you are using Python 2.6+ there's already a user-level modules folder you can use without having to add to sys.path or PYTHONPATH. It's ~/.local/lib/python2.6/site-packages on Unix-likes - see PEP 370 for more information.
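The location of that per-user directory is platform-dependent; the site module exposes it directly, so you can check where modules should go on any given machine (a sketch; on some setups the user site may be disabled, e.g. inside a virtual environment):

```python
import site

# PEP 370's per-user site-packages directory. Modules placed here are
# importable without touching sys.path or PYTHONPATH, provided the
# user site is enabled for this interpreter.
print(site.USER_SITE)
print(site.ENABLE_USER_SITE)
```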

bobince
+1: Just set PYTHONPATH environment variable. Simple, neat, portable, clean. Works everywhere.
S.Lott
@phihag: Works everywhere you have a shell environment. Plus, you need a start script for this, which complicates things.
phihag
@phihag: you don't actually need a start script. PYTHONPATH can be set in your .bashrc or your Windows settings.
S.Lott